
Rajat Kumar (Tech Head)
August 01, 2017
The robots exclusion standard, also known as the robots exclusion protocol or simply robots.txt, is a standard used by websites to communicate with web crawlers and other web robots. The standard specifies how to inform a web robot which areas of the website should not be processed or scanned. Search engines commonly run such robots to crawl and categorize websites.
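To illustrate how a crawler consults the standard, here is a minimal sketch using Python's standard-library `urllib.robotparser`. The robots.txt content below is a hypothetical example written for this demonstration, not taken from any real site:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules for illustration:
# block all robots from /admin/ and /private/, allow everything else.
ROBOTS_TXT = """\
User-agent: *
Disallow: /admin/
Disallow: /private/
Allow: /
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# A well-behaved crawler checks each URL before fetching it.
print(parser.can_fetch("MyBot", "https://example.com/admin/login"))  # False
print(parser.can_fetch("MyBot", "https://example.com/blog/post"))    # True
```

In practice a crawler would load the live file with `parser.set_url("https://example.com/robots.txt")` followed by `parser.read()`; note that the standard is advisory, so compliance depends entirely on the robot honoring it.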