Rajat Kumar (Tech Head)
August 01, 2017
The robots exclusion standard, also known as the robots exclusion protocol or simply robots.txt, is a convention websites use to communicate with web crawlers and other web robots. The standard specifies how to tell a robot which areas of the website should not be processed or scanned. Robots are often used by search engines to categorize websites.
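To make this concrete, here is a minimal sketch using Python's standard `urllib.robotparser` module. The ruleset and the example.com URLs are hypothetical, chosen only to illustrate how a compliant crawler would check a URL against the rules before fetching it:

```python
from urllib import robotparser

# Hypothetical robots.txt ruleset for illustration: block all crawlers
# from /private/ but allow everything else.
RULES = """\
User-agent: *
Disallow: /private/
Allow: /
"""

rp = robotparser.RobotFileParser()
# In a real crawler this file would be fetched from
# https://example.com/robots.txt; here we parse it inline.
rp.parse(RULES.splitlines())

# A well-behaved robot checks each URL before requesting it.
print(rp.can_fetch("*", "https://example.com/private/data.html"))  # False
print(rp.can_fetch("*", "https://example.com/public/page.html"))   # True
```

Note that robots.txt is purely advisory: it relies on crawlers choosing to honor it, so it should never be used as a security mechanism for sensitive content.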