How to Create and Configure a robots.txt File
Rajat Kumar (Tech Head)
August 01, 2017

The robots exclusion standard, also known as the robots exclusion protocol or simply robots.txt, is a standard used by websites to communicate with web crawlers and other web robots. The standard specifies how to inform a web robot which areas of the website should not be processed or scanned. Robots are often used by search engines to categorize websites.
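As a starting point, the sketch below shows a minimal robots.txt served from the root of the site (for example, https://www.example.com/robots.txt). The directory paths (/admin/, /tmp/, /search/) and the sitemap URL are illustrative placeholders, not values taken from this article.

# Apply these rules to every crawler
User-agent: *
# Block crawling of illustrative private areas
Disallow: /admin/
Disallow: /tmp/
# Everything else remains crawlable
Allow: /

# Example of a rule aimed at one specific crawler
User-agent: Googlebot
Disallow: /search/

# Optional: point crawlers at the XML sitemap (placeholder URL)
Sitemap: https://www.example.com/sitemap.xml

The file must be plain text and placed at the root of the host it applies to. Keep in mind that the rules are advisory: well-behaved crawlers honor them, but robots.txt is not an access-control mechanism for sensitive content.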
