Rajat Kumar (Tech Head)
August 01, 2017
The robots exclusion standard, also known as the robots exclusion protocol or simply robots.txt, is a standard used by websites to communicate with web crawlers and other web robots. The standard specifies how to inform a web robot about which areas of the website should not be processed or scanned. Robots are often used by search engines to categorize websites.
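As an illustration, a minimal robots.txt file might look like the following. It is served as plain text at the root of the site (e.g. https://example.com/robots.txt); the domain, paths, and crawler name below are placeholders, not taken from any particular site:

```
# Rules for all crawlers
User-agent: *
Disallow: /admin/        # ask robots not to scan this area
Disallow: /private/

# Rules for one specific crawler
User-agent: Googlebot
Disallow: /no-google/

# Location of the XML sitemap (optional)
Sitemap: https://example.com/sitemap.xml
```

Note that the standard is purely advisory: well-behaved crawlers honor these rules, but the file does not technically prevent access to the listed paths.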