What is robots.txt?


PoolMaster

Recommended Posts

The robots exclusion standard, also known as the robots exclusion protocol or simply robots.txt, is a standard used by websites to communicate with web crawlers and other web robots. The standard specifies how to inform the web robot about which areas of the website should not be processed or scanned.
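For illustration, a site might serve a file like the following at the path /robots.txt (the paths here are made up for the example): it tells every robot to stay out of one directory while leaving the rest of the site crawlable.

```
# Applies to all robots
User-agent: *
Disallow: /private/

# Applies only to a specific crawler
User-agent: Googlebot
Disallow: /drafts/
```

Note that the standard is purely advisory: well-behaved crawlers honor it, but nothing technically prevents a robot from ignoring it.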

It's basically a file containing a set of instructions for the various robots (mainly search engine crawlers, each coming from a different search engine) that visit your website. That set of rules tells a robot how to behave on your site: which pages it may visit and which it is not allowed to crawl at all. A very useful thing, for many reasons.


The robots.txt file, also known as the robots exclusion protocol or standard, is a text file that tells web robots (most often search engines) which pages on your site to crawl, and which pages not to crawl. ... A bare slash after "Disallow:" tells the robot not to visit any pages on the site.
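If you want to check these rules programmatically, Python's standard library ships a parser for exactly this format. A minimal sketch (the rules and URLs below are made-up examples) using `urllib.robotparser`:

```python
from urllib import robotparser

# A made-up robots.txt: block all robots from /private/, allow the rest.
rules = """
User-agent: *
Disallow: /private/
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

# can_fetch(useragent, url) answers: may this robot crawl this URL?
print(rp.can_fetch("*", "https://example.com/index.html"))  # True
print(rp.can_fetch("*", "https://example.com/private/x"))   # False
```

The same class can also fetch a live file with `set_url()` and `read()` instead of `parse()`, which is how a real crawler would typically use it.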


Archived

This topic is now archived and is closed to further replies.
