
What is the purpose of using robots.txt?


Papon


The robots exclusion standard, also known as the robots exclusion protocol or simply robots.txt, is a standard used by websites to communicate with web crawlers and other web robots. The standard specifies how to inform the web robot about which areas of the website should not be processed or scanned.
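For example, a minimal robots.txt, served from the root of the domain (e.g. https://example.com/robots.txt), might look like the sketch below; the paths and sitemap URL are placeholders, not taken from any real site:

User-agent: *
Disallow: /admin/
Disallow: /tmp/

Sitemap: https://example.com/sitemap.xml

Here "User-agent: *" addresses all crawlers, each "Disallow" line names a path that should not be crawled, and the optional "Sitemap" line points crawlers to the site's sitemap.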
 


Hello,

Robots.txt is a standard used by websites to communicate with web crawlers and other web robots.

Robots.txt is a plain text file. Through this file, a website gives instructions to search engine crawlers about the crawling, indexing, and caching of an individual webpage or file, a directory, or the entire domain.
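As a quick sketch of how a well-behaved crawler honours this file, here is a short Python example using the standard library's urllib.robotparser; the domain and the user-agent name "MyCrawler" are placeholder assumptions:

from urllib import robotparser

# Fetch and parse the site's robots.txt (example.com is a placeholder)
rp = robotparser.RobotFileParser()
rp.set_url("https://example.com/robots.txt")
rp.read()  # performs the HTTP request and parses the rules

# Check whether a given user-agent may fetch a given URL
url = "https://example.com/admin/page.html"
if rp.can_fetch("MyCrawler", url):
    print("robots.txt allows crawling", url)
else:
    print("robots.txt disallows crawling", url)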


Archived

This topic is now archived and is closed to further replies.
