
How is robots.txt useful for SEO?


rscomponentseo

Recommended Posts

The robots.txt file, also known as the robots exclusion protocol or standard, is a plain text file that tells web robots (most often search engine crawlers) which pages on your site to crawl and which pages to skip. Suppose a search engine is about to visit a site: before it requests any page, it checks the site's robots.txt file for instructions on which URLs it is allowed to crawl.
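
For illustration, here is a minimal robots.txt sketch. The example.com domain, the /admin/ and /cart/ paths, and the sitemap URL are placeholder assumptions, not details from the post. The file has to sit at the root of the site (e.g. https://www.example.com/robots.txt):

# Rules below apply to all crawlers
User-agent: *
# Keep private or low-value areas out of the crawl
Disallow: /admin/
Disallow: /cart/
# Point crawlers at the XML sitemap
Sitemap: https://www.example.com/sitemap.xml

One caveat worth noting: Disallow only controls crawling, not indexing, so a disallowed URL can still show up in search results if other sites link to it.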


