
What is robots.txt?


handmaderug


A robots.txt file tells search engine crawlers which pages or files the crawler can or can't request from your site. It is used mainly to avoid overloading your site with requests; it is not a mechanism for keeping a web page out of Google Search. To keep a page out of Google Search, use a noindex directive or password-protect the page.
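
For example, a minimal robots.txt placed at the site root (the filename and location are fixed by the standard; the paths below are hypothetical) might look like this:

    # Rules for all crawlers ("*" matches any user agent)
    User-agent: *
    Disallow: /admin/
    Disallow: /tmp/

    # Optional: point crawlers at the sitemap
    Sitemap: https://www.example.com/sitemap.xml

Each User-agent group names the crawlers the rules apply to, and each Disallow line lists a path prefix those crawlers should not request. Note that compliance is voluntary: well-behaved crawlers like Googlebot honor these rules, but robots.txt does not enforce anything, and a disallowed page can still end up indexed if other sites link to it. To actually keep a page out of the index, the noindex rule goes in the page itself, e.g. <meta name="robots" content="noindex"> in the page's <head>.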

