
Explain Spiders, Robots, and Crawlers


smartlinktechlab


A crawler is also known as a robot, bot, or spider. These are programs used by search engines to explore the Internet and automatically download the web content available on websites. Many sites, in particular search engines, use crawlers to maintain an up-to-date database of pages.
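To make the idea concrete, here is a bare-bones sketch of the download step in Python, using only the standard library. The URL and the dictionary standing in for the search engine's database are placeholders, not how any real engine stores pages.

# Bare-bones download step: fetch one page and store it, keyed by URL.
# "database" is just a dict standing in for the crawler's real storage.
from urllib.request import urlopen

database = {}
url = "https://example.com/"                     # placeholder seed URL
with urlopen(url, timeout=10) as response:
    database[url] = response.read().decode("utf-8", errors="replace")
print(f"stored {len(database[url])} characters from {url}")

A real crawler repeats this in a loop, pulling the next URL from a queue of links discovered on previously downloaded pages.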

 


Spiders and crawlers are responsible for fetching and indexing the pages that appear in a search engine's results. Googlebot is Google's crawler.

Web crawlers go through web pages, look for relevant keywords, hyperlinks, and content, and bring that information back to the search engine's servers for indexing.
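As a rough illustration of that scanning step, the Python sketch below pulls hyperlinks and word frequencies out of a page with the standard library's html.parser. Real crawlers do far more (ranking signals, canonicalization, politeness rules), and the sample HTML here is made up.

from collections import Counter
from html.parser import HTMLParser

class PageScanner(HTMLParser):
    # Collects hyperlinks (to crawl next) and word counts (crude keywords).
    def __init__(self):
        super().__init__()
        self.links = []
        self.words = Counter()

    def handle_starttag(self, tag, attrs):
        if tag == "a":                            # hyperlinks to follow later
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

    def handle_data(self, data):                  # visible text -> keyword tally
        self.words.update(w.lower() for w in data.split() if w.isalpha())

scanner = PageScanner()
scanner.feed("<p>Crawlers scan pages. <a href='/about'>About us</a></p>")
print(scanner.links)                              # ['/about']
print(scanner.words.most_common(3))               # rough keyword frequencies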

Robots have the same functionality. You can also block a particular page of a website from being crawled with the help of a robots.txt file.
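For example, a robots.txt containing the two rules below asks every crawler to stay out of a /private/ directory; Python's standard urllib.robotparser can check a URL against such rules. The domain and paths are hypothetical.

from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules: every crawler is asked to skip /private/.
rules = [
    "User-agent: *",
    "Disallow: /private/",
]
rp = RobotFileParser()
rp.parse(rules)
print(rp.can_fetch("*", "https://example.com/private/page.html"))  # False (blocked)
print(rp.can_fetch("*", "https://example.com/index.html"))         # True (allowed)

Note that robots.txt is advisory: well-behaved crawlers such as Googlebot honor it, but it is not an access-control mechanism.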


