What is a Crawler?

Crawler: – A crawler, also known as a "bot" or a "spider", is the part of a search engine that wanders the web, following links and collecting information for the engine's database. Crawlers do most of their work at times of day when search engines are less busy, but they typically visit frequently updated pages more often.

Another definition of crawler: – A crawler is a program that visits websites and reads their pages and other information in order to create entries for a search engine's index. There are several search engines on the web, and each has such a program, likewise known as a "crawler" or a "bot". Crawlers are typically directed to sites that have been submitted by their owners as new or updated, and entire sites or specific pages can be selectively visited and indexed. A crawler reads one page at a time, following links to reach other pages of the site until all pages have been read.
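The "read one page, follow its links until all pages have been read" behaviour described above is essentially a breadth-first traversal of the link graph. A minimal sketch in Python, where `fetch_links` is a hypothetical callback standing in for the real download-and-parse step (a real crawler would fetch the page over HTTP, parse the HTML, and respect robots.txt and crawl delays):

```python
from collections import deque

def crawl(seed_url, fetch_links, max_pages=100):
    """Breadth-first crawl starting from seed_url.

    fetch_links(url) is assumed to return the list of links found on
    that page; here it abstracts away downloading and HTML parsing.
    Returns a dict mapping each visited URL to its outgoing links --
    a stand-in for the entries a crawler creates for the index.
    """
    visited = set()
    frontier = deque([seed_url])   # pages discovered but not yet read
    index = {}
    while frontier and len(visited) < max_pages:
        url = frontier.popleft()
        if url in visited:         # a page may be discovered via many links
            continue
        visited.add(url)
        links = fetch_links(url)   # "read one page at a time"
        index[url] = links
        for link in links:         # "following links to other pages"
            if link not in visited:
                frontier.append(link)
    return index
```

With a small in-memory link graph in place of real web pages, the crawl visits every reachable page exactly once, which mirrors the behaviour described above.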

Web Crawler Architecture