(1) Also called a "spider," "robot" (bot) or "intelligent agent," a crawler is a program that search engines such as Google and Bing use to index the pages of a website by following the links from page to page. Crawlers start with a list of known sites and go from there. The search engine summarizes the content of each page and adds the page's links to its index.
Crawlers are also used to locate Web pages that sell a particular product or to find blogs that have opinions about a product. See surface Web and bot.
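The crawl-and-index cycle described above can be sketched in a few lines of Python. This is a minimal illustration, not how any real search engine is implemented: it crawls a small hypothetical in-memory "website" (a dictionary mapping page paths to HTML) instead of fetching pages over HTTP, and its "index" simply records each page's outgoing links.

```python
from html.parser import HTMLParser

# Hypothetical in-memory website: page path -> HTML content.
# A real crawler would fetch these pages over HTTP.
SITE = {
    "/":         '<a href="/products">Products</a> <a href="/blog">Blog</a>',
    "/products": '<a href="/">Home</a>',
    "/blog":     '<a href="/products">Products</a>',
}

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

def crawl(start):
    """Visit pages breadth-first, following links and indexing each page once."""
    queue, index = [start], {}
    while queue:
        page = queue.pop(0)
        if page in index or page not in SITE:
            continue                      # already indexed, or off-site
        parser = LinkExtractor()
        parser.feed(SITE[page])
        index[page] = parser.links        # record the page's outgoing links
        queue.extend(parser.links)        # follow links to pages not yet seen
    return index

index = crawl("/")                        # starts from one known page
```

Starting from the single known page "/", the crawler discovers and indexes "/products" and "/blog" by following links, which is the essence of how a whole site gets indexed from a seed list.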
(2) Software that captures a website for browsing offline. See offline browser.