2 replies
A crawler is a program that visits Web sites and reads their pages and other information in order to create entries for a search engine index. The major search engines on the Web all have such a program, which is also known as a "spider" or a "bot." Crawlers are typically programmed to visit sites that have been submitted by their owners as new or updated. Entire sites or specific pages can be selectively visited and indexed. Crawlers apparently gained the name because they crawl through a site a page at a time, following the links to other pages on the site until all pages have been read.
#crawler
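The "page at a time, following the links" behavior described above can be sketched as a breadth-first traversal. This is a minimal illustration only: it uses a hypothetical in-memory site (a dict mapping page names to their links) instead of real HTTP fetching and HTML parsing, which a real crawler like Googlebot would need.

```python
from collections import deque

# Hypothetical in-memory "site": page name -> links found on that page.
# A real crawler would fetch each URL over HTTP and extract links from the HTML.
SITE = {
    "index": ["about", "blog"],
    "about": ["index"],
    "blog": ["index", "post1"],
    "post1": ["blog"],
}

def crawl(start):
    """Breadth-first crawl: read a page, follow its links,
    until all reachable pages have been read exactly once."""
    seen = set()
    queue = deque([start])
    visited = []  # pages in visit order (what would feed the search index)
    while queue:
        page = queue.popleft()
        if page in seen:
            continue
        seen.add(page)
        visited.append(page)
        for link in SITE.get(page, []):
            if link not in seen:
                queue.append(link)
    return visited

print(crawl("index"))  # → ['index', 'about', 'blog', 'post1']
```

The `seen` set is the key detail: without it, pages that link back to each other (as `index` and `about` do here) would be crawled forever.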
  • Profile picture of the author socialbookmark
    Yes. There is a good article that explains it well here:
    Code:
    en.wikipedia.org/wiki/Web_crawler
    Warriorforum has many professional users who are already familiar with this basic information. So it would be better to write about new things that users can learn more from.
  • Profile picture of the author jackjohns
    A web crawler is how search engines and other services keep their databases up to date. Web crawlers also keep a copy of the visited pages for later processing, such as indexing and ranking.
