Web crawler


Postby flowrencemary on Wed Apr 29, 2009 11:01 am

A Web crawler is a computer program that browses the World Wide Web in a methodical, automated manner. Other terms for Web crawlers include ants, automatic indexers, bots, and worms[1], as well as Web spiders, Web robots, or, especially in the FOAF community, Web scutters[2].

This process is called Web crawling or spidering. Many sites, in particular search engines, use spidering as a means of providing up-to-date data. Web crawlers are mainly used to create a copy of all the visited pages for later processing by a search engine, which indexes the downloaded pages to provide fast searches. Crawlers can also be used to automate maintenance tasks on a Web site, such as checking links or validating HTML code, and to gather specific types of information from Web pages, such as harvesting e-mail addresses (usually for spam).
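To make the "fetch, index, follow links" loop concrete, here is a minimal sketch of a crawler's core in Python, using only the standard library. The `fetch` callable is a hypothetical stand-in for a real HTTP download (e.g. one built on urllib.request), so the loop can be shown without network access; a real crawler would also honour robots.txt and rate limits.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin


class LinkExtractor(HTMLParser):
    """Collects the href targets of <a> tags, resolved against base_url."""

    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(urljoin(self.base_url, value))


def crawl(start_url, fetch, max_pages=100):
    """Breadth-first crawl: fetch a page, store it, queue unseen links.

    fetch(url) -> str is supplied by the caller (hypothetical here);
    the returned dict maps each visited URL to its HTML, i.e. the
    "copy of all the visited pages" a search engine would index.
    """
    seen = {start_url}
    queue = [start_url]
    index = {}
    while queue and len(index) < max_pages:
        url = queue.pop(0)
        html = fetch(url)           # download the page
        index[url] = html           # store it for later processing
        parser = LinkExtractor(url)
        parser.feed(html)
        for link in parser.links:   # enqueue links not yet visited
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return index
```

For example, crawling a three-page site served from a dictionary instead of the network visits every reachable page exactly once, because the `seen` set prevents re-queuing.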

seo india
Registered User
Posts: 8
Joined: Wed Apr 15, 2009 10:44 am
