Adsense web, Tools, PLR articles, Ebooks SEBENAGHAU: Spidering


Friday, October 2, 2009

Spidering

Before a search engine can tell you where a file or document
is, that file must first be found. To find information on the
hundreds of millions of Web pages that exist, a search engine
employs special software robots, called spiders, to build
lists of the words found on Web sites.
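The word lists a spider builds are essentially an inverted index: each word maps to the pages it appears on. A minimal sketch of that idea in Python, using hypothetical page text in place of actually fetched documents:

```python
import re
from collections import defaultdict

def build_word_index(pages):
    """Map each word to the set of URLs whose text contains it."""
    index = defaultdict(set)
    for url, text in pages.items():
        for word in re.findall(r"[a-z']+", text.lower()):
            index[word].add(url)
    return index

# Hypothetical pages standing in for documents a spider has fetched.
pages = {
    "http://example.com/a": "Search engines use spiders to crawl the Web",
    "http://example.com/b": "A spider builds lists of words found on pages",
}
index = build_word_index(pages)
```

With an index like this, answering "which pages contain this word?" is a single dictionary lookup, which is what makes later searches fast.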

When a spider is building its lists, the process is called
Web crawling.

In order to build and maintain a useful list of words, a
search engine's spiders have to look at a lot of pages. How
does any spider start its travels over the Web? The usual
starting points are lists of heavily used servers and very
popular pages. The spider will begin with a popular site,
indexing the words on its pages and following every link
found within the site. In this way, the spidering system
quickly begins to travel, spreading out across the most
widely used portions of the Web.
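The traversal described above is essentially a breadth-first search over the Web's link graph, seeded with popular pages. A minimal sketch, using a hypothetical in-memory link graph in place of live fetches:

```python
from collections import deque

def crawl(seeds, links, limit=100):
    """Breadth-first crawl: start from the seed URLs, follow every
    newly discovered link, and return pages in visit order."""
    seen = set(seeds)          # URLs already discovered
    queue = deque(seeds)       # frontier of pages still to visit
    visited = []
    while queue and len(visited) < limit:
        url = queue.popleft()
        visited.append(url)    # a real spider would index words here
        for nxt in links.get(url, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return visited

# Hypothetical link graph standing in for the live Web.
links = {
    "http://popular.example": ["http://a.example", "http://b.example"],
    "http://a.example": ["http://c.example", "http://popular.example"],
    "http://b.example": [],
}
order = crawl(["http://popular.example"], links)
```

The `seen` set prevents the spider from revisiting pages it has already queued, which matters because the Web's link graph is full of cycles; the `limit` parameter is a stand-in for the politeness and resource budgets real crawlers enforce.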
