Spiders

From S23Wiki

WebSpider:

What Is a Web Spider anyway?

A Web spider, also known as a robot, a worm, a wanderer, or a webcrawler, is a computer program that traverses the Web's hypertext structure: it retrieves a document, then recursively retrieves every document that document references. The program fetches pages from remote sites using the standard Web protocol, the Hypertext Transfer Protocol (HTTP).
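The traversal described above can be sketched as a small program. This is a minimal illustration, not a production crawler: instead of issuing real HTTP requests, it walks a hypothetical in-memory "web" (the `PAGES` dictionary, an assumption made for the example) so the recursive retrieve-and-follow-links loop is easy to see. Link extraction uses Python's standard `html.parser` module.

```python
from html.parser import HTMLParser

# Hypothetical in-memory "web": maps each URL to the HTML it serves.
# A real spider would fetch these pages over HTTP instead.
PAGES = {
    "http://example.com/":  '<a href="http://example.com/a">A</a>',
    "http://example.com/a": '<a href="http://example.com/b">B</a>',
    "http://example.com/b": '<a href="http://example.com/">home</a>',
}

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag in a document."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start_url, fetch):
    """Traverse the hypertext structure: retrieve a document, then
    recursively retrieve every document it references, visiting
    each URL only once to avoid looping forever on cycles."""
    visited = set()

    def visit(url):
        if url in visited:
            return
        visited.add(url)
        html = fetch(url)           # in a real spider: an HTTP GET
        if html is None:            # dead link / unreachable page
            return
        parser = LinkExtractor()
        parser.feed(html)
        for link in parser.links:
            visit(link)             # recurse into referenced documents

    visit(start_url)
    return visited

if __name__ == "__main__":
    found = crawl("http://example.com/", PAGES.get)
    print(sorted(found))
```

Note the `visited` set: because hypertext is a graph with cycles (page B links back to the home page here), a spider must remember which URLs it has already retrieved, or the recursion would never terminate.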

The terms "spider" and "worm," however, may give the impression that the program physically moves from one site to another and multiplies itself as it moves. Those are characteristics of a virus; in reality the program runs on a single machine and simply issues requests to remote sites, which is why "robot" may be a better name for a program that traverses the Web to retrieve information. The terms "spider" and "robot" are used interchangeably in this article.