Crawling is the process by which Googlebot discovers new and updated pages to be added to the Google index. We use a huge set of computers to fetch (or "crawl") billions of pages on the web. The program that does the fetching is called Googlebot (also known as a robot, bot, or spider).
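Before fetching a page, a well-behaved crawler such as Googlebot first checks the site's robots.txt rules to see whether it is allowed to crawl that URL. As a rough sketch of that politeness step, the snippet below parses a hypothetical robots.txt (the rules and URLs are illustrative, not from any real site) using Python's standard library:

```python
from urllib import robotparser

# Hypothetical robots.txt rules for illustration only.
rules = """
User-agent: Googlebot
Disallow: /private/
Allow: /
""".splitlines()

# Parse the rules and ask whether a given URL may be fetched,
# mimicking the check a crawler performs before downloading a page.
rp = robotparser.RobotFileParser()
rp.parse(rules)

print(rp.can_fetch("Googlebot", "https://example.com/index.html"))  # True
print(rp.can_fetch("Googlebot", "https://example.com/private/x"))   # False
```

A real crawler would then download allowed pages, extract their links, and queue those links for later crawling, repeating the cycle at massive scale.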