Crawling December: The how and why of Googlebot crawling
You may have heard that Google Search needs to do a bit of work before a web page can show up in search results. One of those steps is called crawling. Crawling for Google Search is done by Googlebot, a program running on Google servers that retrieves a URL and handles things like network errors, redirects, and other small complications it might encounter as it works its way through the web. But there are a few details that aren't often talked about, and each week this month we're going to explore some of them, because they can have a significant effect on how your sites are crawled.
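To make the fetching step a bit more concrete, here's a minimal sketch of what retrieving a single URL while coping with redirects and network errors might look like. This is purely illustrative and is not how Googlebot is actually implemented; the `fetch` function, retry count, and timeout are all assumptions for the example.

```python
# Illustrative sketch only: fetch one URL, follow redirects, and retry a few
# times on transient network errors. Not Googlebot's actual implementation.
import urllib.error
import urllib.request


def fetch(url: str, max_attempts: int = 3) -> bytes | None:
    """Fetch a URL, retrying on network-level errors.

    urllib follows HTTP 3xx redirects automatically, so a redirected URL
    resolves to the final response body.
    """
    for attempt in range(1, max_attempts + 1):
        try:
            with urllib.request.urlopen(url, timeout=10) as response:
                return response.read()
        except urllib.error.HTTPError as err:
            # A definitive answer from the server (e.g. 404, 500): stop here.
            print(f"HTTP {err.code} for {url}")
            return None
        except urllib.error.URLError as err:
            # Network-level problem (DNS failure, timeout, refused connection).
            print(f"Attempt {attempt} failed: {err.reason}")
    return None


if __name__ == "__main__":
    body = fetch("https://example.com/")
    if body is not None:
        print(f"Fetched {len(body)} bytes")
```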
