How Googlebot handles crawling after major website changes

In a Google Webmaster Hangout, Google’s John Mueller explained how Googlebot handles crawling after major website changes:

“If we recognize that there are significant changes on the website, we will just try to crawl the known URLs a little bit faster.

So we have a bunch of URLs that we already know from your website, and we basically decide: ‘Oh, we want to make sure that our index is as fresh as possible. Therefore, we will take this list of URLs and crawl them as quickly as we can.’ It depends on your server, what we think your server can take. But we’ll try to get through that a little bit faster than normal.

That’s particularly the case where we find significant changes across a site with regards to maybe structured data or with the URL choices, with rel=canonical, redirects, those kinds of things.”

This basically means that there can be increased Googlebot crawling activity after a major website change. You can watch the video here:
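If you want to see whether this happens on your own site, you can count Googlebot requests per day in your server’s access log. The following is a minimal sketch, assuming a standard combined-format log file named “access.log”; the file name and log layout are assumptions, so adjust them to match your server setup:

```python
import re
from collections import Counter
from datetime import datetime

# Minimal sketch: count requests per day whose User-Agent claims to be
# Googlebot. Assumes a combined-format access log; the file name is a
# placeholder, not part of Mueller's statement.
LOG_FILE = "access.log"

# Matches the [day/month/year:...] timestamp in a combined log line.
DATE_RE = re.compile(r"\[(\d{2}/\w{3}/\d{4})")

daily_hits = Counter()
with open(LOG_FILE, encoding="utf-8", errors="replace") as log:
    for line in log:
        # Note: the User-Agent string can be spoofed. For a reliable
        # check, also verify the requesting IP with a reverse DNS
        # lookup that resolves to googlebot.com or google.com.
        if "Googlebot" in line:
            match = DATE_RE.search(line)
            if match:
                day = datetime.strptime(match.group(1), "%d/%b/%Y").date()
                daily_hits[day] += 1

# A clear jump in these numbers shortly after a site-wide change is the
# increased crawling activity Mueller describes.
for day, hits in sorted(daily_hits.items()):
    print(f"{day}: {hits} Googlebot requests")
```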

Help Googlebot to crawl your pages

If you want to make sure that Googlebot can parse your pages as easily as possible, check them with the website audit tool:

Check your pages
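Independently of any particular tool, you can also run a quick crawlability check yourself. The sketch below uses Python’s standard urllib.robotparser module to test whether Googlebot is allowed to fetch a handful of URLs; the domain and URL list are placeholder assumptions:

```python
from urllib import robotparser

# Quick crawlability check: is Googlebot allowed to fetch these URLs?
# The domain and URL list are placeholders; substitute your own.
SITE = "https://www.example.com"
URLS = [SITE + "/", SITE + "/products/", SITE + "/blog/"]

parser = robotparser.RobotFileParser()
parser.set_url(SITE + "/robots.txt")
parser.read()  # downloads and parses the live robots.txt

for url in URLS:
    allowed = parser.can_fetch("Googlebot", url)
    print(f"{'OK     ' if allowed else 'BLOCKED'} {url}")
```

This only checks robots.txt rules; pages can still be kept out of the index by other means, such as noindex meta tags.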

Tom Cassy

Tom Cassy is the CEO of SEOprofiler. He blogs about search engine optimization and website marketing topics at http://blog.seoprofiler.com.