Should You Block Google From Crawling Your Slower Pages?

Written by Chris Crum
Google’s head of web spam, Matt Cutts, has put out a new video discussing site speed’s impact on rankings. This is not the first time Cutts has addressed the issue, but it’s a somewhat different take than we’ve seen before, as it comes in direct response to the following user-submitted question:

    You mentioned that site speed is a factor in ranking. On some pages, our site uses complex queries to return the user’s request, giving a slow page time. Should we not allow Googlebot to index these pages to improve our overall site speed?
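For context, “not allowing” Googlebot to reach a set of pages is typically done with a Disallow rule in robots.txt. The sketch below is purely illustrative; the /search/ and /reports/ paths are hypothetical stand-ins for the kind of slow, query-heavy URLs the question describes:

    # Hypothetical robots.txt sketch: keep Googlebot away from slow, query-heavy pages.
    # The /search/ and /reports/ paths are illustrative examples, not from the article.
    User-agent: Googlebot
    Disallow: /search/
    Disallow: /reports/

    # Leave all other crawlers unrestricted.
    User-agent: *
    Disallow: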

“I would say, in general, I would let Googlebot crawl the same pages that users see,” says Cutts. “The rule of thumb is this: only something like 1 out of 100 searches are affected by our page speed mechanism that says things that are too slow rank lower. And if it’s 1 out of 100 searches, that’s 1 out of roughly 1,000 websites. So if you really think that you might be in the 1 out of 1,000, that you’re the slowest, then maybe that’s something to consider.”

“But in general, most of the time, as long as your browser isn’t timing out, as long as it’s not starting to be flaky, you should be in relatively good shape,” he continues. “You might, however, think about the user experience. If users have to wait 8, 9, 10, 20 seconds in order to get a page back, a lot of people don’t stick around that long. So there’s a lot of people that will do things like cache results and then compute them on the fly later. And you can fold in the new results.”

“But if it’s at all possible to pre-compute the results, or cache them, or do some sort of way to speed things up, that’s great for users,” Cutts says. “Typically, as long as there are just a few pages that are very slow, or if the site overall is fast, it’s not the kind of thing that you need to worry about. So you might want to pay attention to making it faster just for the user experience. But it sounds like I wouldn’t necessarily block those slower pages from Googlebot unless you’re worried that you’re in one of those 1 out of 1,000, where you’re really, really the outlier in terms of not being the fastest possible site.”
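As a rough illustration of the cache-or-pre-compute approach Cutts describes, here is a minimal Python sketch that memoizes an expensive query for a few minutes. The function name run_complex_query, the five-minute TTL, and the in-memory cache are all hypothetical stand-ins, not anything Google specifically recommends:

    import time

    # Minimal sketch of the "cache results, recompute later" idea Cutts describes.
    # run_complex_query is a hypothetical stand-in for a slow database query.
    CACHE = {}          # query key -> (timestamp, result)
    TTL_SECONDS = 300   # serve cached results for up to five minutes

    def run_complex_query(key):
        time.sleep(2)   # simulate a slow, complex query
        return f"results for {key}"

    def get_results(key):
        now = time.time()
        cached = CACHE.get(key)
        if cached and now - cached[0] < TTL_SECONDS:
            return cached[1]            # fast path: serve the pre-computed result
        result = run_complex_query(key) # slow path: compute and store the fresh result
        CACHE[key] = (now, result)
        return result

    # First call is slow (~2 seconds); repeat calls within five minutes return instantly.
    print(get_results("top-products"))
    print(get_results("top-products"))

In practice the cache would more likely live in something like memcached or Redis and be refreshed in the background, which is one way to read Cutts’s remark about computing results “on the fly later” and folding them in.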

In November, we referenced another video Cutts did talking about page speed, where he also dropped the “1 out of 100 searches” stat. He said basically not to overly stress about speed as a ranking factor. Both the new video and that video were actually uploaded to YouTube in August, so this advice is already older than it appears. Today’s video, however, was just made public by Google, so it stands to reason that the advice from the company remains the same.
