Matt Cutts Talks About How Google Handles Ajax

Written by Chris Crum
Google’s Matt Cutts put up a new Webmaster Help video, discussing how Google deals with Ajax. He takes on the following user-submitted question:

    How effective is Google now at handling content supplied via Ajax, and is this likely to improve in the future?

“Well, let me take Ajax, which is Asynchronous JavaScript, and make it just JavaScript for the time being,” says Cutts. “Google is getting more effective over time, so we actually have the ability not just to scan in strings of JavaScript to look for URLs, but to actually process some of the JavaScript. And so that can help us improve our crawl coverage quite a bit, especially if people use JavaScript to help with navigation or drop-downs or those kinds of things. So Asynchronous JavaScript is a little bit more complicated, and that’s maybe further down the road, but the common case is JavaScript.”
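
To make the navigation case concrete, here is a minimal sketch (not from the video; the element id and paths are invented) of a JavaScript drop-down whose destination URLs appear only inside script:

    // Hypothetical drop-down navigation of the kind Cutts describes.
    // The destination URLs exist only as JavaScript strings, not as
    // <a href> links in the HTML, so a crawler that cannot read or
    // execute script never discovers them.
    var sections = ["/news/", "/business/", "/technology/"];

    document.getElementById("section-nav").onchange = function () {
        // Navigate to the URL matching the selected <option>.
        window.location.href = sections[this.selectedIndex];
    };

A crawler that merely scans the script as text can still pull /news/, /business/, and /technology/ out of the string array, which is the “scan in strings of JavaScript to look for URLs” ability Cutts mentions; actually executing the script goes a step further.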

“And we’re getting better, and we’re continuing to improve how well we’re able to process JavaScript,” he continues. “In fact, let me just take a little bit of time and mention, if you block JavaScript or CSS in your robots.txt, where Googlebot can’t crawl it, I would change that. I would recommend making it so that Googlebot can crawl the JavaScript and can crawl the CSS, because that makes it a lot easier for us to figure out what’s going on if we’re processing the JavaScript or if we’re seeing and able to process and get a better idea of what the page is like.”

As a matter of fact, Cutts actually put out a separate video about this last month, in which he said, “If you block Googlebot from crawling JavaScript or CSS, please take a few minutes and take that out of the robots.txt and let us crawl the JavaScript. Let us crawl the CSS, and get a better idea of what’s going on on the page.”

“So I absolutely would recommend trying to check through your robots.txt, and if you have disallow slash JavaScript, or star JS, or star CSS, go ahead and remove that, because that helps Googlebot get a better idea of what’s going on on the page,” he reiterates in the new video.
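
For reference, the patterns he is describing would look something like this in a robots.txt file (the paths here are hypothetical; Googlebot supports the * wildcard and the $ end-of-URL anchor in Disallow rules):

    # Hypothetical robots.txt entries that keep Googlebot away from
    # scripts and stylesheets. Cutts's advice is to delete lines like these.
    User-agent: Googlebot
    Disallow: /javascript/
    Disallow: /*.js$
    Disallow: /*.css$

Removing the three Disallow lines (or switching them to Allow rules) lets Googlebot fetch the script and style files along with the HTML.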

In another new video, Cutts talks about why Google won’t remove pages from its index at your request.
