Google announced that it is deprecating its AJAX crawling scheme, and webmasters need to be aware of what has changed to ensure Google is crawling their sites correctly and effectively.
Does this have any bearing on your site? Let us know in the comments.
Specifically, Google is no longer recommending the AJAX crawling proposal it made six years ago. The proposal was intended to benefit webmasters and users by making content from rich, interactive AJAX-based sites universally accessible through search results. Google said it believed this would “significantly improve the web.”
The technology has improved a great deal in six years, as you would probably expect (for perspective, 2009 was the year of the iPhone 3GS).
Last year, Google wrote a blog post explaining that its systems had gotten better at rendering and understanding web pages. It also offered some information on practices that could lead to a negative impact on search results for your site.
“For example, you can use the History API pushState() to ensure accessibility for a wider range of browsers (and our systems),” says Google Search Quality Analyst Kazushi Nagayama.
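To make the pushState() suggestion concrete, here’s a minimal sketch (in TypeScript) of the kind of progressive-enhancement navigation Google is pointing at: links are plain, crawlable URLs that still work without JavaScript, while capable browsers fetch content in place and update the address bar. The renderArticle() helper and the data-ajax attribute are our own hypothetical names, not anything Google prescribes.

```typescript
// Hypothetical helper: swap the fetched HTML into the page's main container.
function renderArticle(html: string): void {
  document.querySelector("main")!.innerHTML = html;
}

// Intercept clicks on opted-in links; crawlers and old browsers just follow
// the ordinary href, so the content stays accessible either way.
document.addEventListener("click", (event) => {
  const link = (event.target as HTMLElement).closest("a[data-ajax]");
  if (!(link instanceof HTMLAnchorElement)) return;

  event.preventDefault();
  fetch(link.href)
    .then((response) => response.text())
    .then((html) => {
      renderArticle(html);                    // update content in place
      history.pushState(null, "", link.href); // clean, crawlable URL
    });
});

// Keep the UI in sync with the URL when the user navigates back/forward.
window.addEventListener("popstate", () => {
  fetch(location.href)
    .then((response) => response.text())
    .then(renderArticle);
});
```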
Nagayama shares a few Qs and As related to all of this, which should help webmasters better understand the preferred approach:
Q: My site currently follows your recommendation and supports _escaped_fragment_. Would my site stop getting indexed now that you’ve deprecated your recommendation?
A: No, the site would still be indexed. In general, however, we recommend you implement industry best practices when you’re making the next update for your site. Instead of the _escaped_fragment_ URLs, we’ll generally crawl, render, and index the #! URLs.
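For context, the deprecated scheme worked by having crawlers translate “pretty” #! (hashbang) URLs into an _escaped_fragment_ equivalent they could actually fetch; that translation is what Google is now doing without. Here’s a rough sketch of the mapping, assuming a hypothetical helper name (and noting that encodeURIComponent only approximates the scheme’s exact escaping rules):

```typescript
// Sketch of the legacy mapping: a crawler seeing a #! URL would request the
// _escaped_fragment_ equivalent from the server instead.
function toEscapedFragmentUrl(hashbangUrl: string): string {
  const [base, fragment = ""] = hashbangUrl.split("#!");
  const separator = base.includes("?") ? "&" : "?";
  return `${base}${separator}_escaped_fragment_=${encodeURIComponent(fragment)}`;
}

// e.g. "https://example.com/app#!products/42" becomes
// "https://example.com/app?_escaped_fragment_=products%2F42"
console.log(toEscapedFragmentUrl("https://example.com/app#!products/42"));
```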
Q: Is moving away from the AJAX crawling proposal to industry best practices considered a site move? Do I need to implement redirects?
A: If your current setup is working fine, you should not have to immediately change anything. If you’re building a new site or restructuring an already existing site, simply avoid introducing _escaped_fragment_ URLs.
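One practical wrinkle if you do restructure away from #! URLs: the fragment is never sent to the server, so any redirect from old hashbang links has to happen client-side. Here’s a hedged sketch, assuming a site whose fragments map directly onto clean paths (your mapping will differ):

```typescript
// Client-side redirect from a legacy #! URL to its clean-path equivalent.
// The server never sees the fragment, so this must run in the browser.
if (location.hash.startsWith("#!")) {
  const cleanPath = "/" + location.hash.slice(2); // "#!products/42" -> "/products/42"
  location.replace(cleanPath);                    // full navigation to the clean URL
}
```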
A: In general, websites shouldn’t pre-render pages only for Google — we expect that you might pre-render pages for performance benefits for users and that you would follow progressive enhancement guidelines. If you pre-render pages, make sure that the content served to Googlebot matches the user’s experience, both how it looks and how it interacts. Serving Googlebot different content than a normal user would see is considered cloaking, and would be against our Webmaster Guidelines.
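To illustrate the “same content for everyone” principle, here’s a small sketch of a server that serves one pre-rendered snapshot to every user agent, Googlebot included; renderPage() is a hypothetical stand-in for whatever pre-renderer a site actually uses. The key design point is the absence of any user-agent branch that could hand Googlebot different markup.

```typescript
// Sketch: one pre-rendered response for all clients, no User-Agent sniffing.
import { createServer } from "node:http";

function renderPage(path: string): string {
  // Hypothetical: a real app might run its JS framework server-side here.
  return `<!doctype html><html><body><main>Content for ${path}</main></body></html>`;
}

createServer((req, res) => {
  const html = renderPage(req.url ?? "/"); // same output regardless of who asks
  res.writeHead(200, { "Content-Type": "text/html; charset=utf-8" });
  res.end(html);
}).listen(8080);
```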
Are you already doing things the right way, or do you need to make changes based on what Google had to say this week? Let us know in the comments.