It comes in the form of an addition to the Fetch as Google tool, which lets you see how Googlebot renders a page. To use it, submit a URL with the “Fetch and render” option in the Fetch as Google feature, found under Crawl in Webmaster Tools.
“Googlebot follows the robots.txt directives for all files that it fetches,” Salant explains. “If you are disallowing crawling of some of these files (or if they are embedded from a third-party server that’s disallowing Googlebot’s crawling of them), we won’t be able to show them to you in the rendered view. Similarly, if the server fails to respond or returns errors, then we won’t be able to use those either (you can find similar issues in the Crawl Errors section of Webmaster Tools). If we run across either of these issues, we’ll show them below the preview image.”
To get the most out of the new tool, Google recommends making sure Googlebot can access any embedded resource that meaningfully contributes to your site’s visible content or layout. Resources that don’t “meaningfully contribute” — such as social media buttons, some fonts, and analytics scripts — can be left out, and Google says these can remain disallowed from crawling.
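As a rough sketch of that advice, a robots.txt along these lines (the paths are hypothetical) would keep stylesheets and scripts that affect rendering crawlable while leaving analytics disallowed:

```
User-agent: Googlebot
# Resources that shape visible content/layout stay crawlable
Allow: /css/
Allow: /js/
# Analytics doesn't meaningfully contribute to the rendered page
Disallow: /analytics/
```

With a setup like this, the rendered preview in Fetch as Google should match what visitors see, since none of the blocked files affect the page’s appearance.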
Image via Google