As Google explains, doing so provides optimal rendering and indexing for your site, because disallowing crawling of JavaScript or CSS files harms how well its algorithms render and index content. This can lead to, in Google's words, “suboptimal rankings”.
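In practice, the problem usually comes from robots.txt rules that block asset directories. A minimal sketch of the kind of rule to avoid, and a permissive alternative, is below (the directory names `/js/` and `/css/` are illustrative, not from the announcement):

```
# Avoid blanket rules like these, which prevent Googlebot
# from fetching the assets needed to render the page:
#   Disallow: /js/
#   Disallow: /css/

# A permissive sketch that lets crawlers fetch everything,
# including script and stylesheet files:
User-agent: *
Disallow:
```

Sites that must restrict some paths can still leave their script and stylesheet directories crawlable rather than disallowing them wholesale.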
“Historically, Google indexing systems resembled old text-only browsers, such as Lynx, and that’s what our Webmaster Guidelines said,” says Google’s Pierre Far. “Now, with indexing based on page rendering, it’s no longer accurate to see our indexing systems as a text-only browser. Instead, a more accurate approximation is a modern web browser.”
“Just like modern browsers, our rendering engine might not support all of the technologies a page uses. Make sure your web design adheres to the principles of progressive enhancement as this helps our systems (and a wider range of browsers) see usable content and basic functionality when certain web design features are not yet supported,” Far adds. “Pages that render quickly not only help users get to your content easier, but make indexing of those pages more efficient too.”
After Google first made the rendering announcement, it released a tool called “Fetch and Render,” found under the Crawl section in Webmaster Tools, which lets site owners see how Googlebot renders a page.
Image via Google