The report provides the names of the hosts from which your site is using blocked resources. If you click on a row, it gives you the list of blocked resources and the pages that embed them. This should help you figure out the issues and take care of them so Google can better crawl and index your content.
Some resources will be hosted on your site, while others will be hosted on third-party sites. Clicking on a host will also give you a count of pages on your site affected by each blocked resource. Clicking on any blocked resource will give you a list of pages that load that resource. If you click on any page in the table hosting a blocked resource, you’ll get instructions for unblocking that particular resource.
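For hosts you control, the unblocking itself usually comes down to a robots.txt change. As a rough sketch (the directory paths here are hypothetical), a broad Disallow that sweeps in your CSS and JavaScript can be loosened with explicit Allow rules rather than removed entirely:

```
# Before: a broad rule that blocks crawlers from all page resources
User-agent: *
Disallow: /assets/

# After: rendering resources are explicitly allowed, the rest stays blocked
User-agent: *
Allow: /assets/css/
Allow: /assets/js/
Disallow: /assets/
```

Keeping the Disallow and carving out Allow exceptions limits the change to just the resources Googlebot needs for rendering.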
In a help center article, Google runs down five steps for evaluating and reducing your list of blocked resources:
1. Open the Blocked Resources Report to find a list of hosts of blocked resources on your site. Start with the hosts that you own, since you can directly update the robots.txt files, if needed.
2. Click a host on the report to see a list of blocked resources from that host. Go through the list and start with those that might affect the layout in a meaningful way. Less important resources, such as tracking pixels or counters, aren’t worth bothering with.
3. For each resource that affects layout, click to see a list of your pages that use it. Click on any page in the list and follow the pop-up instructions for viewing the difference and updating the blocking robots.txt file. Fetch and render after each change to verify that the resource is now appearing.
4. Continue updating resources for a host until you’ve enabled Googlebot access to all the important blocked resources.
5. Move on to hosts that you don’t own, and if the resources have a strong visual impact, either contact the webmaster of those sites to ask them to consider unblocking the resource to Googlebot, or consider removing your page’s dependency on that resource.
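After editing a robots.txt file in step 3, you can sanity-check the result before waiting on a fetch-and-render. A minimal sketch using Python's standard-library `urllib.robotparser` (the rules and URLs below are hypothetical; note that unlike Google's longest-match semantics, the standard-library parser applies the first matching rule, so Allow lines should precede the broader Disallow):

```python
import urllib.robotparser

# Hypothetical robots.txt rules after unblocking the CSS directory.
# Allow is listed before Disallow because robotparser uses first-match.
ROBOTS_TXT = """\
User-agent: *
Allow: /assets/css/
Disallow: /assets/
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

def googlebot_can_fetch(url: str) -> bool:
    """Return True if Googlebot may crawl the given URL under these rules."""
    return parser.can_fetch("Googlebot", url)

print(googlebot_can_fetch("https://example.com/assets/css/site.css"))  # True
print(googlebot_can_fetch("https://example.com/assets/app.js"))        # False
```

This only checks your local rules, of course; the report and Fetch and Render remain the authoritative view of what Googlebot actually sees.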
There’s also an update to Fetch and Render, which shows how the blocked resources matter. When you request a URL to be fetched and rendered, it shows screenshots rendered both as Googlebot and as a typical user, so you get a better grasp on the problems.
“Webmaster Tools attempts to show you only the hosts that you might have influence over, so at the moment, we won’t show hosts that are used by many different sites (such as popular analytics services),” says Google webmaster trends analyst John Mueller. “Because it can be time-consuming (usually not for technical reasons!) to update all robots.txt files, we recommend starting with the resources that make the most important visual difference when blocked.”
In January, Google called on webmasters to offer suggestions for new features for Webmaster Tools. It set up a Google Moderator page where people could leave and vote on suggestions. Among the most popular suggestions were:
“I would like to see in WMT data from 12 months, not 3 as it is now :)”
“An automated action viewer, so webmasters can see if they were impacted by an algorithm such as Panda or Penguin.”
“Bounce back measuring tool. Did the user go back to Google for a similar search or did they find what they needed?”
Google has since given webmasters a new structured data tool.
Image via Google