Reports are surfacing throughout the blogosphere concerning the ever-growing reach of Google’s supplemental results. Now more than ever, sites that were once highly ranked in the main index are finding themselves relegated to the confines of the supplemental index.
Some have likened Google’s supplemental index to a virtual refuse pile, an online prison where all sorts of outdated web content are doomed to a fate of obscurity for all time.
Others, such as Matt Cutts, believe that the fear and loathing (forgive me Dr. Thompson) that accompany supplemental results may be a tad melodramatic. In a blog entry last month, Matt talked a bit about the SI and the nature of its content:
As a reminder, supplemental results aren’t something to be afraid of; I’ve got pages from my site in the supplemental results, for example. A complete software rewrite of the infrastructure for supplemental results launched in Summer o’ 2005, and the supplemental results continue to get fresher.
Having urls in the supplemental results doesn’t mean that you have some sort of penalty at all; the main determinant of whether a url is in our main web index or in the supplemental index is PageRank. If you used to have pages in our main web index and now they’re in the supplemental results, a good hypothesis is that we might not be counting links to your pages with the same weight as we have in the past. The approach I’d recommend in that case is to use solid white-hat SEO to get high-quality links (e.g. editorially given by other sites on the basis of merit).
Don’t be afraid of the supplemental index? When is the last time you heard Google users praising the relevant information they found within the supplemental results?
And your expert advice is to get high-quality links? Come on Matt, we’re not teaching SEO 101 here. This idea of quality linkage is not a new concept, and is certainly not unique to the dilemma of winding up in the supplemental index. Surely there must be some other factors that contribute to a site’s presence within the SI, or at least some methods of prevention?
Luckily for the faithful WebProNews readers, I’ve done some digging and have found some insightful commentary with methods to avoid the supplemental index.
We talk a lot about the external links coming into a site, but Andy Hagans tells us that internal links can be equally vital in avoiding the SI:
Get some links to internal pages. This is all about convincing Google your site doesn’t have “hollow shell syndrome”–when a site has, say, 20 pages, and a few dozen backlinks, but 100% of those backlinks are pointing to the homepage. Most often, the homepage of the site is in the normal index but all of the internal pages have gone supplemental.
I usually go “brute force” at one internal page and get 3 or 4 links to it (giving this one internal page so much link weight that Google pretty much has to index it); normally, GoogleBot revisits the entire site and re-crawls and indexes the other internal pages, too (up to a point: if the site has hundreds or thousands of pages, you’ll need to rinse+repeat this a few times).
Staying in the realm of links, Aaron Wall fills us in on how the sites and pages you link to can also have an impact:
Since your site has a rather low PageRank you may want to only list your blogroll on your home page instead of every page of your blog. Take out other parts of your site that heavily duplicate each other from page to page. Also consider removing your sitewide links to some of the unimportant pages on your site to flow more of your link equity throughout your site.
I would also recommend removing the tagging pages on your site as your site is already navigable via your categories, and the tags create low value noise pages that reduce your link equity distributed on the quality pages. I also think it is foolish to link at all those auto-generated Technorati pages…that wastes a lot of your link authority.
I would also recommend not linking to some of the pages you don’t want Google to index, such as those printer friendly pages. You may also want to block those printer friendly URLs using the Robots.txt protocol.
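To make Aaron’s last tip concrete, blocking printer-friendly pages with robots.txt might look something like the sketch below. The `/print/` path is a hypothetical example — substitute whatever directory or URL pattern your printer-friendly pages actually live under:

```
# robots.txt — served from the site root (e.g. www.example.com/robots.txt)
# Hypothetical example: assumes printer-friendly pages live under /print/
User-agent: *
Disallow: /print/
```

With those URLs disallowed, GoogleBot skips the duplicate printer-friendly copies, so they can’t dilute the link equity flowing to your primary pages.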
This is just a sampling of the information that both Aaron and Andy offer up to those looking to stay out of the supplemental results. So if you’re not like Matt Cutts and you believe the SI is something to be afraid of, I suggest you check out what these guys have to say.