Google Scrubs SubDomains

    December 11, 2007

I had a chance to catch up with Matt Cutts and Vanessa Fox last week at the Las Vegas Pubcon. We talked about a variety of things – from the hacking of someone’s out-of-date WordPress blog to the recent PageRank updates at Google. One of the more recent and perhaps lesser-publicized changes we discussed, however, was Google’s decision to change the way they handle content on subdomains. Incidentally, you can find our video interview on the WebProNews Video blog.


A couple of weeks ago, Google made a change to their algorithm. Prior to the change, a subdomain was treated more or less like a separate site. Matt explained that this presented a problem because a main domain and several of its subdomains had a tendency to dominate the search listings for some queries – elaborate long-tail query strings, for example.

"We did hear complaints that for some types of searches (e.g. esoteric or long-tail searches), Google could return a search page with lots of results all from one domain. In the last few weeks we changed our algorithms to make that less likely to happen." – Matt Cutts

Matt described the changes as not something anyone should get too worked up over. Google will still return multiple results from a single domain where appropriate. However, you should no longer find whole pages of results totally dominated by one site and 5 or 6 of its subdomains.
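Google hasn’t published the mechanics of the change, but the effect Matt describes resembles a per-domain cap on a results page. Here’s a toy sketch in Python – purely illustrative, with a hypothetical `max_per_domain` parameter; it is not Google’s actual algorithm:

```python
# Toy illustration of capping how many results any one registrable
# domain (subdomains included) can contribute to a results page.
# This is NOT Google's algorithm -- just a sketch of the effect described.

from urllib.parse import urlparse

def diversify(results, max_per_domain=2):
    """Keep ranked order, but allow at most max_per_domain results
    whose hosts share the same base domain (subdomains included)."""
    counts = {}
    kept = []
    for url in results:
        host = urlparse(url).hostname or ""
        # Naive base-domain extraction: last two labels of the host.
        base = ".".join(host.split(".")[-2:])
        if counts.get(base, 0) < max_per_domain:
            counts[base] = counts.get(base, 0) + 1
            kept.append(url)
    return kept

ranked = [
    "http://blog.example.com/a",
    "http://www.example.com/b",
    "http://forum.example.com/c",
    "http://shop.example.com/d",
    "http://other.net/e",
]
print(diversify(ranked))
# The four example.com subdomains collapse to two entries;
# other.net still appears.
```

In this sketch a page that was once "totally dominated" by one site and its subdomains keeps its top couple of listings but yields the remaining slots to other sites – which matches the behavior the quote above describes.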

We asked Matt if there was a preference or order of importance ascribed to domains vs. subdomains – in other words, would Google tend to return one over the other in the results. He told us plainly that there is no algorithmic preference for subdomains vs. domains; they are treated as equally relevant. Matt’s personal preference in this regard would be subdirectories over subdomains, because he finds them more "convenient". Subdomains, however, are useful, he admits, for separating dramatically different sections of a site.

At the end of the day, based on our conversation with Matt at Pubcon and on the comments he has made on his blog, this doesn’t sound like it’s going to have a huge impact on very many sites. If you had high results for a query on your subdomains, from what I gather, you can probably expect to retain those. If you had 8 out of the top 10 listings, you can probably look for some of those to drop out – though the ones that remain won’t necessarily lose their high positions.

When Matt brought this up at the Pubcon in Las Vegas last week there was a significant amount of murmuring and buzz. That’s pretty much to be expected when you have Matt Cutts talking about an algorithmic change to Google at a conference full of webmasters.

In this case, though, it doesn’t appear to be anything most of us need to fret over. After all, as Matt noted on his blog, the change was actually implemented a couple of weeks ago, and there haven’t been any frantic comments there – and I haven’t heard a peep about it in WebProWorld, or anywhere else for that matter. So, yes, there was an algo tweak concerning subdomains, but no, there probably isn’t any cause for panic.