Will Google’s Link Disavow Tool Come Back To Haunt Webmasters?

    October 19, 2012
    Chris Crum

Back in June, during the height of the Penguin update freakout, Google’s Matt Cutts hinted that Google would launch a “link disavow” tool, so that webmasters could tell Google which backlinks they want Google to ignore. This means links from around the web that are potentially hurting a site’s rankings in Google could be ignored, and no longer count against the site in question. This is something that many webmasters and SEOs have wanted for a long time, especially since the Penguin update launched earlier this year. On Tuesday, after months of anticipation, Google made these dreams come true by finally launching the tool.

Is it what you hoped it would be? Do you intend to use it? Let us know in the comments.

How It Works

The tool tells users, “If you believe your site’s ranking is being harmed by low-quality links you do not control, you can ask Google not to take them into account when assessing your site.”

It is worth noting, however, that using the tool to tell Google to ignore certain links does not guarantee that Google will listen. It’s more of a helpful suggestion. Google made this clear in the Q&A section of the blog post announcing the tool.

“This tool allows you to indicate to Google which links you would like to disavow, and Google will typically ignore those links,” Google Webmaster Trends Analyst Jonathan Simon says. “Much like with rel=’canonical’, this is a strong suggestion rather than a directive—Google reserves the right to trust our own judgment for corner cases, for example—but we will typically use that indication from you when we assess links.” He adds:

If you’ve ever been caught up in linkspam, you may have seen a message in Webmaster Tools about “unnatural links” pointing to your site. We send you this message when we see evidence of paid links, link exchanges, or other link schemes that violate our quality guidelines. If you get this message, we recommend that you remove from the web as many spammy or low-quality links to your site as possible. This is the best approach because it addresses the problem at the root. By removing the bad links directly, you’re helping to prevent Google (and other search engines) from taking action again in the future. You’re also helping to protect your site’s image, since people will no longer find spammy links pointing to your site on the web and jump to conclusions about your website or business.

If you’ve done as much as you can to remove the problematic links, and there are still some links you just can’t seem to get down, that’s a good time to visit our new Disavow links page.

With the tool, you simply upload a .txt file containing the links you want Google to disavow. You add one URL per line. You can block specific URLs or whole domains. To block a domain, use this format: domain:example.com. You can add comments by including a # before them. Google ignores the comments. The file size limit is 2MB.
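Based on the format Google describes, a disavow file might look something like the following. The URLs and domains here are placeholders, and the comments (the lines beginning with #) are ignored by Google, but can help you keep a record of your cleanup efforts:

```
# Contacted site owner on 10/1/2012 to request link removal; no response
http://spam.example.com/stuff/comments.html
http://spam.example.com/stuff/paid-links.html

# Owner unreachable; disavowing the entire domain
domain:shadyseo.example.com
```

Each URL or domain goes on its own line, and the whole file must stay under the 2MB limit.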

If you haven’t watched it yet, watch Matt Cutts’ video explaining the tool. If it’s something you’re considering using, it’s definitely worth the ten minutes of your time:

Cutts warns repeatedly that most people will not want to use this tool, and that you should really only use it if you’ve already tried hard to get the questionable links removed, but haven’t been able to get it done. For more details and minutiae about how this tool works, there is a whole help center article dedicated to it.

Negative SEO

Negative SEO, a practice in which competitors attack a site with spammy links and the like, has been debated for a long time, and many will see this tool as a way to eliminate the effects of such attacks. Google has specifically responded to this.

“The primary purpose of this tool is to help clean up if you’ve hired a bad SEO or made mistakes in your own link-building,” says Simon. “If you know of bad link-building done on your behalf (e.g., paid posts or paid links that pass PageRank), we recommend that you contact the sites that link to you and try to get links taken off the public web first. You’re also helping to protect your site’s image, since people will no longer find spammy links and jump to conclusions about your website or business. If, despite your best efforts, you’re unable to get a few backlinks taken down, that’s a good time to use the Disavow Links tool.”

“In general, Google works hard to prevent other webmasters from being able to harm your ranking,” he adds. “However, if you’re worried that some backlinks might be affecting your site’s reputation, you can use the Disavow Links tool to indicate to Google that those links should be ignored. Again, we build our algorithms with an eye to preventing negative SEO, so the vast majority of webmasters don’t need to worry about negative SEO at all.”

Cutts also talked about the subject at PubCon, where the tool was announced. Search Engine Roundtable has a liveblogged account of what he said, which reads:

All the negative SEO complaints he sees, or most of it, is really not negative SEO hurting you. It is a much better use of your time to make your site better vs hurting someone else. At the same time, we’ve seen cases of this as an issue. I.e. buying a new domain and needing to clean up that site. There are people who want to go through this process. Plus SEOs that take on new clients that went through bad SEOs.

Warnings And Overreaction

Again, you don’t want to use the tool in most cases. It’s pretty much a last-resort tactic for links you’re positive are hurting you, and can’t get removed otherwise. Google has warned repeatedly about this, as over-use of the tool can lead to webmasters shooting themselves in the foot. If you use it willy-nilly, you may hurt your site by getting rid of links that were actually helping you in the first place.

It seems like common sense, but ever since the Penguin update, we’ve seen plenty of examples of webmasters frantically trying to get links removed that even they admit they would like to keep, if not for fear that Google might frown upon them (when in reality, Google likely did not).

Aaron Wall from SEOBook makes some other interesting points on the warnings front. He writes:

The disavow tool is a loaded gun.

If you get the format wrong by mistake, you may end up taking out valuable links for long periods of time. Google advise that if this happens, you can still get your links back, but not immediately.

Could the use of the tool be seen as an admission of guilt? Matt gives examples of “bad” webmaster behavior, which comes across a bit like “webmasters confessing their sins!”. Is this the equivalent of putting up your hand and saying “yep, I bought links that even I think are dodgy!”? May as well paint a target on your back.

Google Wants To Depend More On Social And Authorship

If overreaction becomes an issue, and despite Google’s warnings it seems fairly likely that it will, this tool could really mess with how Google treats links, which have historically been the backbone of its algorithm.

“Links are one of the most well-known signals we use to order search results,” says Simon. “By looking at the links between pages, we can get a sense of which pages are reputable and important, and thus more likely to be relevant to our users. This is the basis of PageRank, which is one of more than 200 signals we rely on to determine rankings. Since PageRank is so well-known, it’s also a target for spammers, and we fight linkspam constantly with algorithms and by taking manual action.”

It will be interesting to see how Google treats the links webmasters tell it to ignore, which are not actually hurting them in the first place. I would not be surprised to see some in the industry test Google on this.

Google does not like it when people manipulate the way it counts links, yet it has just given webmasters a tool to do so, even if it’s kind of the opposite of the black hat techniques Google has always tried to eliminate (link schemes, paid links, etc.). Now (and we’ve seen this even before the tool existed), you potentially have webmasters trying to get rid of links that actually do have value, even in Google’s eyes. I mean, seriously, what are the odds that this tool will be used 100% how Google intends it to be used, which is apparently only in rare circumstances?

Google seems to be grooming other signals to play a greater role in the algorithm. While they’re not there yet, based on various comments the company has made, social signals will almost certainly play an increasingly weighty role. CEO Larry Page was asked about this at a conference this week.

He responded, “I think it’s really important to know, again, who you’re with, what the community is – it’s really important to share things. It’s really important to know the identity of people so you can share things and comment on things and improve the search ecosystem, you know, as you – as a real person…I think all those things are absolutely crucial.”

“That’s why we’ve worked so hard on Google+, on making [it] an important part of search,” he continued. “Again, like Maps, we don’t see that as like something that’s like a separate dimension that’s never going to play into search. When you search for things, you want to know the kinds of things your friends have looked at, or recommended, or wrote about, or shared. I think that’s just kind of an obvious thing.”

“So I think in general, if the Internet’s working well, the information that’s available is shared with lots of different people and different companies and turned into experiences that work well for everyone,” he said. “You know, Google’s gotten where it is by searching all the world’s information, not just a little bit of it, right? And in general, I think people have been motivated to get that information searchable, because then we deliver users to those people with information.”

“So in general, I think that’s the right way to run the Internet as a healthy ecosystem,” Page concluded. “I think social data is obviously important and useful for that. We’d love to make use of that every way we can.”

As Google says, links are a direct target for manipulation, and social could be harder to fake (though there are certainly attempts, and there will be plenty more).

Another difficult signal to fake is authorship, which is why Google is really pushing for that now. In a recent Google+ Hangout, Matt Cutts said of authorship, “Sometimes you’ll have higher click through, and people will say, ‘Oh, that looks like a trusted resource.’ So there are ways that you can participate and sort of get ready for the longer term trend of getting to know not just that something was said, but who said it and how reputable they were.”

“I think if you look further out in the future and look at something that we call social signals or authorship or whatever you want to call it, in ten years, I think knowing that a really reputable guy – if Dan has written an article, whether it’s a comment on a forum or on a blog – I would still want to see that. So that’s the long-term trend,” he said.

“The idea is you want to have something that everybody can participate in and just make these sort of links, and then over time, as we start to learn more about who the high quality authors are, you could imagine that starting to affect rankings,” he pointed out.

So here you have Google (Matt Cutts specifically) telling you that authorship is going to become more important, and that you probably shouldn’t even use the new link-related tool that the company just launched.

Danny Sullivan asked Cutts, at PubCon, why Google doesn’t simply discount bad links to begin with, rather than “considering some of them as potentially negative votes.”

“After all, while it’s nice to have this new tool, it would be even better not to need it at all,” he writes. Cutts did not really answer that question.

Why do you think Google does not do as Danny suggests, and simply ignore the bad links to begin with? Do you think social and authorship signals will become more important than links? Share your thoughts about Google’s ranking strategy and the new tool in the comments.

Lead Image: The Shining (Warner Bros.)


Chris Crum
Chris Crum has been a part of the WebProNews team and the iEntry Network of B2B Publications since 2003. Follow Chris on Twitter, on StumbleUpon, on Pinterest and/or on Google: +Chris Crum.