Matt Cutts Shares Something You Should Know About Old Links

By: Chris Crum - May 22, 2012

Google’s Matt Cutts has put out a new Webmaster Help video discussing something that’s probably on a lot of webmasters’ minds these days: what if you linked to a good piece of content, but at some point, that content turned spammy, and your site is still linking to it?

In light of all the link warnings Google has been sending out, and the Penguin update, a lot of webmasters are freaking out about their link profiles, and want to eliminate any questionable links that might be sending Google signals that could lead to lower rankings.

A user submitted the following question to Cutts:

Site A links to Site B because Site B has content that would be useful to Site A’s end users, and Google indexes the appropriate page. After the page is indexed, Site B’s content changes and becomes spammy. Does Site A incur a penalty in this case?

“OK, so let’s make it concrete,” says Cutts. “Suppose I link to a great site. I love it, and so I link to it. I think it’s good for my users. Google finds that page. Everybody’s happy. Users are happy. Life is good. Except now, that site that I linked to went away. It didn’t pay its domain registration or whatever, and now becomes maybe an expired domain porn site, and it’s doing some really nasty stuff. Am I going to be penalized for that? In general, no.”

“It’s not the sort of thing where just having a few stale links that happen to link to spam are going to get you into problems,” he continues. “But if a vast majority of your site just happens to link to a whole bunch of really spammy porn or off-topic stuff, then that can start to affect your site’s reputation. We look at the overall nature of the web, and certain amount of links are always going stale, going 404, pointing to information that can change or that can become spammy.”

“And so it’s not the case that just because you have one link that happens to go to bad content because the content has changed since you made that link, that you’re going to run into an issue,” he concludes. “At the same time, we are able to suss out in a lot of ways when people are trying to link to abusive or manipulative or deceptive or malicious sites. So in the general case, I wouldn’t worry about it at all. If you are trying to hide a whole bunch of spammy links, then that might be the sort of thing that you need to worry about, but just a particular site that happened to go bad, and you don’t know about every single site, and you don’t re-check every single link on your site, that’s not the sort of thing that I would worry about.”

Of course, many more people are worried about negative SEO practices and inbound links than about the sites they themselves link to.

More Penguin coverage here.


About the Author

Chris Crum has been a part of the WebProNews team and the iEntry Network of B2B Publications since 2003. Follow Chris on Twitter, on StumbleUpon, on Pinterest and/or on Google: +Chris Crum.

  • ray

    Another example of Matt Cutts/Google being a complete joke. #SAD

    • John S. Britsios

      Ray, I fully agree with you. What BS is he talking about? If Google is capable of identifying such issues, why can’t it do the same with “Negative SEO” attacks?

      • Steve G

        I tend to agree as well. I’m really surprised that someone like Matt Cutts, who is so heavily involved with search, wouldn’t recommend sending a spider over your pages periodically. Not only can you identify problems with your site, possibly before Google notices them, but for external links you can grab all those pages and check what you are linking out to, to ensure the links still work. I spider my pages at least once a month to ensure they are of the highest quality I can make them. After all, Panda was all about quality control. So I’m really surprised at what Matt Cutts said in this case.

  • Lisa

    See, friends, it’s a very clear picture. Everybody here in SEO is out to find links, and in the process many may link to sites that later turn spammy. I feel that Google has placed a percentage signal: if too many of the sites you link to are spammy, then you are in trouble. Which means it all depends on your link profile.

    It’s like evaluating old links.

  • djben

    @Ray and John – you should be appreciative that Matt is trying to help us get it right… I personally am appreciative anyway. If you’re not doing anything wrong then you shouldn’t have anything to worry about.

  • Kate

    Obviously we can’t be responsible for every site we link to. I think Google is smart enough to distinguish low-quality link building from circumstances that don’t depend on us.

  • Daniel

    This whole debate about good links turning bad is a difficult one to deal with.

    It would be quite difficult for most sites to keep up to date on changes to the sites they link to, especially as time goes by and they build up a large number of outbound links.

    Another thing I have noticed is that many sites, including large ones, allow hardcore spam comments, which in theory should be a big no.

    We are not talking one or two spam comments, either, and these comments are loaded with dodgy (nasty) links.

    I would have thought this would cause some serious issues for that site, in the long run at least.
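The periodic self-crawl Steve G describes above can be automated. As a minimal sketch (using only Python’s standard library; the site and URLs are hypothetical), the first step is extracting the outbound links from a page so each one can then be re-checked for dead domains or changed content:

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse


class LinkExtractor(HTMLParser):
    """Collect every href found in <a> tags on a page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def external_links(html, base_url):
    """Return absolute URLs of links that point off-site (outbound links)."""
    parser = LinkExtractor()
    parser.feed(html)
    base_host = urlparse(base_url).netloc
    outbound = []
    for href in parser.links:
        absolute = urljoin(base_url, href)  # resolve relative hrefs
        host = urlparse(absolute).netloc
        if host and host != base_host:      # keep only off-site links
            outbound.append(absolute)
    return outbound
```

Each URL this returns could then be fetched (for example with `urllib.request`) on a monthly schedule, flagging anything that 404s, redirects to a new domain, or no longer contains the content originally linked to — which is exactly the kind of stale-link review Cutts suggests most sites don’t need to do exhaustively, but that a cautious webmaster might still run.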