Matt Cutts Talks About Duplicate Content With Regards To Disclaimers, Terms/Conditions

    July 22, 2013
    Chris Crum

Google’s Matt Cutts has put out a new Webmaster Help video once again discussing duplicate content. This time it’s about duplicate content with regards to how it relates to legally required content, such as disclaimers and terms and conditions. The exact question Cutts responds to is:

How does duplicate copy that’s legally required (i.e. Terms & Conditions across multiple offers) affect performance in search?

Cutts notes that there was a follow-up comment to the question, saying that some in the financial services industry are interested in the answer.

“The answer is, I wouldn’t stress about this unless the content that you have is duplicated and spammy or keyword stuffing or something like that, you know, then we might be – an algorithm or a person might take action on – but if it’s legal boilerplate that’s sort of required to be there, we might, at most, might not want to count that, but it’s probably not going to cause you a big issue,” says Cutts.

“We do understand that lots of different places across the web do need to have various disclaimers, legal information, terms and conditions, that sort of stuff, and so it’s the sort of thing where if we were to not rank that stuff well, then that would probably hurt our overall search quality, so I wouldn’t stress about it,” he says.

So, long story short: don’t make your disclaimers and terms spammy, just as with any other content. As usual, if you play by the rules (Google’s quality guidelines), you should be fine.

  • HJ

Again, the objectives are right, but the approach will need a lot of tuning. That is the case with a lot of objectives, though. Even today, if the original content is on some site which, for whatever reason, is ranked a little lower on Google’s authority index, the original content will not rank, but the duplicate content, copied onto a so-called higher-authority site, will. There are lots of examples and experiences of the same.

One point Google seems to be missing is that the same algorithm cannot work for different kinds of sites. An article site, an educational or training site, a news site, and a financial/market-analysis site… all will need a different eye and hence a different ranking algorithm.

  • HJ

There were some typos in my comment, but please do not blame me… you may blame the weather :)

  • http://www.rocketaudit.com/ Brian Wells

Interesting video. It definitely has some good points and tips, but as the commenter before me mentioned, there are a few things that Google left out and did not seem to take into consideration. There is definitely a much bigger picture out there.

    • http://www.rankwatch.com RankWatch

Google, as always, concentrates on what matters. Google is capable of figuring out the hidden truth behind duplicate content. There are some sections of a site that will unintentionally and unavoidably have near-duplicate content, and Google does extremely well at working out what to ignore and what not to.