Duplicate Content Not An Everyday Problem

By: Doug Caverly - June 3, 2009

If you’re responsible for a handful of blogs or sites, and have been wearing holes in a thesaurus to avoid using the same phrase twice, rest easy.  At SMX Advanced, Matt Cutts said some things about duplicate content that should comfort the average blogger or small business owner.

Coverage of SMX Advanced continues at WebProNews Videos.  Stay with WebProNews for more updates and videos from the event this week. 

Matt Cutts

Cutts was asked whether a network of co-branded job sites would be penalized for duplicate listings.  He answered, "Within one site, I wouldn’t worry as much about a duplicate content penalty.  We’ll just try to pick the best page."

Also, even when scores of sites are involved, the odds are good that there’s no need to worry.  Cutts continued, "If you have the same content on 200 different sites . . . is it typical that we give a duplicate content penalty for that?  No.  Definitely not."

For the sake of not giving an incomplete picture, though, we should note (as Cutts did) that Google remains focused on users having a positive experience, and it might be best in the 200-site situation to make PageRank flow to one original domain.
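For site owners who do want to consolidate duplicates toward one original, the usual mechanism (which Google had introduced earlier in 2009, around the time of this piece) is the rel="canonical" link element placed on each duplicate copy. A minimal sketch, with a hypothetical URL not taken from the article:

```html
<!-- In the <head> of each duplicate or syndicated copy.
     The href below is a made-up example URL, not one from this article. -->
<link rel="canonical" href="http://www.example.com/jobs/listing-123" />
```

This hints to search engines which URL should receive the ranking credit, rather than leaving them to pick the "best page" on their own.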

Not bad, right?  So again, as long as you’re not doing anything too unusual, feel free to leave your thesaurus or Word’s trusty Shift-F7 combo alone for a while.

Doug Caverly

About the Author

Doug is a staff writer for WebProNews. Visit WebProNews for the latest eBusiness news.

  • http://pravishseo.blogspot.com Pravish Thomas

    Well, I would certainly say that a duplicate content penalty is better applied seriously, because many people spam by using different key phrases with the same content, as we see with articles and PRs.

    Yep, I agree with Matt Cutts that co-branded sites like job portals get a good excuse for easing away from penalty issues. But at the same time, I’ve found even more duplication on e-commerce sites, where many pages have been removed from Google’s index, especially in the AUTOPARTS category. So what I think is that if you grant freedom by saying there won’t be any penalties, there will be more spam. I think Google has done some good work detecting duplicate pages, especially in the e-commerce sector, so more duplicate penalties should be imposed, which would improve Google’s search index.

    Just as Google stopped giving weight to the “keywords” meta tag and now gives importance to titles and descriptions (a move made to get rid of keyword spam), imposing a penalty is better because it drives spammers away, and in return you get clean indexed search results!

    Penalties work! Half the job is done then :)

  • http://codesucker.blogspot.com Codesucker

    Kudos on the post, Doug – glad we are openly talking about duplicate content risks.

    Sadly, this is Matt Cutts’s usual doublespeak.

    He’s asked whether 200 sites with the same content will be penalized for duplicate content. He says no, then turns around and suggests the opposite, as the author notes:

    “For the sake of not giving an incomplete picture, though, we should note (as Cutts did) that Google remains focused on users having a positive experience, and it might be best in the 200-site situation to make PageRank flow to one original domain.”

    Might be BEST? Meaning what, exactly? A positive experience: does that mean a site with the same content as 200 others might not provide one, and won’t see results unless they all point at one domain? That’s not always the case.

    The biggest problem is that he’s being asked about co-branded job sites, which certainly won’t be pointing at each other or at any source other than the listings; they are the end users of the content. It’s not like bloggers submitting posts to submission sites, which all point back to the content’s original URL. These job sites aren’t going to point at ONE domain; they are job listings that may be considered duplicate content, in my eyes.

    Here is a more direct question I would like to humbly ask Mr. Cutts about the same situation:

    Even if the pages link to the job ads on popular sites, the HTML set of links and content pointing to those sites looks marked for dupe content penalties in my eyes, because it is all the same content without URLs pointing back to a common source. Am I wrong?

    Also, 200 is an extremely low number; you can have 200 syndicators without even knowing about it. What if they don’t link back at all? Why such a low number, Mr. Cutts? What exactly are you trying to hide in that Google Webspam Department secretly tucked away under the Pentagon?

    What if the content DOES link back to an original source, but you take that number up to, say, 10,000 different pages (submitting on large networks), which is a decent level of exposure for bloggers? Are they going to get hit with dupe penalties? I just don’t like the mention of such a small number in a duplicate content conversation.

    Thanks for the article