SES 2006: A Case Of Duplicate Content


Site publishers worry about being penalized for having duplicate content; the panel at SES 2006 in San Jose took on that topic in a session today.

Staff writer Doug Caverly of WebProNews filed this exclusive look at the SES 2006 San Jose session on Duplicate Content & Multiple Site Issues.

Anne Kennedy, a managing partner at Beyond Ink, moderated this session as attendees sought more information on the increasingly complex issue of duplicate content.

It isn’t just web pages that concern webmasters anymore, but RSS syndication too. Duplicate content is a problem because the search engines consider it one, Kennedy said. However, duplicate sites in different languages are not treated as duplicate content.

301 redirects are the friend of webmasters everywhere. Kennedy recommended that webmasters choose a single canonical domain and link all internal pages to it.
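For instance, a site that answers at both example.com and www.example.com can consolidate on one canonical host with a server-side 301 redirect. Here is a minimal sketch, assuming an Apache server with mod_rewrite enabled (example.com is a placeholder domain):

    RewriteEngine On
    # Permanently (301) redirect the bare domain to the canonical www host
    RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
    RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

Because the redirect is permanent, engines that honor it transfer the old URL's standing to the canonical address rather than indexing both.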

Shari Thurow of Grantastic Designs noted that searchers do not want to find duplicate content in their search results. Although the definition of duplicate content is somewhat unclear, Thurow thinks search engines determine “resemblance” in content.

Robots.txt can be helpful here, as webmasters can designate areas of a website they do not want indexed. If they keep the crawlers out of pages that duplicate other pages, that should help keep a site from being penalized over content duplication.
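As a rough illustration, a robots.txt file at the site root could keep compliant crawlers out of printer-friendly copies of pages that are already indexed elsewhere (the directory names here are hypothetical):

    User-agent: *
    # Block duplicate, printer-friendly versions of pages indexed at their normal URLs
    Disallow: /print/
    Disallow: /mirror/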

RedZoneGlobal CTO Mikkel deMib Svendsen took the stage next, clad in a bright red jacket our Doug Caverly couldn’t help but comment on (we’ll get you one for Christmas, Doug).

Svendsen emphasized that webmasters need to deal with linking issues themselves rather than leaving them to a search site. “Whatever you do, don’t leave it to the engines to deal with!” he cautioned.

Yahoo’s Tim Converse, an engineering manager who works on anti-spam issues, commented that “the ultimate goal . . . is just to present diverse results to the user.”

“There are a lot of good reasons for us to preserve duplicate content,” he said, citing wire stories as an example. “Most of the aspects of constructing a site don’t stray into abusive duplication. A lot of the duplication you find out on the web is actually that – repurposing.”

Google’s Matt Cutts stepped up next and observed that abusive duplication does take place. As an example of something that is not abusive, Cutts cited offering multiple formats of the same file. He also considers 301 redirects a better approach than using multiple domain names for the same content.

The question and answer session brought out the playfully sarcastic side of the panel. When Svendsen suggested, not quietly enough, that webmasters should “have fun and spam the engines,” there was laughter.

There was even more laughter when Cutts later answered that with, “if you can afford to have your site burned to the ground,” an interesting metaphor for being dropped from a search index.


David Utter is a staff writer for WebProNews covering technology and business.
