Google’s recent Panda update has forced sites all over the web to re-evaluate their content, and the policies their contributors must follow, in order to keep that content in Google’s good graces.
Of course, nobody wants to admit that Google’s changes are the real reason for the sudden re-focus on quality, but quality has certainly become a much bigger focal point since the update shocked publishers in February. And let’s not forget that Panda has so far only been rolled out in the U.S. Many of the affected sites are bound to take more big hits as it reaches other countries (Google hasn’t provided a time frame for this).
This is likely another big reason content sites are scrambling to make improvements. We’ve looked at EzineArticles’ strategy a few times, as it was one of the top sites impacted by Panda, according to data from Sistrix – the most commonly cited list of affected sites.
We looked at some things Xomba (another site that was hit) is doing. We even looked at some things Demand Media has done with its enormous eHow property, which was actually positively impacted by the Panda update, despite many having expected it to be a top target.
You’ll notice that Examiner is on the Sistrix list as well. It is indeed another site that is often labeled a content farm and criticized over the quality of its content. The site is also among those on a quest to improve quality and image (and, no doubt, rankings).
Examiner has actually put out a white paper called “Identifying Quality Content Online – Efficiently and Effectively,” which it says is only the first in a series discussing how it is addressing “today’s media industry challenges.” Here’s the abstract:
The staggering amount of content that makes its way online via self-publishing is both a blessing and a curse. Whether blogs, home videos, commentary, advice or leaks of government documents, there is so much content available that a difficult task facing audiences is how to sift through the torrent to find the treats. The task of identifying, presenting and distributing (through partners and search engines, as well as the programming on one’s own sites) the highest quality content amid a huge volume of contributors’ articles and video is not something that traditional media organizations generally face. It is a challenge that has arisen in response to the surge of content published on sites, like Examiner.com, that extend the opportunity to contribute to users across the Web. Brand association and crowd sourcing are two possible solutions for users, but they are imperfect barometers of quality. Examiner.com – which publishes an average of 3,000 stories per day – has created a “human-in-the-loop technology” that allows professional editors to identify and promote quality content.
“The most effective and efficient way to identify the best content on a site with a high volume of coverage is to combine empirical data that correlates to quality performance with assessments by the audience and by trained journalists who help determine how well told and credible the stories are,” concludes Examiner VP of Quality, Mitch Gelman.
“Once created, the quality data can be applied in numerous ways, including ensuring that the best content is the most easily accessible to users, partners and advertisers,” he added. “In order to have relevant, timely data, the quality cycle must be maintained with vigilance and enhanced as the skills of contributors evolve.”
You can read the paper in its entirety here.
What do you think about Examiner’s approach to the issue? What content sites are doing it right? Share your thoughts.