SEOmoz Takes On Webspam With Ambitious Project, Talks Penguin Update
Written by Chris Crum
    SEOmoz is working on a new spam research project aimed at classifying, identifying and removing (or at least limiting) the link juice that spam pages and sites can pass – a pretty ambitious goal, to say the least. Can SEOmoz do this better than Google itself?

    CEO Rand Fishkin announced the project on Google+ Monday evening, acknowledging that his company is “certainly not going to be as good at it or as scaled as Google,” but that it’s making for interesting research.

    Fishkin tells WebProNews that Google’s Penguin update was not the motivator behind the project, though he did have this to say about the update:

    “In terms of Penguin – it’s done a nice job of waking up a lot of folks who never thought Google would take this type of aggressive, anti-manipulative action, but I think the execution’s actually somewhat less high quality than what Google usually rolls out (lots of search results that look very strange or clearly got worse, and plenty of sites that probably shouldn’t have been hit).”

    You can read more about Penguin via our various articles on the topic here.

    “We’ve been wanting to work on this for a long time, but our data scientist was previously tied up on other items (and we’ve just hired a research assistant for the project),” Fishkin tells us. “The original catalyst was the vast quantity of emails and questions we get about whether a page/site is ‘safe’ to acquire links from, or whether certain offers (you know the kind – ‘$100 for 50 permanent text links guaranteed to boost your Google rankings!’) were worthwhile.”

    “Tragically, there’s a lot of money flowing from people who can barely afford it, but don’t know better to spammers who know that what they’re building could hurt their customers, and Google refuses to take action to show which spam they know about,” he continues. “Our eventual goal is to build a metric marketers and site owners can use to get a rough sense of a site’s potential spamminess in comparison to others.”

    “A score (or scores) of some kind would (eventually, assuming the project goes well) be included in Mozscape/OSE showing the spamminess of inlinks/outlinks,” he explained in the Google+ announcement.

    According to Fishkin, the SEOmoz algorithms will be conservative and focus on the most obvious and manipulative forms of spam. “For example, we’d probably catch a lot of very obvious/bad link farms, but not necessarily many private blog networks or paid links from reputable sites,” he said in response to a comment on his Google+ post.
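
    To make the idea of a "conservative" algorithm concrete, here is a minimal, hypothetical sketch in Python of a threshold-based link-farm check. Every detail in it (the field names, the thresholds, the list of commercial anchor phrases) is an illustrative assumption rather than SEOmoz's actual model; it only shows the general shape of a scorer that stays quiet unless several blatant signals co-occur.

        from dataclasses import dataclass

        # Anchor phrases treated as obviously commercial/manipulative (illustrative only).
        COMMERCIAL_ANCHORS = {"payday loans", "cheap pills", "buy backlinks"}

        @dataclass
        class PageLinkProfile:
            outlinks_total: int           # external links found on the page
            outlinks_unique_domains: int  # distinct domains those links point to
            anchor_texts: list            # anchor text of each external link

        def conservative_spam_score(profile):
            """Return a score in [0, 1] that is high only for blatant link-farm patterns."""
            if profile.outlinks_total == 0:
                return 0.0
            signals = 0
            # Signal 1: an unusually large number of external links on a single page.
            if profile.outlinks_total > 200:
                signals += 1
            # Signal 2: those links spray out to a huge number of distinct domains.
            if profile.outlinks_unique_domains > 100:
                signals += 1
            # Signal 3: a majority of anchors are exact-match commercial phrases.
            commercial = sum(a.lower() in COMMERCIAL_ANCHORS for a in profile.anchor_texts)
            if commercial > 0.5 * max(len(profile.anchor_texts), 1):
                signals += 1
            # Conservative by design: a high score requires several blatant signals at once.
            return {0: 0.0, 1: 0.2, 2: 0.7, 3: 1.0}[signals]

        if __name__ == "__main__":
            farm = PageLinkProfile(
                outlinks_total=500,
                outlinks_unique_domains=450,
                anchor_texts=["payday loans"] * 400 + ["home"] * 100,
            )
            print(conservative_spam_score(farm))  # prints 1.0: an obvious link farm

    The deliberate trade-off is the one Fishkin describes above: an obvious link farm trips several signals at once and scores high, while subtler tactics like private blog networks or paid links from reputable sites mostly would not.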

    Also in the comments, Fishkin indicated that the data would be presented more in a “matches patterns of sites we’ve seen Google penalize/ban” kind of way than in a “you are definitely webspam” type of thing.

    The data scientist Fishkin spoke of will present the findings at the company’s MozCon event in July. Fishkin expects an actual product launch late this year or early next year.

    Earlier this month, the company announced that it had raised $18 million in VC funding.
