What SEOs Want from Google’s Webspam Team in ’09

Matt Cutts Fishes for Responses


UPDATE: Cutts has tallied the demands so far and shares them in a list here.

Original article: Matt Cutts asked on his blog what people wanted to see Google’s webspam team tackle in 2009. Of course, he has gotten a lot of responses (many from SEOs and Internet marketers) and will likely get a lot more. I guess this is the webspam equivalent of Google’s AdWords Wishlist and Google Mobile Product Ideas page, both of which call on users for ideas for improvements.

Some suggestions and responses from people include:

– cloaking

– a paid-link-reporting Firefox plugin

– spam punishment through decreasing PR

– improving local spam detection

– more info regarding penalties

– eliminating content "pretenders"

– more focus on duplicate sites

– fewer sites requiring logins early in SERPs

– if it’s banned in search, ban it in AdWords and leave room for quality advertisers

– scraper sites

– block spammers from ALL Google services

– websites that take you to a search results page or something similar

– paid text links are still very successful, even after Google has made such a point against them

– there should be a SERP check for MFA (made-for-AdSense) sites that have barely more than zero content

– Google should pay more attention to DMOZ and Google Directory

– product review spam is out of control

– provide a Spam Detection API

– devalue ALL links from sites like MySpace, Facebook, etc.

– fewer top positions for pages with A LOT of keyword stuffing

– have someone from the spam team LOOK at spam reports from the Google webmaster console, evaluate the site in the report, and take MANUAL action

– team up with Akismet and use their data on who’s spamming whom

– more focus on REAL spam, like hacked backlinks, comment spammers, forum spammers, etc.

– less apparent value allocated to keyword rich domain names

– Shutting down all the splogs on Blogger!

There are plenty more where those came from, and they are being added continuously. It is good that Google is giving its users places for feedback. The company seems genuinely interested in letting everyone have a voice and potentially listening to that voice (or at least considering it). If you have something in mind that you would like Google’s webspam team to tackle this year, drop by Matt’s post and let him know.


  • http://www.feedthegoogle.com Houston SEO

    I wonder how much of this is a wish list and how much will actually be implemented. Some of the items on that list seem to have remained on the “wish list” for many years.

    • Chris Crum

      Hard to tell, but at least they’re kind enough to ask.

      • Mitch


        and many of us are smart enough to sum up that kinda kindness!

        Hey – this is the internet, not your local corner store that really needs to look after its customers! ;-) The old days are gone!

      • Michael

        I don’t know how asking questions they already know the answer to is deemed kindness…but sure, OK.

        It seems to me, with regards to the spam team, that Google has subscribed to the school of thought that if you can’t fix it, at least look like you’re working on it.

        The fact is that Google’s spam team has had its 2009 agenda/priority list mapped out well before asking for input from SEOs. It’s just a move to make people feel they are more “in the loop”, not an actual call-to-action list from a benevolent search engine.

  • http://dofollow001.com/ AndyW

    * product review spam is out of control *

    The only one I truly agree with is the one above. This is completely out of control – oh, and affiliate links all over blogs.

    But, in many ways, this isn’t a Google problem to deal with – it’s a problem that bloggers themselves have to deal with by instituting a code of good blogging.

    Reading all that list all in one go makes me shudder a bit really – it’s like a cyber-fascist vision of the internet and Google – ban this, ban that, clamp down on them, get rid of those…

    • Chris Crum

      And that’s not even the entire list :)

  • Guest

    Google Groups is a nice free service. Unfortunately, due to lots of spam, much of which is either pornographic or from China, many people are blocking posts from Google Groups. This makes GG almost useless.

    Spammers are reported but apparently no action is taken.

  • http://www.SearchRevenues.com Paul Denhup

    This is a big one for me. I get tired of arguing with Google’s otherwise excellent AdWords Trademark Complaint Procedure team over obvious Spam/Blog, “Splog,” sites trying to cash in on clients’ legitimate trademarks. Overall, Google’s efforts to fight spam have improved over the years, but so have the efforts of those who are unscrupulous.

  • http://www.learnseowithme.wordpress.com Learning SEO with me

    I’d like to see more real communication between webmasters and Google. I have a site that had a PR of 7, now 5. Why did it slip?

    I’d like to see some info in Webmaster tools about the issues regarding your site from Google.

    • http://www.music44.com Guest

      Right on. How about an answer when they change a legitimate business’ PageRank suddenly to 0 without warning? Seems in this case it may have been related to duplicate content… our best stab in the dark right now. But with the complete lack of communication, guesses in the dark is all we have to go on. BTW, you should see the variety of SEO opinions on the boards as to what the problem is… they are all over the map. And the deeper you go down the rabbit hole, the more complex all these issues become.

      Anyway, if, for example, duplicate content can be such a problem, how about giving us a “nofollow” tool that actually works? We put “nofollow” in robots.txt and Googlebot indexed these pages anyway. Stop penalizing us for errors Googlebot makes by not following its own rules.

      Here’s an idea: How about putting a little TICK box in Sitemaps that gives webmasters the option to say “Index ONLY the pages Googlebot can access which are listed here in my sitemap, and IGNORE every other page at this domain.” That would be nice, because we’d actually gain control over keeping session IDs and other content-duplicating URL arguments out of the index.
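For what it’s worth, the robots.txt confusion in this thread is a common one: the robots.txt standard has no “nofollow” directive at all. Crawling is blocked with Disallow, while noindex and nofollow live in a meta robots tag, and a URL that robots.txt blocks from crawling can still appear in Google’s index if other pages link to it. A minimal sketch using Python’s standard-library parser (the domain and paths below are made up for illustration):

```python
from urllib import robotparser

# robots.txt blocks crawling with "Disallow"; there is no "nofollow"
# directive in robots.txt itself (nofollow/noindex belong in a
# <meta name="robots"> tag in the page's HTML).
rp = robotparser.RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /cart/",  # e.g. session-ID URLs that create duplicate content
])

# A blocked path is not fetchable; everything else is fair game.
print(rp.can_fetch("Googlebot", "http://example.com/cart/abc123"))    # False
print(rp.can_fetch("Googlebot", "http://example.com/products.html"))  # True
```

Note that a page that must stay out of the index entirely generally needs a meta robots noindex tag on a page the crawler is allowed to fetch, rather than a robots.txt block, since a disallowed URL can still be indexed from external links alone.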

  • http://web-tech.ga-usa.com Dale Reagan

    Mostly, I am pleasantly surprised with ‘good results’ that I see for long tail searches. For searches with fewer search terms the results are tougher to weed through.

    If I were a search engine I would:

    – continue to choose quality (instead of trying to ‘find spam’, ask: how do I find ‘quality content’?). While both are important, quality will always win/last – note that IMO the quality of results for Google searches continues to improve, but I am noting peaks/valleys.

    – consider placing ‘index sites’ below ‘real content’ sites in search results; seems like there are new indexes every day all trying to sell you a ‘premium’ placement listing…

    – further refine what ‘local’ means – I have seen some dilution in quality for ‘local’ listings during the past month

    BTW – my def – quality = high relevance content + useful to the searcher; very few, if any ads… Highly relevant links to products are fine but not links to ‘the world’…


    • http://www.china-biz21.com Mitch


      I couldn’t agree more with you. I guess spam would filter itself out if more emphasis were laid on “QUALITY”. This shouldn’t need too much of an explanation. It all boils down to “basics”, which is oftentimes forgotten these days.

      One more I would like to add here, though. When will Google actually rectify its spelling mistake in “AdWords”? It is spelled “MADwords”! This is another “obvious” one, and – well – it’s not spam, but one could suspect another spelling mistake…

      Google – fix yourself before you go to war with others! That’d be on my wish list.

      Nevertheless – REAL spam needs to be eliminated, and therefore we need to look into applying more “manual” detection if we can’t get our “robots” to do the job properly.


  • http://seo.propertyinwisconsin.com Rudy McCormick

    I would like to see more IP banning of people who use black hat tactics and buy links.
    I like keyword-rich domains.
    Content should be king, and more emphasis should be placed not only on h1, h2, h3, etc., but also on body text and outbound link text.

    Thanks, Matt. You guys have done a great job with all the changes, and hopefully they will keep going in the right direction.

  • http://www.doggybehave.com/sitstayfetch-review.php sitstayfetch

    I would like to see more respect shown to webmasters by Google, and also communication.

  • Dan M

    Unfortunately product reviews have value, especially if you are researching a less than reputable company.

    I don’t know how they will be able to combat that or if they even should.

    I hope they will stop counting paid links, because with a paid link it becomes much more about your advertising budget instead of the best content.

  • Guest

    Somehow, clamp down on websites that run the old reciprocal link scam on unsuspecting victims. I know of several sites that have set up link exchange programs with unsophisticated (but not necessarily low PR) sites, where the offending site uses blackhat techniques to hide their outbound links from the search engines. It looks to the recip sites like they are getting page rank juice, but in fact, the offending site is gaining a ton of one way links as far as Google is concerned. Matt: I can name names if you like.

  • http://www.avwebnet.com Robert Brady

    Google: sauna blog

    Almost all of the websites that you will find are free blogs that have been built around the “sauna” keyword; these pages are then filled with hidden (and sometimes obvious) anchor text links to items sold on Amazon, or revenue-producing links, especially AdWords ads.

    One look and it’s obvious to a human that these were specifically created to rank for keywords, then lead the unwary traveler to click on one of the links or ads, thus generating revenue for the website owner, who couldn’t care less about the products being linked to.

    I hope that there is a Google algorithm filter for these misleading websites…

  • http://hubshout.com Adam

    Sorry guys – but reading this, it is clear to me that many of these suggestions are simply unrealistic in the current SEO environment. The scale of the data and the speed at which the links move (new ones added, old ones removed, new pages posted) is simply staggering. The fact that Google does as good a job as it currently does at determining which SEO components to pay attention to is amazing. But SEO is a high-volume data game. Thus, things like manual review are really not possible.

    My biggest gripe is with the use of the term “spam”. It seems to be an ever-expanding term used to mean “anything I don’t like or that causes me trouble.” This is not a workable definition, and it actually hurts SEO. This is a confusing field for non-technical people. The fact that the term spam is defined differently by just about everybody doesn’t help. Spam means email. We need different terms for different issues (fake blogs, paid links, and duplicate content are all good). But let’s stop expanding “spam” and further confusing people trying to learn about SEO for the first time.

    • http://www.triton.net Theo

      Sounds to me like you are a pro with a brain; people get confused enough as it is to take an old comment about porn and twist it. I know spam when I see it. If it wouldn’t make your mother proud, don’t do it!

  • http://www.mulewagon.com The Mule Wagon

    …To pay more attention to The Open Directory again. This is an excellent resource for people genuinely researching non-commercial subjects.

  • http://randomplaza.com Richard C Mongler

    I personally think inclusion in DMOZ should stop affecting Google rank.

    • Ted

      You are soooo right. Black hat forums auction off editorships. Editors don’t want competitors in their keyword space. At IR2008, a speaker recommended submitting to DMOZ as part of an SEO campaign. I asked her in front of 1,000 people when was the last time she got a site listed in DMOZ. She said it had been over 3 years. She was a professional working with very real companies.

      The only way to save DMOZ at this point is to make all its links nofollow so the black hats leave.

      • Guest

        Fire the top editor for letting it turn into shit.

    • http://www.goldmineblogging.com Google Really Going To Cut Down The Spammers?

      Ha, LMAO! This is so funny. Cloaking? Are you people at Google out of your minds? They would go bankrupt at a faster rate than the US financial collapse happened. Come on, they can’t afford this, and let’s stop giving Google all the gravy. They really are not as slick as they say they are. They don’t have really cool algorithms that go and check whether something is a real content provider and not a content pretender.

  • http://www.realstudio.ro/ RealStudio Design

    It would be very nice if they could manually check reports… but they would probably get millions each day, so human checking is out of the question – unless they plan to employ every single person dismissed now that there’s a crisis.

  • Guest

    I am a strong proponent of the communication thing. As a webmaster, I get companies all the time instructing me on how to advertise their wares. I’m the expert on web design and promotion. The business owner is the expert in his or her wares and what potential customers look for in their industry – not me. So I let them guide me to a point.

    That point comes when they want me to send out a ton of email or come up with some other questionable way to attract visitors. The business owners are experts in their business – not on web design, search engine optimization or spam.

    If it weren’t for me, a business owner who knows just enough about web design to be dangerous could shoot themselves in the foot by producing spammy content without realizing what they’re doing wrong – even though they are a very legitimate business.

    I don’t like spam either but before we cut off the heads of spammers and mount them on poles, I think Google should at least have some way to let new webmasters know what they are doing is considered bad and give them a warning before breaking out the axe.
