Google Makes Whitelist Admission

Google’s often made a big deal about using automated solutions to handle problems. The search giant hasn’t tried to address every single Google bomb individually, for example; it’s preferred to make algorithm changes that can tackle lots of weaknesses at once. Now, though, an admission related to whitelists is getting some attention.

At SMX West, Danny Sullivan put a question to Google’s Matt Cutts: “So Google might decide there’s some particular signal within the overall ranking algorithm that works for say 99% of the sites as Google hopes, but maybe that also hits a few outlying sites in a way they wouldn’t expect – in a way they feel harms the search results – then Google might except those sites?”

Cutts indicated that Google might indeed do so (no exact quote is available).

That admission is earning Google some negative attention, given the company’s traditional claims. Cade Metz pointed out that, if Google has been giving some sites special attention, European antitrust regulators might now want to revisit old decisions.

Lots of small business owners and site administrators may have some new, and not so polite, questions and accusations for Google as well.

Google has tried to address the matter, though. The company said in a statement:

“Our goal is to provide people with the most relevant answers as quickly as possible, and we do that primarily with computer algorithms. In our experience, algorithms generate much better results than humans ranking websites page by page. And given the hundreds of millions of queries we get every day, it wouldn’t be feasible to handle them manually anyway.

“That said, we do sometimes take manual action to deal with problems like malware and copyright infringement. Like other search engines (including Microsoft’s Bing), we also use exception lists when specific algorithms inadvertently impact websites, and when we believe an exception list will significantly improve search quality. We don’t keep a master list protecting certain sites from all changes to our algorithms.

“The most common manual exceptions we make are for sites that get caught by SafeSearch, a tool that gives people a way to filter adult content from their results. For example, “essex.edu” was incorrectly flagged by our SafeSearch algorithms because it contains the word “sex.” On the rare occasions we make manual exceptions, we go to great lengths to apply our quality standards and guidelines fairly to all websites.

“Of course, we would much prefer not to make any manual changes and not to maintain any exception lists. But search is still in its infancy, and our algorithms can’t answer all questions.”
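
As an aside, the “essex.edu” anecdote describes a classic substring-matching pitfall. Here’s a minimal, purely hypothetical Python sketch (not Google’s actual SafeSearch implementation; the keyword list and function names are invented for illustration) showing how naive substring matching misflags innocent domains, and how word-boundary matching avoids this particular case:

import re

# Hypothetical illustration only -- this is not Google's SafeSearch logic.
# BLOCKED_KEYWORDS is an invented example list.
BLOCKED_KEYWORDS = ["sex"]

def naive_flag(domain: str) -> bool:
    # Flags a domain if any blocked keyword appears anywhere in it,
    # even buried inside a longer, innocent word.
    return any(keyword in domain for keyword in BLOCKED_KEYWORDS)

def word_boundary_flag(domain: str) -> bool:
    # Flags only when a blocked keyword stands alone as a whole word,
    # so "sex" inside "essex" no longer matches.
    return any(re.search(rf"\b{re.escape(keyword)}\b", domain)
               for keyword in BLOCKED_KEYWORDS)

print(naive_flag("essex.edu"))          # True: the false positive Google describes
print(word_boundary_flag("essex.edu"))  # False: "sex" is only part of "essex"

Real adult-content filters presumably weigh many more signals than this, which is exactly why the edge cases that slip through end up on manual exception lists.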
