Google Sued Over “Jewish” Autocomplete Suggestions

Written by Josh Wolford
    If you’ve spent any time on humor sites, forums, or user-submitted content aggregators like reddit, you have probably seen Google’s autocomplete search feature used as a tool for discovering the sometimes fascinating, sometimes downright odd, and oftentimes frightening collective queries of the internet population. If you want to see this in action, just go to Google and type “Why can’t I” or “Should you” or “British people are.” You’ll see that people are actively searching some pretty weird stuff.

    While autocomplete can produce this decidedly comedic result, it’s not a laughing matter for some who have accused the feature of having untold reputation-ruining powers. Today, Google is being sued over its autocomplete feature, and it’s definitely not the first time the company has faced these allegations.

    The newest lawsuit comes from France, where anti-discrimination group SOS Racisme has accused Google of the “creation of what is probably the greatest Jewish file in history.”

    French site La Cote reports (Google translation):

    “Numerous users of the leading search engine in France and worldwide are confronted daily with the unsolicited and almost systematic association of the term ‘Jew’ with the names of the most prominent figures in politics, media, and business,” the organizations lament.

    The claim is that Google’s autocomplete feature is mislabeling celebrities, politicians, and other high-profile people by suggesting “Jew” or “Jewish” next to their names in possible search queries. These celebs include News Corp’s Rupert Murdoch and actor Jon Hamm. A search for “rupert m…”, for instance, suggests “Rupert Murdoch jewish” as its fourth option.

    As you’re most likely well aware, Google isn’t sitting back there hand-picking these suggestions. They are the result of an algorithm that takes into account popular searches from other users as well as your own previous Google activity (if you’re logged in).

    Here’s how Google describes its autocomplete feature:

    As you type, Google’s algorithm predicts and displays search queries based on other users’ search activities and the contents of web pages indexed by Google. If you’re signed in to your Google Account and have Web History enabled, you might also see search queries from relevant searches that you’ve done in the past. In addition, Google+ profiles can sometimes appear in autocomplete when you search for a person’s name. Apart from the Google+ profiles that may appear, all of the predicted queries that are shown in the drop-down list have been typed previously by Google users or appear on the web.

    For certain queries, Google will show separate predictions for just the last few words. Below the word that you’re typing in the search box, you’ll see a smaller drop-down list containing predictions based only on the last words of your query. While each prediction shown in the drop-down list has been typed before by Google users or appears on the web, the combination of your primary text along with the completion may be unique.

    Predicted queries are algorithmically determined based on a number of purely algorithmic factors (including popularity of search terms) without human intervention. The autocomplete data is updated frequently to offer fresh and rising search queries.
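    To make that description concrete, here is a minimal sketch of popularity-ranked prefix completion over a hypothetical query log. It is only an illustration of the general idea Google describes above, under those simplifying assumptions; it is not Google’s actual system, which also weighs indexed web pages, personal Web History, and query freshness.

```python
from collections import Counter

# Hypothetical query log standing in for aggregated user searches.
QUERY_LOG = [
    "why can't i own a canadian",
    "why can't i sleep",
    "why can't i sleep",
    "should you rinse pasta",
    "should you rinse pasta",
    "should you crack your knuckles",
]

query_counts = Counter(QUERY_LOG)  # how often each full query was typed

def autocomplete(prefix: str, limit: int = 4) -> list[str]:
    """Return the most frequently typed logged queries starting with `prefix`."""
    prefix = prefix.lower()
    matches = [(count, query) for query, count in query_counts.items()
               if query.startswith(prefix)]
    # Rank purely by popularity (most-typed first), ties broken alphabetically.
    matches.sort(key=lambda pair: (-pair[0], pair[1]))
    return [query for _, query in matches[:limit]]

print(autocomplete("why can't i"))
# ["why can't i sleep", "why can't i own a canadian"]
```

    The point of the sketch is that the suggestions are nothing more than a ranking of what other people have already typed – which is exactly why unflattering or untrue queries can surface without anyone at Google choosing them.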

    That lack of manual intervention has gotten Google in trouble in the past. Back in December of 2011, Google was ordered to pay a $65,000 fine because of an autocomplete suggestion directed toward a French insurance company called Lyonnaise de Garantie. One suggestion inserted the word “escroc,” which means “crook.” In the ruling, the court emphasized that it felt Google should exercise some human control over these autocomplete suggestions.

    Google also found itself in trouble in Japan earlier this year after autocomplete associated a man with crimes he apparently did not commit.

    It’s important to note that Google does manually exclude some autocomplete suggestions in very limited circumstances – those having to do with “pornography, violence, hate speech, and copyright infringement.”

    Having “Jew” or “Jewish” pop up as a suggestion with some people’s names is simply a reflection of that term’s popularity on the internet. It’s no different than the second suggestion that pops up when you search “Obama is,” though a tad different from the fourth result.

    The point is, people are going to search for untrue things. Jon Hamm may not be Jewish, but apparently enough people have heard that he is and are checking it out. I’m also aware that labeling certain high-profile public figures as “Jews” is a negative in the eyes of many. But “Jew” or “Jewish” doesn’t fall into one of those categories that would demand an intervention from Google. It’s not hate speech to say someone is Jewish, even if the people searching for it might have hate on their minds.

    But as we’ve seen, Google is vulnerable to this sort of lawsuit. The word “escroc” doesn’t qualify as pornographic, violent, hate speech, or promoting copyright infringement – it simply harms a reputation. Nevertheless, Google had to pay a fine and remove it.

    Should Google really have to take action on autocomplete results? Tell us what you think in the comments.

    [Via The Hollywood Reporter]
