Google Fined $65K Over Defamatory Search Suggestion

Written by Chris Crum

It would appear that France isn’t a big fan of Google’s automated search suggestions. The company has lost lawsuits related to the feature in that country on more than one occasion.

Now, a court has ordered Google France to pay €50,000 (about $65,000) over a search suggestion that appeared alongside queries for an insurance company called Lyonnaise de Garantie. Google apparently suggested adding the word “escroc”, French for “crook”, to the query. French news site The Local reports:

A Paris court held that the addition of the offending word “was offensive towards the company.” The court said that Google should be able to exercise “human control” over the functioning of words suggested by its search engine.

Google said the auto-complete functionality was not the “expression of a human thought”, an “opinion”, or a “value judgement or criticism”, but the result of its automatic algorithm.

Google explains how the feature works in a help center article:

As you type, Google’s algorithm predicts and displays search queries based on other users’ search activities. In addition, if you’re signed in to your Google Account and have Web History enabled, you may see search queries from relevant searches that you’ve done in the past. All of the predicted queries that are shown in the drop-down list have been typed previously by Google users.

For certain queries, Google will show separate predictions for just the last few words. Below the word that you’re typing in the search box, you’ll see a smaller drop-down list containing predictions based only on the last words of your query. While each prediction shown in the drop-down list has been typed before by Google users, the combination of your primary text along with the completion may be unique.

Predicted queries are algorithmically determined based on a number of purely objective factors (including popularity of search terms) without human intervention. The autocomplete data is updated frequently to offer fresh and rising search queries.

Google does have what it refers to as a “narrow set of removal policies” in place for porn, violence, hate speech, and terms that are frequently used to find content that infringes upon copyrights.
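Taken together, the description amounts to a popularity-ranked prefix lookup over past queries, with a filter layered on top for the removal policies. The short Python sketch below is purely illustrative: the sample queries, the blocklist, and the suggest() function are invented for this article and are not Google's actual implementation.

from collections import Counter

# Toy data standing in for a query log; these strings are invented
# for illustration and are not real search data.
past_queries = [
    "lyonnaise de garantie avis",
    "lyonnaise de garantie avis",
    "lyonnaise de garantie contact",
    "lyonnaise de garantie escroc",
]

# A tiny stand-in for a removal-policy blocklist.
blocklist = {"escroc"}

def suggest(prefix, queries, limit=3):
    """Return past queries starting with `prefix`, most frequent first,
    skipping any query that contains a blocklisted word."""
    counts = Counter(q for q in queries if q.startswith(prefix.lower()))
    allowed = [(q, n) for q, n in counts.items()
               if not blocklist & set(q.split())]
    allowed.sort(key=lambda item: item[1], reverse=True)
    return [q for q, _ in allowed[:limit]]

print(suggest("lyonnaise de garantie"))
# ['lyonnaise de garantie avis', 'lyonnaise de garantie contact']

In a sketch like this, the disputed completion never reaches the drop-down once the offending word is added to the blocklist, which is roughly the kind of manual intervention the removal policies describe.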

The suggestion appears to have been removed.
