What Does Google Suggest About You?

    September 27, 2005
    WebProNews Staff

Google Suggest, like a mind-reading twin soul, will finish your sentences for you. But you may be shocked when those sentences finish as racist remarks echoed by millions of others.

The blogger who pointed this out admits it is not Google’s fault. It may just be an enormously telling reflection of global society.

Google Suggest, a feature that offers completions for search queries as they are typed, is suggesting some pretty awful endings for phrases like “blacks are,” “whites are,” “Jews are,” “Americans are,” et cetera.
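Google has not published how Suggest ranks its completions, but the feature is generally understood to draw on aggregated query data. The following is a minimal conceptual sketch of frequency-ranked prefix completion; the `suggest` function and the sample query log are invented for illustration and do not reflect Google’s actual implementation.

```python
from collections import Counter

def suggest(prefix, query_log, limit=5):
    """Return the most frequent logged queries that start with `prefix`."""
    counts = Counter(q for q in query_log if q.startswith(prefix))
    return [q for q, _ in counts.most_common(limit)]

# Hypothetical query log standing in for aggregated search data.
log = ["weather today", "weather radar", "weather today", "web mail"]
print(suggest("weather", log))  # most frequent match listed first
```

Under a scheme like this, a completion surfaces simply because many people typed it, which is why the suggestions can mirror the prejudices of the searching public rather than any editorial choice.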

“What percentage of the web is related to sex, crime, slander, and racism? What percentage of blogs are bias[ed], and rant based?” asks blogger “questsin” at the questsin everything blog.

The answer may be higher than you think. As evidence, questsin captured screenshots of the prejudiced suggestions.

We’ll close with questsin’s comments on the matter.

“It[’]s not my inten[t]ion to bring grief to Google. Google is not to blame. They are only the medium; we are the message.”

The remarks outlined in this article in no way reflect the feelings of the author or his publisher and are intended for illustrative purposes only.