UPDATE: Google SafeSearch Changes Hit the U.K., Australia, New Zealand, and More
Google has just made their Image search worse in an effort to protect your virgin eyes.
If you’re in the U.S. and trying to search for boobs on Google Images right now, you’re going to have a tougher time. That’s because Google has prevented U.S. users from disabling SafeSearch. And if you want to find NSFW images, you’re going to have to be more specific with your searches.
Google users should be familiar with the SafeSearch toolbar at the top right of Image searches. Until recently, that bar allowed users to select MODERATE, STRICT, or OFF. As of right now, those options have been removed from the drop-down menu – but only in the U.S.
Here’s what it currently looks like:
And here’s what it looks like on Google.co.uk (and other countries). This is what it looked like in the U.S. before today:
Note that “SafeSearch” is the only option for U.S. users. For a search for “boobs,” for instance, this means the results will not feature any “explicit” images (nipples showing). The only other option is to filter results, which in our “boobs” search filters out everything. “The word ‘boobs’ has been filtered from the search because Google SafeSearch is active,” the results page reads.
It would appear that the only two options Google is giving U.S. users are STRICT and MODERATE. Or in other words, you can’t turn off SafeSearch.
The default “SafeSearch” results for U.S. users are exactly the same as the MODERATE SafeSearch results for U.K. users:
If you go to Google’s SafeSearch help page, they tell you how to disable SafeSearch from your settings:
Here’s how to disable SafeSearch:
- Visit the Google Preferences page.
- In the “SafeSearch Filtering” section, select Do not filter my search results.
- Click Save Preferences.
But when U.S. users visit their search settings, they are only given the option to filter explicit results:
But moderate SafeSearch is already on by default.
If you go to another country’s Google, say Google.com.bz, the search settings provide the option to turn on “no filtering.”
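For the curious, the old three-way setting was also reflected in Google’s search URLs, which have long accepted a “safe” query parameter (“off,” “moderate,” “active”). Whether google.com still honors “safe=off” after this change is exactly what’s in question; the sketch below only shows how such a URL is built, with the “tbm=isch” images parameter included as an assumption about the URL format:

```python
from urllib.parse import urlencode

def image_search_url(query, safe="moderate", domain="www.google.com"):
    # "tbm=isch" selects the Images vertical; "safe" is the
    # historically documented SafeSearch level (off/moderate/active).
    params = urlencode({"q": query, "tbm": "isch", "safe": safe})
    return f"https://{domain}/search?{params}"

print(image_search_url("sunsets", safe="off"))
# e.g. https://www.google.com/search?q=sunsets&tbm=isch&safe=off
```

Swapping the domain for a non-U.S. one (say, google.com.bz) is how you’d test whether the parameter behaves differently by country.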
So, what gives? Is Google guilty of some seriously awful censorship? The quick answer is no, but it’s a little more complicated than that.
You see, you have to be specific if you want NSFW results. Searching for “boobs porn” does give you plenty of nudity. But for searches on words without specific qualifiers, you still don’t see any of these results. In essence, Google has changed their search settings to only display adult results when queries are specifically adult-oriented.
That means for all intents and purposes, users in the U.S. now only have two options – default SafeSearch and filter explicit results. The default SafeSearch is akin to MODERATE. The option to turn off SafeSearch completely for all results is gone in the U.S. And the big question is why? What was wrong with the old Google image search format (and the one seen in the rest of the world)? Why did Google feel the need to change it?
Let’s look at two different responses we received from Google. First, Google Webmaster Trends Analyst John Mueller had this to say:
“The default should continue to behave similarly to what most users have had as the default so far (“moderate”). Our algorithms are designed to downgrade explicit content when you’re not specifically looking for it. If a search term is very explicit, relevant adult content may show up, but we’ll err on the conservative side. So if you want to see adult content in Image Search, just make it clear with the query — we’ll show the most relevant content for each search.”
Now from another Google spokesperson:
“We are not censoring any adult content, and want to show users exactly what they are looking for — but we aim not to show sexually-explicit results unless a user is specifically searching for them. We use algorithms to select the most relevant results for a given query. If you’re looking for adult content, you can find it without having to change the default setting — you just may need to be more explicit in your query if your search terms are potentially ambiguous. The image search settings work the same way as in web search.”
Ok, so the point here is that users need to be specific with their searches. Got it. Apologies for the frankness, but if I want to find blowjob images, I now have to search “blowjob porn.” There is now no way that I can edit my own personal settings to make a search for just “blowjob” yield all results, both NSFW and otherwise.
In essence, Google is fragmenting their image search. A “no filter” search is a true search of the most popular images across the web. U.S. users no longer have this option. We’re now only given the choice between filtered results for “blowjob” or the most popular results for “blowjob porn.” That mix of all results, both NSFW and SFW, for the query “blowjob” can no longer be achieved.
Plus, is there really a question about what I’m looking for when I search “blowjob?” Do I really need to provide any more detail?
It seems like a big gripe about a small change, and it is in a way. But one could make the argument that this actually is a form of censorship. If I go to Google images and search “blowjob,” I want to see the best of what the web has to offer – all of it. Not what Google thinks I should see based on their desire to prevent adult results unless users are super specific.
Go ahead and try a search for “blowjob” on Google Images right now. Those aren’t really very relevant results, are they? Users should see the most relevant results for their searches, no matter what. And they should have the option to simply turn off the SafeSearch filter, which they all had just a couple of days ago.
How about you? Do you think this is a form of censorship? Are we making mountains out of molehills? Could this Image search tweak make you seek out a competitor’s image search? Tired of reading the word “blowjob?” Let us know what you think in the comments.