Should Google and Facebook Be Filtering Our Content For Us?

Written by Chris Crum

Is the personalization of the Internet a step backwards? Is the wealth of information that is accessible to us being reduced because the products we use are filtering it all so heavily? This is a discussion that has been gaining momentum in recent weeks.

Do you want your search results and news tailored to your tastes, or do you want more control? Let us know in the comments.

The topic was raised most recently by alternative search engine DuckDuckGo, which calls out the major search engines for filtering content too heavily. DuckDuckGo has set up a site at DontBubble.us, a graphical slideshow that illustrates its point. Stripped of the graphics and subtext, it reads:

When you search the Internet, search engines now show different results to different people. Results are tailored to who you are, based on your search history and your click history. Since you often click on things you agree with, you keep getting more and more of what you already agree with, which means other stuff gets demoted (effectively filtered). This begs the question: what are you missing?

In other words, you are living in a Filter Bubble that promotes things it thinks you’ll like, and demotes (effectively filters) out some of the rest, which may limit your exposure to opposing information. Unfortunately, it’s not easy to pop your filter bubble, because the technology is used so much across the Internet.
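
The loop DontBubble.us describes is easy to see in miniature. Here's a deliberately simplified Python sketch, my own illustration rather than any engine's published scoring, in which a user's click history becomes a ranking boost and everything else quietly sinks:

```python
from collections import Counter

# Hypothetical click log: topics this user has clicked on in the past.
click_history = Counter({"liberal-politics": 40, "tech": 25,
                         "conservative-politics": 3})

def personalized_rank(results, history):
    """Re-rank results by boosting topics the user clicks on most.

    `results` is a list of (title, topic, base_relevance) tuples.
    The boost is proportional to a topic's share of past clicks, so
    rarely clicked topics are effectively demoted -- the bubble forming.
    """
    total = sum(history.values()) or 1
    def score(result):
        _title, topic, relevance = result
        return relevance * (1 + history[topic] / total)
    return sorted(results, key=score, reverse=True)

results = [
    ("Tax bill analysis (right-leaning)", "conservative-politics", 0.8),
    ("Tax bill analysis (left-leaning)", "liberal-politics", 0.8),
]
# Two equally relevant stories, but past clicks decide the order:
print([title for title, _, _ in personalized_rank(results, click_history)])
```

Run it and the left-leaning story always comes out on top, even though both stories start with identical relevance. That, in a few lines, is the bubble.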

From there, the slideshow turns into an ad for DuckDuckGo:

We offer you an alternative: a search engine that breaks you out of your Filter Bubble by default, plus other differences like real privacy.

Founder Gabriel Weinberg discussed these differences with WebProNews in an interview earlier this year.

Of course, DuckDuckGo is not above some level of filtering itself. It has already filtered out results from sites like eHow, which many may applaud but others may not appreciate. For all the controversy that has surrounded eHow, the site has its fans, and Demand Media, which owns it, says it is taking action to improve its quality. The point is that some filtering is going on, though it happens at the human level rather than the personalized algorithmic level.

The “Filter Bubble”

This “Filter Bubble” DuckDuckGo speaks of is a concept Eli Pariser discussed in a recent TED Talk.

Pariser had some interesting things to say, speaking directly to executives from Facebook, Google, Microsoft, Yahoo, and other companies, who were in the audience. In his presentation, he included a couple of interesting quotes – from Facebook CEO Mark Zuckerberg and Google Executive Chairman Eric Schmidt:

“A squirrel dying in front of your house may be more relevant to your interests right now than people dying in Africa.” – MZ

“It will be very hard for people to watch or consume something that has not in some sense been tailored for them.” – ES

Pariser talked about being a political progressive who nonetheless likes to hear what conservatives have to say, only to notice that all of the conservative posts had disappeared from his Facebook News Feed; Facebook had seen that he was clicking more on liberal links than conservative ones.

“Facebook isn’t the only place that’s doing this kind of invisible algorithmic editing of the web,” he said. “Google’s doing it too. If I search for something and you search for something even right now at the same time, we may get very different search results…There is no standard Google anymore.”

This is a fairly well-known fact, but that doesn’t make it any less of a nightmare for SEOs.

He talked about having several of his friends search for “Egypt” and send him screenshots of their results, only to find they were very different. One person’s results didn’t even include any stories about the recent protests, apparently while they were the “big story of the day.” He went on to note that many sites (he mentioned the Huffington Post, the Washington Post, the New York Times and Yahoo News) engage in some form of personalized content delivery.

If you take all of these filters/algorithms together, you get a filter bubble, he says – your own personal unique bubble of info, which depends upon who you are and what you do, and “you don’t decide what gets in. You don’t actually see what gets edited out.”

He equates the phenomenon to the “passing of the torch from human gatekeepers to algorithmic ones,” with the human gatekeepers being traditional news editors. If algorithms are going to curate the world and decide what to show us, he says, we need to make sure they’re not just keyed to relevance, but that they also show us things that challenge us or make us uncomfortable; in other words, they should give us other points of view.
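
What Pariser is asking for can be sketched in the same toy model as the earlier snippet: don’t rank purely by personalized relevance, but reserve some slots for material outside the user’s usual clicks. Again, this is my hypothetical illustration, not anything Google or Facebook has described:

```python
from collections import Counter

def rank_with_challenge_slots(results, history, challenge_every=3,
                              rare_share=0.05):
    """Rank by personalized score, but fill every `challenge_every`-th
    slot with the most relevant item from a topic the user rarely
    clicks -- a crude version of "show us things that challenge us".
    """
    total = sum(history.values()) or 1
    familiar = sorted(results,
                      key=lambda r: r[2] * (1 + history[r[1]] / total),
                      reverse=True)
    challenging = sorted((r for r in results
                          if history[r[1]] / total < rare_share),
                         key=lambda r: r[2], reverse=True)
    ranked = []
    for slot in range(1, len(results) + 1):
        pool = challenging if slot % challenge_every == 0 else familiar
        pick = next((r for r in pool if r not in ranked), None)
        if pick is None:  # pool exhausted; fall back to the overall order
            pick = next(r for r in familiar if r not in ranked)
        ranked.append(pick)
    return ranked

history = Counter({"liberal-politics": 40, "tech": 25,
                   "conservative-politics": 3})
results = [("Left op-ed", "liberal-politics", 0.9),
           ("Gadget review", "tech", 0.7),
           ("Right op-ed", "conservative-politics", 0.85),
           ("Left feature", "liberal-politics", 0.6)]
# The right-leaning piece gets promoted into slot 3 despite its low
# personalized score:
for title, _, _ in rank_with_challenge_slots(results, history):
    print(title)
```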

“We need you to give us some control,” he told the executives in the audience.

I might argue that we as users do have control. We choose which services to use, which people or brands to follow, which publications to read, and so on. If you limit your content intake to what Facebook shows you in the News Feed or what Google returns in search results, then yes, you are succumbing to the algorithmic editors. Pariser makes some great points.

However, it is ultimately up to us humans to dictate how we go about consuming our information. Even Google and Facebook offer ways to see the news we want. You can use Google Reader, for example, subscribe to every RSS feed your heart desires, and see every headline from every publication that offers a feed. Getting through all of your feeds can be quite a task if you’re subscribed to too many, but you are still in control of how you consume that information. If you want conflicting viewpoints, you can subscribe to both Fox News and MSNBC.
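
If you’d rather skip the reader entirely, a few lines of Python with the feedparser library will pull every headline yourself, with no algorithm deciding what you see. The feed URLs below are placeholders; substitute the real RSS/Atom feeds of whichever outlets you want, including ones you disagree with:

```python
import feedparser  # third-party: pip install feedparser

# Hypothetical feed URLs -- replace with real ones.
FEEDS = [
    "https://example.com/foxnews/rss.xml",
    "https://example.com/msnbc/rss.xml",
]

def latest_headlines(feed_urls, per_feed=5):
    """Print recent headlines from every feed, unfiltered and in order."""
    for url in feed_urls:
        feed = feedparser.parse(url)
        source = feed.feed.get("title", url)
        for entry in feed.entries[:per_feed]:
            print(f"[{source}] {entry.title} -> {entry.link}")

latest_headlines(FEEDS)
```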

If Google is returning MSNBC links for your news searches, you can go to Fox News and search for the same topics there, and vice versa.

All of that said, things certainly get more complex when you’re talking about non-news content; there’s a lot of gray area.

Google’s 57 Signals

Pariser also says a Google engineer told him that Google looks at 57 signals to personally tailor your query results. These, I presume, are a subset of the more than 200 ranking signals Google’s algorithm employs overall. Pariser says the 57 include things like what kind of computer you’re on, what browser you’re using, and where you’re located. Google doesn’t like to name its signals, though it does let us know about certain ones from time to time.

René Pickhardt, a web science PhD student, took a crack at naming at least 40 of them. They are by no means confirmed by Google, but it’s an interesting compilation, including things like search history, frequency of searches, age, sex, and use of advanced search commands.
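
We can only guess at what most of the 57 look like in practice, but a few of the ones Pariser names (browser, machine, location) are ordinary request-level data. Here is a purely hypothetical sketch of how such signals might be derived from a single HTTP request; nothing here reflects Google’s actual implementation:

```python
def request_signals(headers, ip_to_region):
    """Derive a handful of coarse personalization signals from one request.

    `headers` is a dict of HTTP headers; `ip_to_region` is a hypothetical
    IP-geolocation lookup (in practice you'd use a GeoIP database).
    """
    user_agent = headers.get("User-Agent", "")
    return {
        "browser": ("Firefox" if "Firefox" in user_agent
                    else "Chrome" if "Chrome" in user_agent
                    else "other"),
        "platform": "mobile" if "Mobile" in user_agent else "desktop",
        "language": headers.get("Accept-Language", "en").split(",")[0],
        "region": ip_to_region(headers.get("X-Forwarded-For", "0.0.0.0")),
    }

signals = request_signals(
    {"User-Agent": "Mozilla/5.0 (X11; Linux x86_64; rv:109.0) Firefox/115.0",
     "Accept-Language": "en-US,en;q=0.9",
     "X-Forwarded-For": "203.0.113.7"},
    ip_to_region=lambda ip: "US",  # stub lookup for the example
)
print(signals)  # {'browser': 'Firefox', 'platform': 'desktop', ...}
```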

In the end, there is simply a ridiculous amount of information at our disposal, with more uploaded to the web every second. The concept of the filter bubble charges that our access to all of it is limited by what the algorithmic gatekeepers think we should be seeing. On the flip side, those gatekeepers are tasked with providing the information they deem most relevant to our daily content consumption (and search) needs; without such filtering, they could be said to be adding more noise. It’s a complex issue on which opinions vary: convenience versus information overload.

What do you think? Should Google, Facebook and others be filtering results based on who we are? Share your thoughts.
