Control is the Answer to the Filter Bubble

Written by Chris Crum

    The “Filter Bubble” has been a hot topic of discussion this week. The concept, recently discussed in a TED Talk by Eli Pariser, is essentially about the information we’re consuming being filtered by the websites we use to consume it. Think Google, Facebook, Amazon, Netflix, Huffington Post, etc.

    These sites (and many others) are feeding us information tailored to us on a personal level. Algorithms attempt to serve relevant content based on what they think we want. The problem, for those who view it as a problem, is that content is essentially being filtered out without our say in the matter.

    When we entered the discussion, alternative search engine DuckDuckGo had just put out DontBubble.us, an infographic-based site explaining the concept (and plugging DuckDuckGo). Now, founder Gabriel Weinberg is talking about this more, saying that the “real Filter Bubble debate” is not so much about whether segregating results based on personal information is good or not, but about which personal signals should be used, what controls users should have, and what results arise from the use of those signals. On his blog, he writes:

    The central point of the Filter Bubble argument is that showing different people different results has consequences. By definition, you are segregating, grouping and then promoting results based on personal information, which necessitates less diversity in the result set since other results have to get demoted in the process. Of course you can introduce counter-measures to increase diversity, but that is just mitigating the degree to which it is happening. Consequences that follow from less diversity are things like increasing partisanship and decreasing exposure to alternative viewpoints. 

    My view is that when it comes to search engines in particular, the use of personal information should be as explicit and transparent as possible, with active user involvement in creating their profiles and fine-grained control over how they are used.  Personalization is not a black and white feature. It doesn’t have to be on or off. It isn’t even one-dimensional.  At a minimum users should know which factors are being used and at best they should be able to choose which factors are being used, to what degree and in what contexts.

    If you do not do that, and instead rely on implicit inference from passive data collection (searches, clicks, etc.), then the search engine is just left to “guess” at your personal profile. And that’s why the examples from The Filter Bubble seem creepy to a lot of people. It seems like the search engine algorithm has inferred political affiliation, job, etc. without being explicitly told by the user.
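
    To make Weinberg’s idea of explicit, fine-grained control concrete, here’s a minimal sketch in Python of what a user-managed personalization profile might look like. This is my own illustration, not DuckDuckGo’s or any search engine’s actual code, and every name in it is hypothetical; the point is simply that each signal is opted into and weighted by the user, never inferred silently.

```python
# Illustrative sketch: a user-controlled personalization profile.
# Hypothetical names throughout (Factor, PersonalizationProfile, score).
from dataclasses import dataclass, field

@dataclass
class Factor:
    enabled: bool = False   # the user explicitly opts in
    weight: float = 0.0     # the user chooses how strongly it applies

@dataclass
class PersonalizationProfile:
    # Every signal is off by default; nothing is inferred behind the user's back.
    factors: dict = field(default_factory=lambda: {
        "location": Factor(),
        "search_history": Factor(),
        "political_lean": Factor(),
    })

    def score(self, base_relevance: float, signal_matches: dict) -> float:
        """Blend base relevance with only the signals the user turned on."""
        total = base_relevance
        for name, factor in self.factors.items():
            if factor.enabled:
                total += factor.weight * signal_matches.get(name, 0.0)
        return total

# A user opts in to location (handy for movie listings) but leaves
# political inference off entirely.
profile = PersonalizationProfile()
profile.factors["location"] = Factor(enabled=True, weight=0.5)
print(profile.score(1.0, {"location": 0.8, "political_lean": 0.9}))
# -> 1.4: location boosts the result; political lean never enters the score.
```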

    Here’s the video of the Filter Bubble talk, which illustrates what he’s talking about: [video: Eli Pariser’s TED Talk, “Beware Online Filter Bubbles”]

    @yegg: “The real Filter Bubble debate #fb #in http://t.co/pFV6o94 cc: @elipariser”

    “Interesting. RT @yegg: The real Filter Bubble debate #fb #in http://t.co/pFV6o94 cc: @elipariser”

    There are times when filtering results makes sense, as Weinberg points out, such as movie listings by zip code. I don’t know about you, but I don’t mind getting results that are actually relevant to me in cases like that. Still, Weinberg’s point about control is certainly a good one. Should Google and Facebook give us more control over what is being filtered out of our search results and news feeds?

    I don’t think many would complain about having more control.

    If control is the answer to the Filter Bubble, it makes the timing of this debate even more interesting, considering the FTC is investigating Google’s business practices.

    “We firmly believe you control your data, so we have a team of engineers whose only goal is to help you take your information with you,” Google’s Amit Singhal said in Google’s blog post addressing the investigation. That’s more about what you can do with your data should you choose to take it away from Google than about what you can do with your data while Google is using it.

    That said, Google does provide quite a few search options and ways to refine your searches. If you want a broader political spectrum of results, it shouldn’t be hard to find one; simply look at different publications known for their respective biases and viewpoints. If you’re not sure which publications subscribe to which ideologies (for those that do have a clear bias), you can simply Google them and find out. It’s not that hard.

    Control may be the answer to the Filter Bubble, but you have no greater control, at least in the case of search, than your own ability to research and adjust your queries. Google tries to guess which results will be most relevant to you personally using a variety of factors, but in the end it’s an algorithm making that determination, and it’s never going to be 100% accurate.
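
    Weinberg’s quote above also mentions counter-measures that increase diversity in a result set. One common technique of that sort is greedy re-ranking, often called maximal marginal relevance: each pick trades relevance off against similarity to the results already chosen. Here’s an illustrative sketch of the general idea, again not a description of how Google actually ranks:

```python
# Illustrative sketch of diversity-aware re-ranking (maximal marginal
# relevance). Not any search engine's actual algorithm; names are hypothetical.

def diversify(results, similarity, lam=0.7, k=10):
    """Greedily re-rank `results` (dicts with a 'score') so each pick balances
    relevance against similarity to what's already been chosen.
    `similarity(a, b)` returns a value in [0, 1]; lam=1.0 is pure relevance."""
    chosen, remaining = [], list(results)
    while remaining and len(chosen) < k:
        def mmr(r):
            max_sim = max((similarity(r, c) for c in chosen), default=0.0)
            return lam * r["score"] - (1 - lam) * max_sim
        best = max(remaining, key=mmr)
        chosen.append(best)
        remaining.remove(best)
    return chosen

# Toy run: two near-duplicate partisan results and one alternative viewpoint.
results = [
    {"id": "left_1", "score": 0.90},
    {"id": "left_2", "score": 0.88},
    {"id": "right_1", "score": 0.60},
]
sim = lambda a, b: 1.0 if a["id"].split("_")[0] == b["id"].split("_")[0] else 0.0
print([r["id"] for r in diversify(results, sim, lam=0.7, k=3)])
# -> ['left_1', 'right_1', 'left_2']: the alternative viewpoint gets promoted.
```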

    Facebook’s a little different. It’s harder to control what you see in the News Feed. You can block things, but it’s hard to tell when something is being hidden from you in the first place. I guess that gives you a reason to visit people’s Walls more often.
