
Are You Trapped in a News Bubble?

The news filtered to you might leave out important things you need to know. But how can you tell?

Do you get a lot of your news from social media services, like Facebook and Twitter? You are not alone. Two-thirds of Americans get at least some news on social media.

Do you mainly learn about new products or decide what to buy via big online shopping services like Alibaba or Amazon? So do hundreds of millions of people worldwide.

Do you mainly search for information using the Google search engine, and choose an answer from the first few returned results? So do three out of four search engine users.

The large majority of internet users are living in a filter bubble. But what if you, individually, need the information that the bubble filters out?

If you don’t even know what the bubble filters out, how do you know if you need it or not?

A filter bubble is a form of intellectual isolation. It forms when you are exposed to only a small segment of the information available on a topic or about the world, such as news or search results. The danger can be expressed in an old computing adage: garbage in, garbage out.

Filter bubbles are created by two separate mechanisms. Either mechanism can form a filter bubble on its own, but they most often reinforce one another.

First, information may be purposefully filtered by the content provider to shape your opinions or actions. Think of the way food is sold. When you walk into the lobby of a movie theatre, you smell popcorn. When you walk into an ice cream parlor, you smell waffle cones.

You might think that these smells are just a natural side effect of the fact that those items are being cooked. But a smart retailer might waste a little popcorn or a few waffle cones simply to get the smell into the air so that more customers will react and buy. In fact, some businesses use equipment that generates these lovely smells in public places. There is a term for that: choice architecture. Every successful business designs items, colors, and pathways in such a way as to get people to spend their money.

Online retailers who adjust their search results are not, in principle, doing anything different from brick-and-mortar businesses. What is perhaps a bit more controversial is that search engines and other content providers may be selling you a worldview when they shape their offerings in the same way.

The second way a filter bubble forms is through the analytics run on the social graph (the media technology on which these systems are built). The information that social media, search, retail, financial, and other service providers collect about you is fed into that social graph to determine what you are likely to buy.

If you live in a particular neighborhood in Denver, and you like a particular kind of coat, then it is likely that other people who live in the same neighborhood will also like that kind of coat. The more information that can be discovered about you, down to the smallest detail, the more accurately these models can predict what you will like (or not). Advertisers and social media services can use this information to predict what you will click on, and hence what you will read or like. The better they can predict these things, the more easily they can place advertisements that will convince you to buy something. Again, somewhat more controversially, they can also predict, with a fairly high degree of accuracy, what will sway your beliefs. Even what you see in a search box can have an effect.
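To make the mechanism concrete, here is a minimal sketch of that “people like you probably like this” logic, written in Python with made-up profiles and a deliberately crude similarity score. Real services draw on far more signals and far more sophisticated models; this only illustrates the shape of the prediction.

```python
# A minimal, hypothetical sketch of "people like you probably like this."
# All profiles, attributes, and the scoring rule are invented for illustration.

from collections import Counter

# Profiles a service might have collected: a few attributes plus past "likes."
users = [
    {"neighborhood": "Capitol Hill", "age_band": "25-34",
     "liked": {"wool coat", "hiking boots"}},
    {"neighborhood": "Capitol Hill", "age_band": "25-34",
     "liked": {"wool coat", "espresso maker"}},
    {"neighborhood": "Highlands", "age_band": "35-44",
     "liked": {"rain shell", "road bike"}},
]

def predict_likes(profile, users, top_n=2):
    """Score items by how often similar users liked them."""
    scores = Counter()
    for other in users:
        # Crude similarity: count how many attributes the two profiles share.
        similarity = sum(profile[k] == other[k] for k in ("neighborhood", "age_band"))
        for item in other["liked"]:
            scores[item] += similarity
    return [item for item, _ in scores.most_common(top_n)]

# A new user about whom the service knows only two details.
new_user = {"neighborhood": "Capitol Hill", "age_band": "25-34"}
print(predict_likes(new_user, users))  # items most liked by the most similar users
```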

A good example of a filter bubble is deciding whether to buy something based on the reviews at a single online retailer. It might go something like this: first, you search for the product. The returned list can be sorted, so you sort it by average user review. Then you examine the top two or three products more closely, looking at the individual reviews for each one.

Look first at the search results. Are you seeing all the products available, or only a few? Perhaps the online retailer is actually returning only the results with the highest profit margin for its business, or the products you are most likely to buy.

Look second at the reviews. Maybe the reviews you are reading were all written by people who bought the product in question and have used it for some time. Or maybe not. The online retailer might be showing or sorting the reviews based on what its system predicts is most likely to convince you to hit the Buy button. It could also be that many of the reviews are fake, or were written by people who bought the product but are not accurately assessing its quality. There is really no way to tell.
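To see how invisible this kind of ranking choice is, consider a small Python sketch with entirely invented numbers. It is not how any particular retailer ranks its results; it only shows that sorting the same catalog by shopper-friendly criteria (average rating) or by retailer-friendly criteria (margin times predicted chance of purchase) quietly produces different orderings, and the page looks the same either way.

```python
# Hypothetical catalog; every figure here is invented for illustration.
products = [
    {"name": "Coat A", "avg_rating": 4.8, "margin": 12.0, "buy_prob": 0.10},
    {"name": "Coat B", "avg_rating": 4.2, "margin": 45.0, "buy_prob": 0.20},
    {"name": "Coat C", "avg_rating": 4.6, "margin": 30.0, "buy_prob": 0.15},
]

# What the shopper assumes they are getting: best-reviewed products first.
by_rating = sorted(products, key=lambda p: p["avg_rating"], reverse=True)

# What the retailer might actually optimize: expected profit per impression.
by_expected_profit = sorted(products, key=lambda p: p["margin"] * p["buy_prob"],
                            reverse=True)

print([p["name"] for p in by_rating])           # ['Coat A', 'Coat C', 'Coat B']
print([p["name"] for p in by_expected_profit])  # ['Coat B', 'Coat C', 'Coat A']
```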

We can see two major problems. First, filter bubbles can sway your beliefs. Suppose every post you see on a social media site’s timeline supports a particular candidate for office, and every internet search you do shows only positive results for one candidate and negative ones for the other. Not only that, but every bit of news you hear or read is positive for one candidate and negative for the other. Your beliefs about which candidate is best are going to be swayed. We can also apply this caution to religious beliefs, beliefs about people, beliefs about political solutions, beliefs about what kind of lives we should live, and any other belief.

Second, filter bubbles can be self-reinforcing, creating a “one-way ratchet” in your beliefs so that they become stronger. The more you see of a particular product, the more likely you are to “like” that product. The more you “like” that product, the more the media technology underlying all these systems will show you similar or adjacent products.

The same holds for beliefs: the more you are swayed to one way of thinking about an issue, the more you are going to see positive information about that issue (and surrounding issues). The deeper you fall into the filter bubble, the stronger it becomes, because the system grows ever more convinced that this is what you like, and these systems are highly tuned to show you what you like.
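A toy simulation can make that ratchet visible. The Python sketch below uses invented numbers and a deliberately simple update rule, nothing like a real recommendation system, but it shows how a small early preference, reinforced on every click, comes to dominate what you are shown.

```python
# Toy feedback loop: the system boosts whatever you click, then renormalizes.
# The topics, starting shares, and boost factor are all invented.

topics = {"viewpoint A": 0.5, "viewpoint B": 0.5}  # initial share of your feed

def step(weights, clicked, boost=1.3):
    """After a click, grow the clicked topic's weight and renormalize the shares."""
    weights = dict(weights)
    weights[clicked] *= boost
    total = sum(weights.values())
    return {topic: w / total for topic, w in weights.items()}

weights = topics
for _ in range(10):
    # Suppose you happen to click viewpoint A a little more often at first...
    weights = step(weights, "viewpoint A")

# ...and after ten rounds the feed is dominated by viewpoint A.
print({topic: round(w, 2) for topic, w in weights.items()})
# {'viewpoint A': 0.93, 'viewpoint B': 0.07}
```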


Next: Escape the filter bubble! See the world! Three escapes that don’t require tech savvy.

Further reading:

How supermarkets are totally designed to get you to fill that cart

Deep learning specialist: And the scary thing is, the AI is not especially advanced


Russ White

Russ White has spent the last 30 years designing, building, and breaking computer networks. Across that time he has co-authored 42 software patents, 11 technology books, more than 20 hours of video training, and several Internet standards. He holds CCIE 2635, CCDE 2007:001, the CCAr, an MSIT from Capella University, an MACM from Shepherds Theological Seminary, and is currently working on a PhD in apologetics and culture at Southeastern Baptist Theological Seminary.
