If you plan to spend some holiday downtime catching up on the latest presidential campaign news, consider the following about how political information reaches our (virtual) doorsteps … and what it means for how we decide whom to vote for and how to take a stand on other important topics.

Filter Bubbles Influence Life-Critical Choices

We are blessed today with unfettered access to news provided by citizen journalists, through an almost infinite number of Internet channels. By removing human gatekeepers such as newspaper editors and TV news producers, the Internet has, for the first time, enabled unfiltered access to raw information, straight from the sources. We should be making the most informed decisions in history.

But things haven’t turned out quite that way, for a number of reasons. First, there is no way you can consume and process the unending torrent of news. Second, even if you could consume the stories, it would be impossible to evaluate the reliability of each news source.

So to keep things simple in this age of information overload, we rely on a new generation of gatekeepers. Not accredited journalists or experienced analysts, but algorithms. 

This is an observation put forward by Internet activist Eli Pariser in his 2011 book, "The Filter Bubble: How the New Personalized Web Is Changing What We Read and How We Think." According to Pariser, when we search for information on the Internet, algorithms from Google, Bing, Facebook, Yahoo and others calculate search results according to what they think we most likely want to see. 

By presenting "friendly," personalized information, Google and other search engines lock us into "filter bubbles." We think we are seeing unbiased information. In actuality, we live within a bubble universe of like-minded opinions. In this universe, we rarely encounter alternative viewpoints … leaving us with the impression that our viewpoint is the popular one.

Of course, everyone is different, so these types of search results are personalized for each and every person. There are no right or wrong answers — everyone’s a winner. In a popular TED Talk from 2011, Pariser showed how two similar individuals performing an identical Google search for information on Egypt received wildly different results. One received primarily political results, while the other received information about travel options.
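To make the mechanics concrete, here is a minimal sketch of how a personalization layer might re-rank the very same results differently for two people based on their click history. It is a toy illustration, not any real search engine's algorithm, and every title, topic and history in it is invented.

```python
# Illustrative sketch only: a toy "personalized" ranker, not any real search engine's algorithm.
# It re-orders the same candidate results for each user based on that user's past click topics.
from collections import Counter

# Hypothetical candidate results for the query "Egypt", each tagged with a topic.
RESULTS = [
    {"title": "Protests continue in Cairo",     "topic": "politics"},
    {"title": "Egypt's new cabinet announced",  "topic": "politics"},
    {"title": "Top 10 Nile cruise deals",       "topic": "travel"},
    {"title": "Best time to visit the pyramids","topic": "travel"},
]

def personalized_ranking(results, click_history):
    """Score each result by how often the user clicked its topic before, then sort."""
    topic_counts = Counter(click_history)
    return sorted(results, key=lambda r: topic_counts[r["topic"]], reverse=True)

# Two users issue the identical query but have different histories.
user_a_history = ["politics", "politics", "travel"]  # mostly follows news
user_b_history = ["travel", "travel", "travel"]      # mostly plans vacations

for name, history in [("User A", user_a_history), ("User B", user_b_history)]:
    top = personalized_ranking(RESULTS, history)[0]["title"]
    print(f"{name} sees first: {top}")
# User A sees first: Protests continue in Cairo
# User B sees first: Top 10 Nile cruise deals
```

Same query, two different "front pages" … which is exactly the divergence Pariser demonstrated.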

Well, what’s wrong with that? It sounds very practical … and commercially sound. Why should it worry us that Google determines the information we see?

Is a Filter Bubble a Good Thing?

When it comes to making informed decisions, the fact that we primarily see like-minded views online should be deeply troubling, because it locks us into increasingly polarized camps on just about every topic under the sun. So it’s no surprise that the manipulation of information is a hot topic among information researchers, especially now, before the 2016 elections.

A study from this past summer, published in the Proceedings of the National Academy of Sciences (PNAS), claimed that Google alone could sway a national election simply by manipulating search results. The study concluded with the foreboding statement that “unregulated election-related search rankings could pose a significant threat to the democratic system of government.” The study uncovered four reasons for concern:

  1. Voter preferences affecting search results lead to a bandwagon effect that magnifies even small-scale search manipulations: people clicking on "promoted" content can make that content more popular, which bumps it further up the search rankings (a feedback loop sketched in the code after this list).
  2. Manipulation of search rankings is hard to detect, so people don’t know they are being manipulated.
  3. Candidates unpopular with a search engine provider have no practical way to circumvent search result intervention and no path for recourse with the provider.
  4. People increasingly turn to search engines as an information source, thereby increasing the effects of a search manipulation.
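
The first point describes a feedback loop: clicks raise popularity, popularity raises rank, and a higher rank attracts still more clicks. The toy simulation below (invented numbers, not the PNAS study's model) shows how even a small initial nudge to one of two otherwise identical results can compound over many clicks.

```python
# Toy simulation of the "bandwagon" feedback loop from point 1 above.
# All numbers are invented for illustration; this is not the PNAS study's model.
import random

random.seed(42)

# Two otherwise identical results; result "B" gets a small initial ranking boost.
popularity = {"A": 100, "B": 100}
boost = {"A": 0.0, "B": 0.05}  # a 5 percent nudge, standing in for a subtle manipulation

def click_probability(name):
    # More popular (and boosted) content is more likely to be clicked.
    total = sum(popularity[k] * (1 + boost[k]) for k in popularity)
    return popularity[name] * (1 + boost[name]) / total

# Each click feeds back into popularity, which feeds back into rank.
for _ in range(10_000):
    clicked = "B" if random.random() < click_probability("B") else "A"
    popularity[clicked] += 1

print(popularity)
# The small initial nudge compounds: "B" ends up noticeably ahead of "A".
```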

If this weren’t bad enough, two new studies showed that manipulation through social networks like Facebook may have an even bigger influence than that exerted by search engines. The research found that algorithm-led manipulation can be exacerbated by a second type of information gatekeeper: our friends and family.

One study from the University at Buffalo documented a phenomenon called "selective exposure," through which people share like-minded ideas with friends on social networks, leading to a social filter bubble. The findings make intuitive sense. People gravitate to like-minded people in online social settings, so many of the stories they share are likely to conform to similar worldviews.

A second study, from Indiana University, focused on the use of social platforms as a way to deal with information overload. According to the study, people are unaware that “the discovery of information is being transformed from an individual to a social endeavor.” As in the PNAS study, the Indiana University researchers concluded the danger lies in people's unawareness of the artificial limits being placed on their information access.

From even a cursory review of these findings, it would seem we are in the throes of an information crisis. So it’s not surprising that those with the most to lose are not taking this sitting down.

Some of My Best Friends Are Algorithms

Facebook scientists tried to dispel the Facebook filter bubble idea in a study, "Exposure to Ideologically Diverse News and Opinion on Facebook." The study examined how 10.1 million US Facebook users interacted with socially shared news. The scientists concluded that “individuals’ choices, more than algorithms, limit exposure to attitude-challenging content” and that “social media exposes individuals to at least some ideologically cross-cutting viewpoints,” which is better than “people browsing only ideologically aligned news sources or opting out of hard news altogether.”

The bottom line: if we miss a challenging article, we have ourselves to blame more than our friends. And sharing among friends is better than ignoring opposing publications altogether, because with friends there is at least a chance an article with an opposing viewpoint will reach us.

So, is social media helping or hindering our exposure to a balanced diet of news? Pariser, for one, is not convinced. In his recent article, "Did Facebook’s Big New Study Kill My Filter Bubble Thesis?," Pariser concluded that Facebook’s determination that “‘individual choice’ matters more than algorithms” is an overstatement. He wrote that “certainly, who your friends are matters a lot in social media. But the fact that the algorithm’s narrowing effect is nearly as strong as our own avoidance of views we disagree with suggests that it’s actually a pretty big deal.”

The takeaway? If we ignore news stories with an opposing viewpoint, it’s unlikely our like-minded friends will be sharing those either.

Implications for the Holidays

While the research studies focused on political topics, the implications of information manipulation go far beyond politics. When we need to take a stand on important issues like immigration reform, global warming, energy policy and foreign affairs, most of us are fed news stories by like-minded friends and family. How can we make informed decisions when we see only half the picture? And should we rely on Google and Facebook to determine what’s newsworthy for us?

My proposal: let's spend some of our holiday downtime researching these important topics and developing our own opinions, seeking out reputable news sites that represent views other than our own. We will be wiser for it, and our decisions will be more informed.

And maybe — just maybe — it will make the world a better place. What a wonderful thought for the holidays.

