The Algorithm And Its Echo Chamber: You Don't Know What You Don't Know

Nowadays we use Facebook, Twitter, and other social networks to gather information, consume content, and read the news. We tend to see what we like and like what we see, and the result is a biased social feed: an echo chamber. Most of us have realised that our news feeds are biased, but recent political events such as Brexit and the US election have shown us just how far that bias goes.

Those who use social networks tend to fall into the Relevance Paradox: readers consume only information that is already relevant to them. In many cases, users don't even realise that the information they consume is one-sided or repetitive because of the social circles around them. They don't know that they need to look for fresh information from different perspectives, or in some cases don't know how, because their hyper-connected social circles give them little experience of anything else. The whole phenomenon can be summed up as "you don't know what you don't know," and users end up reading only regurgitated information.

Information gets regurgitated because readers are often unaware of how these algorithms work and how they select the media we consume. As a consequence, we unknowingly accept the echo chamber we're placed in, because we are fed information we like and that agrees with our own opinions. The effect is then reinforced because people in the same social sphere agree with us too. So the echo chamber's cycle persists, giving us a false sense of affirmation that our beliefs are right - also known as confirmation bias.

The 2016 U.S. election coverage on our social network accounts is a perfect example of the relevance paradox, the echo chamber cycle, and confirmation bias at work. The Wall Street Journal recently put together a graphic that depicts how feeds may differ for Facebook users based on their political views: http://graphics.wsj.com/blue-feed-red-feed/

In the graphic, you can see liberal and conservative Facebook feeds side by side and how much they differ. After all, your feed is designed to prioritise content based on what you've liked, clicked, and shared in the past. This means that conservatives see little content from liberal sources and vice versa. And according to research published in the Proceedings of the National Academy of Sciences, this actually makes people more narrow-minded. The researchers found that users not only select and share content that fits a specific narrative, they ignore the rest of what is presented. So in the case of the U.S. election, liberals digest only the "left-wing" political agendas and narratives and ignore the other half of the story presented by conservatives - and vice versa.
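To make that mechanism concrete, here is a minimal sketch in Python of the kind of engagement-weighted ranking described above. The Post and UserHistory structures, the weights, and the field names are all invented for illustration; real feed-ranking systems use far richer signals, but the core idea is the same: content from sources you already engage with floats to the top, and everything else sinks.

    from dataclasses import dataclass, field

    # Hypothetical structures for illustration only; not any platform's real model.

    @dataclass
    class Post:
        source: str   # e.g. "liberal_blog" or "conservative_news" (made-up labels)
        topic: str

    @dataclass
    class UserHistory:
        liked: dict = field(default_factory=dict)    # source -> number of likes
        clicked: dict = field(default_factory=dict)  # source -> number of clicks
        shared: dict = field(default_factory=dict)   # source -> number of shares

    def engagement_score(post: Post, history: UserHistory) -> float:
        """Score a post purely by how often the user engaged with its source before."""
        return (1.0 * history.liked.get(post.source, 0)
                + 0.5 * history.clicked.get(post.source, 0)
                + 2.0 * history.shared.get(post.source, 0))

    def rank_feed(posts: list, history: UserHistory) -> list:
        """Sources the user never engages with score zero and sink to the bottom."""
        return sorted(posts, key=lambda p: engagement_score(p, history), reverse=True)

Under a scheme like this, a user who has only ever liked and shared posts from one side of the spectrum will almost never see the other side near the top of their feed - which is exactly the divide the WSJ graphic illustrates.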

Another aspect of regurgitated information has to do with "trending topics." What counts as trending on a social network is also determined by algorithms that calculate popular topics from national- or world-level engagement. According to Facebook, trending topics are based on a number of "factors including engagement, timeliness, pages you've liked, and your location." Thus what we think of as important news or information has once again been targeted and refined to reflect our specific tastes.
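Read literally, those quoted factors describe a personalised popularity score rather than a single global chart. The sketch below is only an assumption about how such factors might be combined (the function, weights, and decay constant are invented); the point is that even "trending" is filtered through your own likes and location.

    import math
    import time

    def trending_score(total_engagements: int,
                       last_engagement_ts: float,
                       liked_pages_discussing: int,
                       local_engagement_share: float) -> float:
        """Combine the four quoted factors into one hypothetical score."""
        timeliness = math.exp(-(time.time() - last_engagement_ts) / 3600.0)  # decays per hour
        personal = 1.0 + 0.3 * liked_pages_discussing   # boost topics your liked pages discuss
        local = 1.0 + local_engagement_share            # boost topics popular near your location
        return total_engagements * timeliness * personal * local

Two users in different cities, following different pages, would therefore see different "trending" lists built from the same raw engagement data.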

With this in mind, it is worth remembering that in May 2016 Facebook had to deal with accusations that the "trending" section of its news feed took the echo chamber one step further by suppressing news stories of interest to conservative readers. Yet suppressing certain news cycles in order to push a particular agenda would not necessarily cause users to accept a new perspective, according to the Proceedings of the National Academy of Sciences research cited above. The researchers found that "whether a news item, either substantiated or not, is accepted as true by a user may be strongly affected by social norms or by how much it coheres with the user's systems of beliefs."

Either way, algorithms don't seek out opposing views or surface them for readers, because they're not built to do so. Thus we sit inside a filter bubble wrapped around an echo chamber (or should that be an echo chamber wrapped in a filter bubble?). But waiting for the algorithms to change is not the most practical way to break free of our polarised clusters of information. The way to break free is to start understanding how the algorithms work and to manually seek out people with different viewpoints. The ultimate goal is balance; only then can you find new perspectives, different content, and learn what you don't yet know.
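As an illustration of what a balance-seeking feed could look like (a sketch only, not any platform's actual behaviour), a re-ranker might interleave the engagement-ranked feed with a quota of posts from sources the user has never engaged with. This builds on the hypothetical Post, UserHistory, engagement_score, and rank_feed defined in the earlier sketch.

    def rank_with_opposing_views(posts: list, history: UserHistory,
                                 nudge_every: int = 4) -> list:
        """Re-rank the feed, slotting in one never-engaged-with source every few posts."""
        ranked = rank_feed(posts, history)
        familiar = [p for p in ranked if engagement_score(p, history) > 0]
        unfamiliar = [p for p in ranked if engagement_score(p, history) == 0]

        feed = []
        while familiar and unfamiliar:
            feed.extend(familiar[:nudge_every - 1])   # a short run of comfortable content...
            del familiar[:nudge_every - 1]
            feed.append(unfamiliar.pop(0))            # ...then one nudge from outside the bubble
        feed.extend(familiar)                         # whatever is left over goes at the end
        feed.extend(unfamiliar)
        return feed

The design choice here is deliberate gentleness: most of the feed stays familiar, so the nudges broaden perspective without simply replacing one filter with another.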

We should be more selective in the content we consume. Instead of letting the algorithm do the filtering first, we should filter the news, media, and information ourselves, leaving the algorithms to gently nudge us toward new information by suggesting opposing views that broaden our perspective.

The algorithm should be the one to challenge our point of view, not reinforce it.
