You Only See What You Want to See - The Dark Side of Social Media

20/07/2016 13:37 | Updated 20 July 2016

Like millions of other Britons, on 24 June I woke up to news I couldn't understand. Since then, according to my feeds, the world as I knew it has ended and season 7 of Game of Thrones has played out in Westminster, this time with more plot twists but mercifully less nudity.

Brexit. When the result was announced, my digital world echoed my thoughts: a reverberant chamber of fear and outrage. If so many people shared my thoughts, how could this have happened?

The reality is that social media presents us with the opinions and values we want to see. Twitter users actively choose whom to follow. Facebook's algorithm encourages users to remain on the platform for as long as possible by showing them the information they are most likely to be interested in. This is a necessary (and understandable) business model: it exposes users to the advertisements that fund Facebook itself.

In the run-up to the referendum, how could I have missed the voices of those worst hit by austerity, crying out for attention? I was within my own echo chamber. Presumably so were other voters, and even political parties. Political leaders were not only detached from their constituents but also unable to access the private echo chambers of individuals' news feeds to gauge true sentiment.

We were unknowingly complacent. This is the darker side of digital: information walls built around those who don't challenge them.

A recent Reuters Institute study found that around 50% of people use social media as a news source, each presented with news coverage in a tone that matches their own beliefs. That is before we consider friends adding their own reinforcement to a person's views. As Upworthy CEO Eli Pariser, who coined the term, would say, we each live in a very large "filter bubble".

There has been much talk of healing the divides revealed by the referendum. How, then, does social media fit into this when it is designed to reinforce views?

If you're looking for an immediate "quick fix" for broadening your own horizons, you can do two things. Firstly, resist the temptation to unfollow those whose beliefs differ from yours. Secondly, remember that humans are social animals. It's well documented that a person is most open to discussion when they don't feel threatened. Before embarking on a factual debate, look to understand the other person's viewpoint; self-affirmation is a powerful tool.

What about ensuring world views are not narrowed for those not actively seeking varied opinions? Presenting a user with the occasional story that opposes their own view cannot solve the problem. We've all experienced that arguing against someone's viewpoint causes them to defend it even more strongly. This is commonly referred to as the "backfire effect": the mind's automatic defence against a contradictory view.

To avoid this effect, algorithms could use "nudge theory": rather than confronting unaware users with opposing stories, present them with stories whose sentiments sit close to their own but gradually shift, moving views in small, unthreatening steps. But which way would social media providers decide to steer people's values? To be fair, everyone would need nudging, and inevitably a human programming the algorithm would have to decide the end goal of the nudge. This is far from ideal...

Could algorithms instead determine and openly tag the sentiments people are being presented with? The printed press has long been accepted as having particular affiliations to political parties or viewpoints. Would it be socially responsible for social media providers to tag the news they provide in a similar way (e.g. "left", "right", "liberal", "green", ...)?

Ultimately this is a question that will take time to answer. But as businesses with such power to influence society, social media providers must find a way of keeping our world view open. Their desire to be a constant presence in our lives must carry with it the responsibility of ensuring our chambers don't echo.