How Do We Open Up The Social Media Echo Chamber Without Letting The Trolls In?

2016 is turning into one of the most politically explosive years in living memory. It is also the year that, if some pundits are to be believed, we lost our ability to even briefly consider an opinion that clashes with our own. In the wake of Brexit and amid the ruckus of the presidential election, it seems we are cutting ourselves off from dissenting voices online, preferring the company of like-minded people and consuming news from outlets whose sensibilities mirror ours.

It might not be an entirely conscious decision; as in all aspects of modern life, algorithms play a role here. Eli Pariser is CEO of Upworthy, aka the ground zero of clichéd clickbait. He knows a thing or two about the kind of content that does well on social media, and coined the term "filter bubble" to describe the phenomenon of a news feed offering up more and more of the stuff you like.

"What most algorithms are trying to do is increase engagement, to increase the amount of attention you're spending on that platform. So it makes sense that they're sharing articles you think you're going to like, read, and share," he told NPR in a recent interview. "It's helpful, but the danger is that increasingly, you end up not seeing what people who think differently see, and in fact not even knowing that it exists."

Which might go some way towards explaining why so many Remain supporters were blindsided by the result of the EU Referendum in June. If we're surrounded by people who think the same way as us, it's easy to forget that the rest of the world doesn't follow the same pattern. "I don't know a single Trump supporter," says Pariser, "and that's a problem, because even if the current polls are correct, 4 out of 10 people are voting for Trump. So it illustrates to me the importance of finding a way to build media that does bridge some of those divides."

So how can users break out of this bubble? "There's the algorithmic side of this, and then there's the behavioural side," says Pariser. In other words, while a left-leaning Twitter user can choose to follow a conservative blog like Breitbart, that doesn't necessarily mean they are going to read the articles, because it's still not the kind of content they want to engage with.

Rob Owers, Head of News and Government Partnerships at Twitter, believes it is "unhealthy" to limit your online exposure to one narrative. By the same token, users should be encouraged to be promiscuous in their interactions, and to broaden their worldview by engaging with different people on different platforms.

But is this phenomenon even real? "I think our real lives are echo chambers," says Nic Newman, lead author of the Reuters Institute's Digital News Report 2016. "Social media might, in fact, be less of a reinforcing mechanism."

And he's not alone; Professor Jeff Jarvis at the City University of New York's Graduate School of Journalism doesn't buy the argument that we're all falling into silos which entrench our views. "That is a presumption about the platforms -- because we in media think we do this better," he says. "Newspapers, remember, came from the perspective of very few people: one editor, really. Facebook comes with many perspectives and gives many; as Zuckerberg points out, no two people on Earth see the same Facebook."

And as the current fragmented political landscape indicates, no two people hold identical views, either. Far from thinking along traditional "left vs. right" lines, the 'Dead Centre' report from research firm Opinium has actually identified eight distinct "tribes" in the UK. These groups span the entire political spectrum, varying and overlapping on myriad issues; ripe ingredients for debate, surely, not consensus.

If these bubbles of agreement do exist, are they the indirect result of a design flaw in the platforms themselves? Facebook had to hand control of its Trending Topics feature over to algorithms after it was revealed that the human editors' own political leanings were influencing what users saw -- pretty damning, considering Facebook is a primary news source for 6 out of 10 millennials, according to Pew Research. But even algorithms aren't immune to prejudice; they soak up the unconscious biases of everybody involved in the development process.

This has some rather grim implications when you consider the homogeneity of your average team of software engineers -- and the tech world at large. Until recently, Elon Musk followed zero women on Twitter, and industry leaders Tim Cook and Bill Gates weren't doing much better. "These men are shaping the products and services that influence our lives, and they're choosing to build an echo chamber of other men," writes Caroline O'Donoghue at The Pool. Such a lack of diversity means conversations can very quickly become circular, and without fresh voices breaking in with solutions, the same old problems can be bandied about for years. When you exist in a self-perpetuating space, adds O'Donoghue, "you miss opportunities to create things for an audience that isn't you."

So how do we design a more inclusive social media experience? Are more holistically conceived algorithms needed to challenge cloistered thinking and confirmation bias? Does the answer lie in crowdsourced human curation? Perhaps some combination of the two could be implemented to create a tool akin to Twitter's Moments, pulling together a plurality of voices on news stories and encouraging users to think about issues in a different way. Instead of simply dismissing an ill-informed tweet, wouldn't it be nice if there were a way to direct people to the right resource with a single click? Of course, not everybody would take kindly to such a feature.
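
As a thought experiment, here is one possible sketch of what a "more holistically conceived" algorithm might do: take an engagement-ranked feed and reserve a share of slots for posts from sources outside the user's usual diet. The quota scheme and function names below are invented for illustration; no platform is known to work this way.

```python
# A hypothetical counterweight to the filter bubble: re-rank an
# engagement-sorted feed so that every third slot is reserved for
# the best remaining post from a source the user doesn't normally
# follow. Purely a sketch under invented assumptions.

def rerank_with_diversity(ranked, familiar_sources, every=3):
    """Interleave out-of-bubble posts into an engagement-ranked feed."""
    familiar = [p for p in ranked if p["source"] in familiar_sources]
    unfamiliar = [p for p in ranked if p["source"] not in familiar_sources]
    feed = []
    for slot in range(len(ranked)):
        # every `every`-th slot goes to an unfamiliar source, if any remain
        pool = unfamiliar if (slot + 1) % every == 0 and unfamiliar else familiar
        if not pool:                 # one list has run dry; use the other
            pool = familiar or unfamiliar
        feed.append(pool.pop(0))
    return feed

ranked_by_engagement = [
    {"id": 1, "source": "guardian"},
    {"id": 2, "source": "guardian"},
    {"id": 4, "source": "guardian"},
    {"id": 3, "source": "breitbart"},  # dissenting source, ranked last
]
print([p["id"] for p in
       rerank_with_diversity(ranked_by_engagement, {"guardian"})])
# -> [1, 2, 3, 4]: the out-of-bubble post is promoted into slot 3
```

Even in this toy version, the trade-off is visible: every slot handed to an out-of-bubble source is a slot taken from content the engagement model predicts people actually want -- which is exactly why not everybody would take kindly to such a feature.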

There have been some very thoughtful examinations of how our online interactions can serve to further entrench our existing views. There have been less nuanced pieces claiming that younger generations are incapable of critical thinking. But what these commentators often fail to acknowledge is that, in many cases, blocking, muting and un-friending is a necessary act of self-care. Not because the people having these conversations are delicate cry-babies, but because all too often, they are interacting with trolls whose anonymity allows them to harass and threaten, and because lax conduct policies allow even the most civil of conversations to devolve into abuse.

Obviously, however networks choose to approach the echo chamber, be it by creating neutral spaces for debate or by using AI to play matchmaker with political opposites, it would have to be done in a way that filters out abusive language and gives users a considerate environment in which to explore different perspectives; weeding out the trolls will be critical to the outcome. Twitter and Facebook are commercial companies, after all, and it's in their interests to make the user experience safe.

So once the personal insights and experiences that feed into these new algorithms are truly diverse, and Twitter and Facebook have become more democratic in the content they serve up to users, what sort of positive effect might we see in users' lives? It is entirely possible that such a tool would build empathy and understanding, make people less quick to pass judgment, and foster a more respectful approach to discourse.

Or at the very least, people will be less bloody shocked when the election doesn't go their way.

Originally published at Imperica. © 2016 Perini
