Question Your News Sources - It's Never Been More Important

People need to understand the role their likes and shares play in generating the content they see, and remember that, at the root of all this, the platforms they use are private companies who depend on activity to survive.

As the dust settles on last week's US elections, the air on social media has become thick with unanswered questions. How did Trump win? How many of the promises hurled from his podium will he be able or inclined to fulfil? What does all this mean for the UK, aside from the fact that we're probably going to be hearing a lot more from Nigel Farage?

Amongst these discussions, one may prove particularly important, and it's got nothing to do with America's nuclear codes. This is a question of identity - more specifically, that of Facebook, and many platforms like it. If a large proportion of people (say, 62% of American adults) use your software to get their news, are you a news company?

So far, Facebook's response to this has been an emphatic no. The head of their News Feed feature, Adam Mosseri, explained the company's stance in September: "We think of ourselves as a technology company, because the problems that we deal with on a day to day basis are technological problems." In the case of Facebook's 'Trending' section, which appears at the side of your screen with a list of articles you might like to read next, the technological problem is this: how to present content that will best keep people on the site - content they will share and comment on, generating the froth of activity which keeps the platform alive. This doesn't mean, of course, that Facebook will show absolutely anything - posts must live up to the company's community standards, which prohibit 'sensitive content' such as nudity - but for content which passes this bar, clicks are king.

This approach has done no harm to Facebook's bottom line - the company's net income for Q2 2016 was $2bn. Some, however, are beginning to question its effect on public discourse. Not only are you more likely to interact with, and thus be shown, content you agree with; much of that content also bears little resemblance to any objective truth.

One sobering story which emerged during the 2016 presidential campaign concerned a group of teenagers in Macedonia running a network of pro-Trump sites. These churned out extraordinary pieces peppered with emotive language - often entirely false, or purloined from other far-right sites - in an attempt to get them talked about on Facebook. That so much of this content was untrue didn't, of course, bother the Macedonians: they were simply trying to maximise the advertising revenue gained from all those American clicks. Like Facebook, their goal was not to inform, but to engage.

It is tempting to see this as a problem limited to the right, or to social media, but it is neither. The need to compete for attention online has put pressure on news sources of all political persuasions to make their content more impulsively shareable. Traditionally, though, these stories have come from sites with editorial oversight - sites with a readership to maintain and some motivation not to publish outright falsehoods. That respectability is becoming increasingly easy to fake. At first glance, the Macedonian sites cheering on Trump resemble any other blog; when shared, the links users see on social media look identical to those from established outlets.

Facebook's approach to fake news may be about to change - there is talk of a secret task force being formed to address the problem. Even a strong editorial stance from Facebook, however, would miss the core issue here: that people are either increasingly unable to discern truth from hoax or, as seems more likely, simply don't care whether the articles they share are true.

It is impossible to purge the internet of false or misleading material, and we shouldn't expect it to be absent from social media. The challenge here is to equip people with the tools to deal with this material when they inevitably come across it. People need to understand the role their likes and shares play in generating the content they see, and remember that, at the root of all this, the platforms they use are private companies who depend on activity to survive. There are certainly ways in which social media sites can help here - Google's recent addition of fact-check links to certain articles may prove a useful tool - but the responsibility must, in the end, fall on the users themselves.

The problem of fake news is not confined to the far right, and it cannot be solved by Facebook alone. As I've argued before, writing about the response to extremist content online, a solution needs to involve wide swathes of society: the media, the education system and civil society all have a key role to play. It's not going to be easy to get people to question their news sources, but it's never been more important.