Facebook has followed the lead of Google and banned fake news sites from using its advertising network. This comes hot on the heels of public scrutiny surrounding the social network's unwitting proliferation of articles containing false information.
"While implied, we have updated the policy to explicitly clarify that this applies to fake news," a spokesperson told the Wall Street Journal. "We vigorously enforce our policies and take swift action against sites and apps that are found to be in violation. Our team will continue to closely vet all prospective publishers and monitor existing ones to ensure compliance."
Facebook's track record with fake news has faced particular criticism in the aftermath of the US presidential election, with many believing that the spread of inaccurate or wholly fabricated news stories helped to deepen voter biases and even influence or shape their decisions in the booth. "Facebook, by design, by algorithm, and by policy, has created a platform that amplifies misinformation," says Zeynep Tufekci, associate professor at the University of North Carolina.
Zuckerberg himself has called such claims "crazy," but an unnamed source within the company has confirmed that "[Zuckerberg] knows... that fake news ran wild on our platform during the entire campaign season." There are even hints that a secret internal task force is looking into this exact issue.
Over 60 per cent of millennials use Facebook as a primary political news source, according to Pew Research. This essentially makes Facebook the "largest millennial marketplace for news and ideas in the world," the Guardian's Scott Bixby wrote prior to the election.
Facebook is also recognised as the most influential social platform when it comes to consumer decisions, with 47 per cent of Americans saying that "Facebook has the greatest impact on purchase behaviour" out of all networks. And yes, sure, voting in an election isn't exactly like buying a vacuum cleaner. Even so, Zuckerberg's assertion that Facebook played no part whatsoever in informing voter decision-making feels more than a little disingenuous.
But Facebook's problem with how it circulates news stories goes much further back than just the election. The network has had issues with its Trending Topics function; control had to be handed over to artificial intelligence after it became apparent that human editors were letting their own unconscious biases slip into the mix. And the inability of both algorithms and users to differentiate between satirical stories and real news reports resulted in a number of hoax stories going viral, necessitating the creation of a 'satire' warning two years ago.
It's not hard to see how the fake news phenomenon has spread. We all exist in a media ecosystem where outrage and inspiration are equally incentivised (not to mention monetised), and the shortest route to either is often through hastily assembled clickbait. Curiosity is an unavoidable impulse online; human beings are wired to follow a link to a story if they feel they are missing out on information. But whose responsibility is it to ensure that this information is accurate? Can we design our way to the truth?
"A bias towards truth isn't an impossible goal," says former Facebook designer Bobby Goodlatte. "Wikipedia, for instance, still bends towards the truth despite a massive audience." However, he also acknowledges that Facebook's News Feed "optimises for engagement," and that, unfortunately, "bullshit is highly engaging."
Or at least, it was. With fake news sites now cut out of the Audience Network, and Facebook itself under the microscope, there is every chance that consumers will become increasingly discerning in their media diet.
This article originally appeared at Ogilvydo.