Facebook Winning Trump The Election Is 'Crazy', Says Zuckerberg

He didn't mince his words...

After claims that fake news on Facebook, as well as its effect as an ‘echo chamber’, had aided the rise of Donald Trump, founder Mark Zuckerberg has finally spoken out.

The CEO strongly defended the social media network at Techonomy, a technology conference in California, and said that Facebook was in no way responsible for the outcome of the US presidential election.


“The idea that fake news on Facebook influenced the election in any way is a pretty crazy idea,” he said.

Facebook is far and away one of the largest news publishers on the planet, with many news organisations getting much of their traffic through Facebook Instant Articles and through links and videos shared on the site.

Despite this, many fake stories still made huge waves on the site. A story claiming that Pope Francis had endorsed Donald Trump was shared thousands of times despite being completely false.


Donald Trump’s presidential campaign has been a consistent problem area for Facebook, following claims earlier in the year that the site was intentionally favouring liberal stories in its “Trending Stories” section.

The resulting controversy led to Facebook sacking its entire human news curation team and replacing it with an automated algorithm.

Almost immediately after the automated system took over, however, a number of fake stories started appearing in the “Trending Stories” section.

In response, Zuckerberg admitted that the News Feed was a work in progress and that more could be done.

“My goal, and what I care about, is giving people the power to share so we can make the world more open and connected. That requires building a good version of News Feed. We still have work to do on that. We’re going to keep improving it.”

One of the key hurdles Facebook needs to overcome is the basic principle that powers the News Feed: it is designed to show you only the content it thinks you will be interested in seeing.

That means that if you’re only interested in seeing videos about Trump then that’s all it’s going to show you, with the same going for Clinton.

This, combined with the fact that you cement the effect even further by choosing friends who agree with your ideology, makes it unlikely that you will ever see a balanced, contradictory view of the world. This effect has been called the “echo chamber” or “filter bubble”.
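To make the principle concrete, here is a minimal, purely illustrative sketch of interest-based ranking. It is not Facebook’s actual News Feed code: the scoring rule, the story data and the user profile below are all invented for the example, but it shows how always ranking by predicted interest narrows what a user sees.

```python
# Illustrative sketch only: a toy interest-based feed ranker, not
# Facebook's real News Feed. All data and scoring here are hypothetical.

from dataclasses import dataclass

@dataclass
class Story:
    headline: str
    topic: str  # e.g. "trump", "clinton"

@dataclass
class User:
    # Estimated interest in each topic, e.g. learned from past clicks and likes.
    interests: dict

def rank_feed(user: User, stories: list, top_n: int = 3) -> list:
    # Score each story purely by how interested the user already is in its
    # topic, then keep only the highest-scoring items. Topics the user never
    # engages with score zero and effectively disappear from the feed.
    scored = sorted(stories,
                    key=lambda s: user.interests.get(s.topic, 0.0),
                    reverse=True)
    return scored[:top_n]

if __name__ == "__main__":
    user = User(interests={"trump": 0.9, "clinton": 0.1})
    stories = [
        Story("Trump rally draws crowds", "trump"),
        Story("Clinton policy briefing", "clinton"),
        Story("Trump interview transcript", "trump"),
        Story("Clinton town hall recap", "clinton"),
    ]
    for story in rank_feed(user, stories):
        print(story.headline)
    # The output is dominated by Trump stories: the feed keeps reinforcing
    # the user's existing interests, which is the "filter bubble" effect.
```

The same feedback loop plays out at scale: the more you click on one kind of story, the more of it the ranker serves you, and the less often you encounter anything that challenges your view.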