TECH
15/05/2018 17:05 BST

Facebook Has Revealed The Amount Of Content It Takes Down – And The Numbers Are Staggering

Over half a billion fake accounts were removed between January and March 2018.

Facebook removed over half a billion fake accounts during the first quarter of 2018.

The staggering number comes from Facebook’s new enforcement report, which reveals for the first time how much material violating its community standards the company has acted on – including hate speech, fake accounts, nudity and graphic violence.

The figures (which Facebook cautions are based on measurement methods still in development) reveal the sheer scale of the task the social network faces in tackling inappropriate content.

The company says it removed, or added warning labels to, some 3.4 million pieces of violent content during the quarter – around 85% of which it found before being flagged by a member of the public.

Facebook says it removed 2.5 million pieces of hate speech in the first quarter of 2018. But in stark contrast to violent content, where most material was caught by the company’s own software, just 38% of hate speech was flagged by Facebook’s own technology before users reported it.

Guy Rosen, VP of product management at Facebook, explained: “As Mark Zuckerberg said ... we have a lot of work still to do to prevent abuse. It’s partly that technology like artificial intelligence, while promising, is still years away from being effective for most bad content because context is so important.” 

“Artificial intelligence isn’t good enough yet to determine whether someone is pushing hate or describing something that happened to them so they can raise awareness of the issue.”

Here’s a breakdown of all the content that Facebook discovered and removed during Oct-Dec 2017 and Jan-Mar 2018.

Graphic Violence

How much content on Facebook contained graphic violence?

Oct-Dec 2017: For every 10,000 views, 16 to 19 contained graphic violence

Jan-Mar 2018: For every 10,000 views, 22 to 27 contained graphic violence
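These prevalence figures are simple rates. As a quick illustration – a hypothetical helper, not part of Facebook’s own methodology – the "per 10,000 views" ranges can be converted into percentages:

```python
def prevalence_percent(low, high, per=10_000):
    """Convert a 'low-to-high per N views' range into a percentage range."""
    return (low / per * 100, high / per * 100)

# Graphic violence, Jan-Mar 2018: 22 to 27 views out of every 10,000
lo, hi = prevalence_percent(22, 27)
print(f"{lo:.2f}% to {hi:.2f}%")  # 0.22% to 0.27% of views
```

In other words, even the worst-affected category amounts to well under a third of one percent of views – yet at Facebook’s scale that still translates into millions of pieces of content.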

How much content was taken down or had warning labels applied?

Oct-Dec 2017: 1.2 million

Jan-Mar 2018: 3.4 million

How much content was found by Facebook before it was reported?

Oct-Dec 2017: 71%

Jan-Mar 2018: 85.6%

Hate Speech

How much content on Facebook contained hate speech?

Oct-Dec 2017: Data not available

Jan-Mar 2018: Data not available

How much content was taken down?

Oct-Dec 2017: 1.6 million pieces of content

Jan-Mar 2018: 2.5 million pieces of content

How much content was found by Facebook before it was reported?

Oct-Dec 2017: 23%

Jan-Mar 2018: 38%

Terrorist Propaganda (ISIS, al-Qaeda and Affiliates)

How much content on Facebook contained terrorist propaganda?

Oct-Dec 2017: Data not available

Jan-Mar 2018: Data not available

How much content was taken down?

Oct-Dec 2017: 1.1 million pieces of content

Jan-Mar 2018: 1.9 million pieces of content

How much content was found by Facebook before it was reported?

Oct-Dec 2017: 96.9%

Jan-Mar 2018: 99.5%

Fake Accounts

How many fake accounts were discovered?

Oct-Dec 2017: Data not available

Jan-Mar 2018: Data not available

How many fake accounts were taken down?

Oct-Dec 2017: 694 million

Jan-Mar 2018: 583 million

How many fake accounts were found by Facebook before they were reported?

Oct-Dec 2017: 99.1%

Jan-Mar 2018: 98.5%

Adult Nudity and Sexual Activity

How much content on Facebook contained adult nudity or sexual activity?

Oct-Dec 2017: For every 10,000 views, 6 to 8 contained nudity or sexual activity.

Jan-Mar 2018: For every 10,000 views, 7 to 9 contained nudity or sexual activity.

How much content was acted upon?

Oct-Dec 2017: 21 million

Jan-Mar 2018: 21 million

How much content was found by Facebook before it was reported?

Oct-Dec 2017: 99.1%

Jan-Mar 2018: 98.5%

Spam

How much content on Facebook contained spam?

Oct-Dec 2017: Data not available

Jan-Mar 2018: Data not available

How much content was taken down?

Oct-Dec 2017: 727 million pieces of content

Jan-Mar 2018: 836 million pieces of content

How much content was found by Facebook before it was reported?

Oct-Dec 2017: 99.8%

Jan-Mar 2018: 99.7%