In a statement made on Facebook, Zuckerberg said that he had been “reflecting on how we can do better for our community.”
The decision is in direct response to a series of violent deaths that were broadcast in the last fortnight using Facebook Live.
“If we’re going to build a safe community, we need to respond quickly. We’re working to make these videos easier to report so we can take the right action sooner ― whether that’s responding quickly when someone needs help or taking a post down,” he said.
In addition to making it easier to report these videos, Zuckerberg confirmed that over the next year Facebook would hire some 3,000 new staff for its global Community Operations team.
This would be in addition to the 4,500 people who are already tasked with reviewing the millions of requests that are made regarding violent or inappropriate content on Facebook.
On Easter Sunday 74-year-old Robert Godwin was randomly shot dead on a Cleveland street by gunman Steve Stephens, who uploaded the footage before fatally shooting himself. The chilling video was on Facebook for three hours before it was removed.
Days later, a Facebook Live video of a Thai man hanging his 11-month-old daughter before killing himself in Phuket emerged. Two clips of the video were accessible to users on his Facebook profile for about 24 hours and were viewed almost 400,000 times in total.
While Zuckerberg mentioned both incidents during the company’s annual developer conference, it now appears the company has a definitive plan of action moving forward.
“These reviewers will also help us get better at removing things we don’t allow on Facebook, like hate speech and child exploitation,” explains Zuckerberg. “And we’ll keep working with local community groups and law enforcement, who are in the best position to help someone if they need it ― either because they’re about to harm themselves, or because they’re in danger from someone else.”
Responding to Mark Zuckerberg’s post, Labour MP Yvette Cooper said:
“This is a very welcome step from Facebook. They are the first social media company to be open about the number of people they employ in community operations who are tasked with reviewing reported content - something the Home Affairs Select Committee repeatedly called for and recommended in our report last week before Parliament dissolved. Facebook have also announced more much-needed staff to help speed up their processes - again something we called for as a result of our inquiry.
“It is also welcome that Facebook say extra staff will improve their ability to remove hate speech and child exploitation - and that they will be using more of their technological ability to improve safety too. I hope that Twitter and Google will now do the same, starting with the simple first step of being transparent with the public about the resources they put into safety.
“Social media is an immensely important part of all of our lives, and that is why our report was clear that social media companies need to show social responsibility in tackling illegal content, hate crime and abuse online. As a result, I think Mark Zuckerberg’s commitment to do much more in building a safe community online is important and I look forward to seeing the further steps that Facebook plan. I hope all the social media companies and Government will take forward all the recommendations we made.”
Responding to the post, Facebook’s COO reaffirmed Zuckerberg’s message, saying, “Keeping people safe is our top priority. We won’t stop until we get it right.”