24/04/2019 12:09 BST | Updated 24/04/2019 12:23 BST

New Zealand Terror Attack Video May Still Be On Facebook, Director Admits

Social media firms told their platforms are a "cesspit".

The New Zealand terror attack video in which dozens are brutally shot and killed on camera could still be on Facebook, a director of the company has admitted. 

Neil Potts, public policy director, claimed the social media company was doing “an excellent job” of taking material down but “it is possible” versions of the harrowing clip remain online. 

The Christchurch mosque attacks saw 50 people murdered and a further 50 injured. 

The man thought to be behind the attacks, 28-year-old self-proclaimed white supremacist Brenton Tarrant, broadcast the attack live on Facebook. 

Giving evidence to the Commons’ home affairs select committee’s session on hate crime on Wednesday, Potts said the firm was able to remove the video within ten minutes of police flagging it, but 800 variants of the video sprang up in its wake. 

“Do you think there are still versions of that video on your platform now?” Potts was asked by Labour MP Yvette Cooper, who chairs the committee. 

Potts replied it was “hard for me to say” but “possible”. 

He said: “I think we’re doing an excellent job of removing it with machine-learning, which does learn over time, so as we see a video it can teach itself what to look for. 

“Is there a possibility that one video exists that has changed audio, a different filter, a different angle? It’s possible. 

PA Wire/PA Images
Neil Potts, Public Policy Director at Facebook, giving evidence to the Home Affairs Select Committee at the House of Commons, London, on the subject of hate crime and its violent consequences.

“I’m unfamiliar with seeing those copies. I know we are doing a lot of investment there but it is possible.”  

Representatives of YouTube, Facebook and Twitter were questioned during the session about extreme right-wing content that can be found “in seconds”.

Labour MP Stephen Doughty claimed sites were “a cesspit” and systems to monitor offensive content were not working. 

“Your systems are simply not working and, quite frankly, it’s a cesspit,” Doughty said. “It feels like your companies really don’t give a damn.

“You give a lot of words, you give a lot of rhetoric, you don’t actually take action.”

He highlighted neo-Nazi material from well-known outlets that was easily found on all three sites, and said he had managed to find footage of the March terrorist attack in New Zealand in seconds.

The Labour MP told the trio: “You’re all three of you not doing your jobs.”

Potts stressed that the network has 2.7 billion users and said it does “an extremely good job” of taking material down.

Katy Minshall, representing Twitter, said safety is the company’s “top priority” and it will continue to invest in tools that identify users who break rules.

YouTube’s Marco Pancini said the company is working with non-governmental organisations in 27 European countries to improve detection of offensive content.

The committee heard that supporters of the New Zealand attacker had deliberately uploaded altered versions of the footage to avoid detection by the systems designed to identify offensive and illegal content.