Facebook, Twitter and other social media giants have been branded a “disgrace” by MPs for failing to crack down on the distribution of terror recruitment videos and images of child abuse on their networks.
In a report published today, the Commons home affairs committee said the government should consider changing the law to punish the tech firms for failing to remove illegal or “dangerous” content.
Yvette Cooper, the Labour chair of the committee, said the way the internet firms were behaving was “shameful”.
The MPs slammed social media companies’ enforcement of their own community standards as “weak, haphazard and inadequate” when compared to smaller firms.
The committee said during its investigation, which had to be cut short due to the snap general election, it found repeated examples of illegal material not being removed even after it had been reported.
According to the MPs, videos for banned jihadi and neo-Nazi groups remained online even after being reported by the committee.
Anti-Semitic abuse of MPs was also not removed despite being highlighted by MPs in a previous report.
Material encouraging child abuse, as well as sexual images of children, remained accessible even after being reported by journalists.
Twitter and Facebook both said they were committed to tackling the problem.
Commenting on her committee’s report, Cooper said leading social media companies needed to do better.
“These are among the biggest, richest and cleverest companies in the world, and their services have become a crucial part of people’s lives. This isn’t beyond them to solve, yet they are failing to do so,” she said.
“They continue to operate as platforms for hatred and extremism without even taking basic steps to make sure they can quickly stop illegal material, properly enforce their own community standards, or keep people safe.”
But Nick Pickles, Twitter’s UK Head of Public Policy, told HuffPost UK the company’s rules “clearly stipulate that we do not tolerate hateful conduct and abuse” on its platform.
“As well as taking action on accounts when they’re reported to us by users, we’ve significantly expanded the scale of our efforts across a number of key areas,” he said.
“From introducing a range of brand new tools to combat abuse, to expanding and retraining our support teams, we’re moving at pace and tracking our progress in real-time.
“We’re also investing heavily in our technology in order to remove accounts who deliberately misuse our platform for the sole purpose of abusing or harassing others. It’s important to note this is an ongoing process as we listen to the direct feedback of our users and move quickly in the pursuit of our mission to improve Twitter for everyone.”
Simon Milner, Director of Policy at Facebook, said “nothing is more important to us than people’s safety on Facebook”.
“That is why we have quick and easy ways for people to report content, so that we can review, and if necessary remove, it from our platform.
“We agree with the Committee that there is more we can do to disrupt people wanting to spread hate and extremism online. That’s why we are working closely with partners, including experts at Kings College, London, and at the Institute for Strategic Dialogue, to help us improve the effectiveness of our approach.
“We look forward to engaging with the new Government and parliament on these important issues after the election.”
The report has also been seized upon by anti-Brexit campaigners, who said it highlighted an increase in hate crimes since the EU referendum result.
Chuka Umunna, a Labour member of the committee, said “the scourge of hate crime has been worsened and stoked by Brexit”.
“There is real evidence to suggest that the language and tactics of the Leave campaign led to an increase in hate crime incidents in our country,” he said.
“This behaviour is disgusting and un-British. We all need to unite to take it on and make clear that hate crime has no place in modern Britain.
“The Prime Minister and her party – who have adopted the Leave campaign’s rhetoric, sent ‘go home vans’ around areas with high immigrant populations and ran a disgraceful campaign against London’s first Muslim mayor – need to wake up and set a far better example.”
Peter Wanless, the chief executive of the NSPCC, said the report showed the government needed to tackle the problem of child abuse “head on”.
“We have called for the next UK Government to introduce regulation for social networks - just like other media outlets - and to fine them where they fail to protect children,” he said.
“Whichever party is in Government after 8 June, the time has come for them to commit to ensuring the right tools are in place so children are as safe online as they are offline.
“The suffering experienced by children – often with devastating consequences for them and their families – shows that relying on voluntary regulations developed by internet companies is not enough.”