Internet Platforms Like Facebook Should Be Fined If They Don't Remove Extremist Content, Policy Exchange Says

Social media networks accused of saying 'fine words' on issue but not doing enough about it.

Companies like Google, Facebook and Twitter should be fined if they fail to remove extremist and terrorist content, a leading think tank has said, warning that their progress in stopping it has been “glacial”.

Policy Exchange said a regulator should have the power to punish the UK subsidiaries of tech giants that inadvertently host terrorist messages, such as propaganda and instructions on how to carry out attacks, just as Ofcom can fine broadcasters.

Policy Exchange’s report follows the Parsons Green attack last week, when an improvised explosive device was set off on a packed rush hour train. The device was reportedly built with help from online instructions.

Dr Martyn Frampton, the lead author of Policy Exchange’s report, said that when Theresa May meets internet companies as part of her North American tour on Wednesday, she should challenge their “constant repetition” of “very fine words” on the subject without action.

He said they have moved at a “glacial” pace when addressing the problem.

Theresa May is to meet internet companies on Wednesday. Policy Exchange said she should push them to go beyond 'very fine words' when dealing with extremist content (PA Wire/PA Images)

“We’ve heard very fine words from internet companies about their determination to act on this,” Frampton said.

“We really haven’t seen the decisive sea change in terms of implementation. That’s the message [May] should take. We’ve heard fine words before. We need to see meaningful consistent delivery.”

The report mentions how, in December 2016, Twitter, Facebook, Google and Microsoft announced they would co-ordinate to build a shared database that would help them identify and remove extremist content more quickly.

But it wasn’t until six months later that the same four companies announced a “Global Internet Forum” to combat terrorism, the report notes.

It adds: “It is salutary to note that it had taken six months to reach even this point – of agreeing to talk together about the issue – and this, against a backdrop of fairly concerted governmental pressure and public concern.”

Frampton told HuffPost that a Policy Exchange survey, conducted to coincide with the report, showed the public had a “crisis in confidence” in digital platforms’ handling of extremist content.

Policy Exchange found that:

  • 72 percent believe it is up to internet companies to remove extremist content, compared with 53 percent who believe it is the Government’s responsibility

  • 75 percent support the creation of an independent regulator to monitor online content

  • 74 percent support a law criminalising the persistent viewing of extremist material online, while 73 percent back criminalising the possession and consumption of extremist material

  • 66 percent believe the internet should be a “regulated space” where extreme material should be controlled

“Until you bring [internet companies] to a point where they see it as absolutely vital to their interests to change their behaviour, then they’re not going to,” Frampton said.

The Policy Exchange report says that Islamic State’s propaganda output on social media has remained strong, despite the terror group losing territory and soldiers in battle.

Frampton added: “Although Isis is losing on the ground, their virtual output has been consistent throughout the last three years.

“If we neglect the online networks that jihadists use to disseminate their poisonous ideology, movements like ISIS, and the horrors they inspire, will endure even after their physical infrastructure is removed.

“It is the online networks that give them the ability to reach into our society and target, in particular, the most vulnerable sections of our society.”

The report suggests any regulator fining internet companies should have the same discretion as Ofcom fining broadcasters.

Ofcom has issued fines ranging from £65,000 to £42 million, and the German Government has agreed proposals that could see social media companies fined up to €50 million for failing to delete “obviously criminal content” within 24 hours, the report notes.

It cites “as a basic guideline” the Private Member’s Bill by Labour MP Anna Turley, which proposes that Ofcom regulate social media companies and fine them up to £2 million or five percent of their global turnover.

David Petraeus, the former US commander in Iraq and Afghanistan who wrote the foreword to the Policy Exchange report, said: “The attempted bombing of an underground train in London last Friday – using a device that can be built from instructions available online – merely underscored once again the ever-present nature of the threat.

“Jihadists have shown particular facility in exploiting ungoverned or even inadequately governed spaces in the Islamic world... They are also exploiting the vast, largely ungoverned spaces in cyberspace, demonstrating increasing technical expertise, sophistication in media production, and agility in the face of various efforts to limit its access.

“It is clear that our counter-extremism efforts and other initiatives to combat extremism online have, until now, been inadequate. There is no doubting the urgency of this matter.”
