This week, the Prime Minister's Tackling Extremism in the UK task force reported its findings and recommendations. Set up in the aftermath of the killing of Lee Rigby in London in May 2013, the task force points to the need to tackle extremist propaganda online.
Although the report mentions the need to build the capabilities of communities and civil society organisations to provide credible alternatives to extremism online, the recommendations still focus heavily on censorship and 'take downs' of extremist websites. While it is symbolically important for governments to be seen to be taking a hard line, it is inconceivable that the problem can be solved through a focus on repression. There are two major limitations to this approach.
First, much extremist content is legal; extremist networks know the law and how to stay on the right side of it. They are also getting smarter about working around the terms of service of Internet service providers and social media platforms. For example, videos of the late Anwar al-Awlaki - the so-called 'jihadist rock star' - continue to inspire new generations of terrorists on YouTube because they fall short of violating laws or codes and are therefore not eligible for removal.
Second, there is a major challenge of scale. Every day, over 140,000 hours of content are uploaded to YouTube, over 720,000 websites are created and five billion new items are posted to Facebook. Even assuming that only a tiny fraction of this content is 'extremist', the government would need an impossibly large Internet police force. Since 2010, the UK government has managed to take down only 4,000 URLs.
Instead, approaches to tackling online extremism need to work with the logic of the Internet, not against it. They need to be based on large-scale content creation to drown out extremist messaging and digital disruption tactics to make it more difficult for extremists to use the Internet to communicate, find one another and share information.
A new report to be launched next week by the Institute for Strategic Dialogue argues that the UK government should recognise the limitations of a censorship approach to tackling extremism online. It should instead focus on up-skilling messengers who can speak to those at risk of radicalisation, giving them the expertise and tools to push back on extremist propaganda effectively and at scale. These must be messengers with legitimacy and authenticity among their target audience.
If enough material were generated, positioned cleverly and marketed to the right audiences, it could serve to 'drown out' extremist voices - there are more of 'us' than 'them'. There are also smart technological methods that can be deployed to disrupt extremists' ability to operate and organise online: for example, hijacking extremists' Twitter hashtags, using search engine optimisation methods, de-prioritising extremist content and comments, or flooding the Internet with altered versions of terrorist manuals to ensure would-be terrorists are never sure whether they have the genuine article.
The government should fund a large-scale national programme of training and development for tens of thousands of the most credible messengers so they have everything from basic knowledge of how to create Facebook groups and run a Twitter account to more sophisticated expertise relating to storytelling, crafting compelling messages, film making, production, marketing, and dissemination.
One of the most credible groups among young people at risk of radicalisation is former violent extremists - disengaged neo-Nazis and jihadis - who use their experiences to counter the appeal of extremist causes and the use of violence to pursue them. Governments are often nervous or reluctant to work with these individuals or provide them with funding, especially formers who have renounced violence but continue to hold what could be deemed 'extreme' views. This is a mistake and weakens our response.
The UK government should also pay for existing counter-narrative films and texts to be translated into multiple languages, avoiding duplication and scaling up the material that is already available.
Finally, we do not yet understand what works in countering extremist messages: what kinds of messages and content are effective, delivered by whom, how and when. The UK government should therefore establish a working group involving private sector tech companies, community activists and researchers to develop an analytical framework for measuring the impact and reach of this work, ensuring future efforts are fine-tuned and lessons are learned.
While ISD welcomes the government's review of its approach to extremism and acknowledges that it contains many good ideas, its approach to extremist propaganda online is woefully lacking. Violent extremists are so far winning the war of ideas and content online. We have scale on our side, so all is not lost - but the window of opportunity is closing, fast.
- Rachel Briggs, Director of Research and Policy at the Institute for Strategic Dialogue.
- Sebastien Feve, Programme Associate at the Institute for Strategic Dialogue.