Kicking Extremists Off Social Media Helps Fight Hate, Report Finds

Hope Not Hate tells HuffPost UK that deplatforming far-right figures including 'Tommy Robinson' can stop them organising, fundraising and directing hate.

Banning extremists such as ‘Tommy Robinson’ from social media works and firms should do more to stop them organising, raising money and spreading and directing hate, a report seen by HuffPost UK has said.

Debate has raged about whether extremists should be deplatformed if they are not breaking the law, and figures including Robinson, real name Stephen Yaxley-Lennon, have sought to exploit the situation by presenting themselves as defenders of free speech.

But Hope Not Hate said deplatforming Yaxley-Lennon had worked by “severely” reducing the number of his followers, making it harder for him to organise demonstrations and direct his supporters against minorities.

In one such incident, the former English Defence League leader claimed a Syrian boy who was bullied at school had attacked a white girl from the same school and posted accusatory videos on Facebook.

But in the last two years Yaxley-Lennon has been kicked off Twitter and Facebook, as well as having restrictions placed on his YouTube channel, “which resulted in his views collapsing”.

He has been forced to communicate with supporters through the more marginal encrypted messaging app Telegram, where he has just 42,000 followers compared to more than a million before his bans, Hope Not Hate’s State Of Hate 2020 report found.

Yaxley-Lennon attracted more than 10,000 supporters to “Free Tommy” demonstrations against his incarceration for contempt of court in London in 2018, one of the largest far-right protests in the UK in recent years.

But similar demonstrations held after his bans attracted only a few hundred people last year.

A supporter holds a banner during a rally outside the BBC in August 2019 to demand the freedom of Stephen Yaxley-Lennon
SIPA USA/PA Images

“The reasons for this are by no means monocausal, but he and his associates’ inability to spread the word about events and animate the masses beyond core supporters has clearly played a role,” the report said.

“The last decade has seen far-right extremists attract audiences unthinkable for most of the postwar period, and the damage has been seen on our streets, in the polls, and in the rising death toll from far-right terrorists. 

“Deplatforming is not straightforward, but it limits the reach of online hate, and social media companies have to do more and do more now.”

Patrik Hermansson, who researches the far right for Hope Not Hate, said the Yaxley-Lennon case showed why it was important to make the case for deplatforming, which is opposed by some on both the left and right.

Social media is an “essential tool” for the far right to “change public opinion and get people on their side”, to organise demonstrations and to raise funding, he told HuffPost UK.

“Also it’s a tool for them to use in their actions - they use social media to spread hate and target journalists and minorities and their political opponents.

“The abuse that happens on social media is quite significant.”

Hermansson added: “(Yaxley-Lennon) was very efficient at organising his social media following, he profited from it quite extensively, and he also used it in extremely negative ways by directing his followers against minority communities.”

The charity also pointed to research which showed that the removal of Britain First from Facebook, where it had 1.8 million followers and two million likes, had successfully disrupted the group.

The extremist group has turned to the small and marginal Gab platform, where it has just 11,000 followers, and to Telegram, where it has 8,000.

“This has undoubtedly been a key factor in the decline of Britain First as a dangerous force in the UK,” the report said.

But it noted that while deplatforming has been “highly effective in inhibiting the power and growth of the far right, there has to remain a balance with freedom of speech to prevent a negative reaction from the British public”.

The report also warned that deplatformed groups and individuals often regroup on platforms “with a more laissez faire attitude towards extremism”.

It called on Telegram in particular to take urgent action to take down extreme far-right channels dubbed “Terrorgram”, where illegal content is shared on an “almost hourly basis”.