2015: When the State Met the Social Media Company

Illegal guns and child pornography are bought and sold. Terror groups are using Facebook to radicalise young people in their bedrooms. Islamic State propaganda is splattered across the internet in greater quantities and in plainer sight than ever. Why isn't Twitter capable of getting rid of this stuff? Social media companies should be doing more!

2015 is the year the state met the social media companies.

As winter turned to spring, sharply dressed politicians and special advisers met with denim-clad policy brains in the Narnian offices of Californian internet companies. 'Digital democracy', they said, and everybody nodded. Conferences were held in Whitehall and in the White House; cards were exchanged. But winter was coming.

Over the past few months, there's been trouble in paradise. The technology companies, so the story goes, have let their guard down. They have become unwitting squires to those who would at best defy the state, and at worst commit the gravest crimes imaginable. There is no Silicon Valley in Raqqa: the tools used by Islamic State and other radical groups are being manufactured and distributed here at home.

Apps like Telegram and ChatSecure, available from the app store, have come under fire for letting terrorists 'go dark'. "How on earth could the Paris attacks have been coordinated without the use of encrypted communications?", came the cry. But we have since heard that the attackers communicated, at least in part, using unencrypted, thirty-year-old technology: the simple text message. After the massacre in San Bernardino, CBS News led on the attackers' phones boasting 'built-in encryption'. But these days every phone has 'built-in encryption', just like every phone has buttons and a screen.

When the Bangladeshi government tried to censor the net, hundreds of thousands downloaded the Tor browser to circumvent the block. When the Brazilian state came for WhatsApp, a mobile messaging app, a million people joined Telegram in 48 hours.

Some have attacked the technology companies with more misguided frenzy than others. Hillary Clinton has called for better 'cooperation' between social media companies and law enforcement. "You are going to hear all the usual complaints," she said. "Freedom of speech, et cetera", which is reassuring. Eric Schmidt, executive chairman at Google, called for tech companies to step up technological development in countering hate speech and radical content. Donald Trump is going to have a chat with Bill Gates and ask him to turn the internet off.

The two big topics are encryption and administration. In my work at CASM, we dug up evidence of IS using encrypted technology to stay safe online. This is nothing new, of course, but recently it has caused politicians, security chiefs and media organisations to lash out at those providing this technology.

The solutions proposed have simply not been feasible. We have heard talk of governments being given a magic key that grants access to encrypted communications. But this just isn't possible. Mathematically, strong encryption doesn't allow for a backdoor - a way to decrypt messages - that only governments can use. Any backdoor would be available to cybercriminals, hackers, the Red Terror and so on, and the technology would rightly be rejected by anyone protective of their bank details and naked selfies.
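A toy illustration makes the point (this is a one-time-pad-style XOR cipher sketched in Python for demonstration only, not a real protocol, and the 'escrow' arrangement is hypothetical): a backdoor is just another copy of the key, and whoever obtains that copy reads every message as easily as the intended recipient.

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Toy XOR 'encryption': the same operation both encrypts and decrypts."""
    return bytes(b ^ k for b, k in zip(data, key))

message = b"meet at noon"
key = secrets.token_bytes(len(message))  # secret shared by sender and recipient
escrow_copy = key                        # the 'backdoor': a copy held by a third party

ciphertext = xor_cipher(message, key)

# The intended recipient decrypts...
assert xor_cipher(ciphertext, key) == message
# ...but so does anyone who gets hold of the escrowed copy - a hacker,
# a criminal, a hostile state - because the maths cannot tell them apart.
assert xor_cipher(ciphertext, escrow_copy) == message
```

The maths is indifferent to who holds the key, which is why an escrowed key protects no one once it leaks.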

The mooted alternative was to try to clamp down on encryption, or even to ban it outright. This was attempted thirty years ago in the US, and is even more impossible now than it was then (how do you roll back maths?). It also ignores the fact that so many good people use strong encryption every day: gay rights activists in Uganda, democracy advocates in Turkey, journalists trying to protect vulnerable sources, whistleblowers and so on all rely on this technology.

Administration - keeping your digital house free from terrorist content - is the second big debate. The demands are familiar: extremist content should be removed immediately, and terrorist communications should be forwarded to the security services, as was called for in the wake of Lee Rigby's murder.

There is a debate about whether we want a Californian social media company as our front line in deciding who is a threat to UK national security. There is a debate about whether erasing this content - along with any public record of it, and any opportunity to engage with and counter it - is really a wise course of action. But leaving all that aside, there is a basic problem of data volume.

Charlie Winter estimates that there are over thirty daily online releases from 'official' IS channels alone, not to mention thousands of pieces of content from supporters around the world. These are then shared by thousands of others, creating hundreds of thousands of pieces of extremist content sitting amongst billions of photos of cats and so on. A large part of social media administration is manual: somebody reports a piece of content, and it appears on the screen of some poor guy (who probably doesn't speak Arabic) in an outsourcing centre in the Philippines. And as soon as a piece of content is removed, it is re-uploaded, either to the same platform or elsewhere on the internet, with links shared back through Facebook or Twitter. It is the world's biggest and most difficult game of whack-a-mole.

The alternative is Eric Schmidt's 'spellchecker': an automated system of content removal. But this 'magic algorithm' for finding terrorist content just doesn't exist, and probably isn't technologically possible at this stage. Language is hard to categorise, and it changes all the time. Around 30 billion messages are sent every day on Facebook alone. Spotting the one message containing a real threat is all but impossible. It isn't even a needle-in-a-haystack situation - computers are really good at spotting needles in haystacks, because a needle is made of metal and looks different. This is like finding a specific piece of straw in a haystack.
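Rough base-rate arithmetic shows the scale of the problem. Taking the 30 billion daily messages mentioned above, and assuming (the other two figures here are illustrative, not sourced) a hundred genuinely threatening messages and a filter that is wrong only 0.1% of the time, the real threats drown in false alarms:

```python
messages_per_day = 30_000_000_000  # daily Facebook messages, figure cited above
true_threats = 100                 # hypothetical count of genuine threats
false_positive_rate = 0.001        # an optimistically accurate filter: 99.9% specificity

# Innocent messages wrongly flagged each day
false_alarms = (messages_per_day - true_threats) * false_positive_rate
print(f"{false_alarms:,.0f} innocent messages flagged per day")  # ~30 million

# How many false alarms a human moderator wades through per real threat
ratio = false_alarms / true_threats
print(f"{ratio:,.0f} false alarms for every real threat")
```

Even under these generous assumptions, the queue of flagged content would be millions deep every day, which is why 'just run an algorithm' is not an answer.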

The ethical implications are equally dark. The internet, love it or hate it, is a place where anyone can say anything. The idea that certain phrases or words or points of view could be automatically picked up by a computer program and erased is nothing short of terrifying.

What 2016 holds for the turbulent relationship between states and technology is difficult to say. States like Bangladesh, Turkey and Egypt ran out of patience a long time ago. Legal frameworks are being drafted in the US and the UK to bring surveillance powers 'up to date'. But the bottom line is that technology moves fast, and states move slowly, and the responsibilities of those who bring us technological change will be central to discussions in 2016 and beyond.
