To Hold Social Media Responsible for the Death of Fusilier Rigby Is Grossly Unfair

This week the ISC (Intelligence and Security Committee) reported that whilst the security services had committed a number of errors, it was a social media company that could have taken action to prevent the killing of Fusilier Lee Rigby. That company has been widely reported to be Facebook. None of the major social media companies have commented in detail, but I doubt any of them are happy with the headlines the report has generated. The question is whether they understand the root cause of the criticism and whether they acknowledge any justification for it.

Some have argued strongly that criticism of Facebook specifically, or of social media in general, is hardly fair: they are not tasked with our safety.

I agree: to hold social media responsible for the death of Fusilier Rigby is grossly unfair. The guilt and responsibility for that lie with the terrorists who committed the crime. The security services - as the name implies - have the job of keeping us secure. But there is a public consensus that anyone in a position to prevent that terrible, vicious murder should have done so.

The question is therefore not whether it is fair or unfair but how social media companies should respond. Whilst it may not be fair, it is perhaps appropriate that they should be held to account. For too long the social media giants - Facebook, Twitter, Google - have held themselves aloof from any debate about internet ethics, indeed even from some of our laws. It is time for them to come to the table and participate in a public debate about what they can and cannot, and more importantly should and should not, do.

This has been a long time coming. I spent twenty years in the IT industry, designing, building and finally regulating communications networks, including the internet. I spent many hours debating whether network providers, ISPs or the applications that run on them had any responsibility, legal or moral, for what they carried.

The response was almost always a very clear negative. Networks could take responsibility for how fast a connection was made, how secure it was or where it was available, but nothing more - the content was the responsibility of whoever created it. Networks were just dumb pipes. Pipes have no ethics.

I can still see the beauty of clear dividing lines between the network and the content. But I don't think that division works any more. The nature of networks like Facebook and Twitter is that their users produce the content. Most of us remain within the law, but some do not. So who polices the internet?

For many years the industry argued that no one policed the internet; indeed, some were adamant that that was the whole point. It was designed to be a 'free for all', the Wild West.

But that has changed. Internet companies work with excellent third sector organisations like the Internet Watch Foundation, and with police forces around the world, to identify and report child pornography. Now most would agree that, when it comes to child porn at least, the police do police the internet, with the help of socially responsible corporate citizens.

And that seems reasonable. The internet is a public space and the police cannot be everywhere. If someone is aware of child abuse then of course they should report it. Some argue that terrorism is different: we all agree on what child abuse is, but one person's terrorist is another's freedom fighter.

No one is making that argument in this case.

Social media companies already monitor what I do in order to advertise things to me. Their technical ability to do this cannot be doubted. The question is whether they should. That is an ethical question. And as a country, and as an international community, we should be having a grown-up debate about it, with the social media giants taking part.

As Ed Miliband said to the Prime Minister in Parliament this week, part of the problem is the existence of different company practices and the absence of agreed procedures. In cases of child abuse images, a procedure is in place for companies to take action and refer abuse to the authorities; when it comes to terrorism, there should be much stronger procedures and obligations on companies as well.

On Tuesday I launched the Independent Review of Digital Government, announcing that a Labour government would undertake a review to establish a coherent and ethical approach to the use of data within government.

The private sector needs to address the issue too.

In September Facebook wrote to me to point out that more people were on Facebook than had voted in the last General Election, and that it should therefore be the primary platform for political debate.

But as well as not necessarily wanting my hard-pressed constituents to be advertised at when they come to hold me to account - I wouldn't let a private company sponsor one of my surgeries - I believe any platform for political debate should have a sense of accountability to the UK: not so much to its government as to its laws and its people. That's what I want from my social media.
