In many ways David Cameron has found the perfect enemy in porn. It's widely perceived to be a depraved, misogynistic and abusive industry at the best of times, so it's no wonder that he has the support of the NSPCC, the traditional Tory press and vast swathes of the country. Surely nobody of reasonable mind could possibly object to the principle of protecting children from such a horrible spectre?
The problem is that even if we admire Cameron's determination and principles, there are still a number of very good reasons to object to a crusade that is likely to go about as well as the 'war on drugs' and the 'war on terror'.
In a far-reaching and emotive tour-de-force of a speech, the Prime Minister urged us to consider pornography's 'poisonous' and 'corrosive' effect, and to think about the children who might stumble upon it by accident. All of these are serious issues, and the proposed solution was a suitably radical one that relies upon Google blocking search terms and adults having to opt in before they can view certain websites. In addition, there were pledges to look at blocking websites that promote self-harm and to criminalise videos that simulate rape and sexual assault.
Making it harder for children to access the most abusive, misogynistic and violent pornography is undoubtedly a good thing, but there are obvious issues with the idea of an opt-out filter.
First of all, what would it cover? Many of those who believe that it's the role of government to protect children from potentially corrupting websites will argue that the proposal doesn't go far enough. The everyday sexualisation and objectification of girls and women is manifested in plenty of places beyond the internet. What about imagery like Page Three, which is freely available to people of all ages and normalises casual sexism? Where would the line be drawn, and, more importantly, why?
Things become even trickier when considering websites that promote self-harm. Such sites exist, but they tend to be niche and hard to find. Would it even be possible to block them, and their associated search terms, without also blocking websites that offer help and advice to people in need?
Then we come to simulated rape. I find the idea of people watching it utterly objectionable, but the fact that I find something objectionable doesn't mean I think it should be banned and criminalised. The proposal means that consenting adults could be charged for watching an act portrayed by other consenting adults, and that doesn't sit well with me. The evidence that those who view violent images are more likely to commit violent acts is very much contested, and certainly not strong enough to be the basis for laws that stray into illiberal territory.
Surely a better idea would be to put a portion of the money and resources funding the scheme into promoting the many existing tools that allow parents to set their own filters? This would remove politicians from the equation and empower families to make their own choices.
Furthermore, if one of the main goals of the campaign is to limit the ability of abusers to exchange indecent images then I don't see how making it harder for consenting adults to access perfectly legal websites is any kind of solution.
Most of the abusers who exchange images will presumably opt in to adult material anyway, and those who don't are likely to be using peer-to-peer file sharing, secure networks and far more sophisticated tools than a simple Google search.
And then there are the numerous logistical questions.
Would there be a list of banned websites? Presumably this would need to be published and updated regularly. Who would police this list? What would happen to the data gathered from the opt-in system? And what safeguards would be put in place to stop governments from abusing the same facility in the future?
How would the block avoid catching search terms that can be used in an innocent context? In writing this article I searched for 'David Cameron porn' on Google; would that be an acceptable search term? We already know the search terms that Stuart Hazell used when accessing pornography before he murdered 12-year-old Tia Sharp; they may have been sick and distasteful, but none of them described anything illegal.
One of the most effective steps would be for the government to put more money and resources into tracking and identifying abusers and making sure they're brought to justice. This should be paired with a stronger focus on sexual abuse, consent, relationships and pornography in schools. However, in an age of austerity, cuts and 24-hour media, it's easy to see why the government is proposing to put that onus, and its associated cost, onto the search engines rather than the public purse.
My other objection to Cameron's approach is that it trivialises and oversimplifies the issues of sexual exploitation and consent. The rhetoric and imagery of the government's campaign have conflated child abuse and pornography, and this is reflected in much of the coverage, which has used the two terms interchangeably. The reality is that the two are very different and shouldn't be lumped into the same strategy.
With that in mind, it becomes harder to view this as a serious campaign rather than a sensationalised one that is as focused on political positioning as it is on getting the right things done. Furthermore, for a Prime Minister to imply to bereaved parents that a search engine could have been partly responsible for the death of their child makes me question both his sincerity on the issue and whether he has any idea of how the internet works.