Microsoft Attacks Online Child Pornography

They say when you walk along the streets of London you are never more than six feet away from a rat. Looking at some of the headlines carried by newspapers in the early days of the internet you could be forgiven for thinking every user was never more than two mouse clicks away from child pornography. In the public's consciousness the propagation of child pornography became one of the signature crimes of cyberspace.

The fight against online child pornography has been relentless ever since. Considerable resources are devoted to it in a still growing number of countries. It's not just the sense of outrage and obligation to the children depicted in the images that spurs police officers to rip into this problem with uncommon zeal. Historically, there has also been a belief that if they don't get rid of those pictures, new people will find them and become a fresh threat to more children as yet unharmed.

Typically once child pornography is located on the internet a notice is issued to the electronic service provider concerned, suggesting they take it down. Elaborate procedures have been developed to facilitate the rapid exchange of these notices. As long as the offending material is removed promptly the hosting company attracts no criminal or civil liability. In the UK an illegal image of this kind is normally gone within 60 minutes.

It was therefore a bit of a shock to attend a meeting in Brussels last week and hear the FBI and the US Department of Justice announce that, as far as they were concerned, notice and take down now had almost "zero value" as an aid to law enforcement.

One of the original justifications for notice and take down of material found on websites was that it disrupted the operation of the criminal enterprises that often lay behind them. However, the highly organised, technically literate gangs of paedophiles and large-scale distributors are no longer working that way. They have by and large deserted publicly accessible places, burrowing deep into file-sharing environments, peer-to-peer networks and closed groups of one kind or another. Police work in this field today is principally covert and intelligence-led.

A senior FBI agent described notice and take down as "whack-a-mole". A notice is issued and the hosting company removes the image; then, sometimes within minutes, the same picture pops up somewhere else. And so on, ad infinitum.

Obviously the US Government was not saying they were indifferent to the images remaining on public view. Getting them off any and all parts of the internet remains an important goal of policy for everyone, particularly the child protection community. The Feds were simply pointing out that the amount of time and money devoted to notice and take down was disproportionate to the benefits obtained in terms of reducing the total volume of illegal activity or helping to secure convictions. We need to find new and better ways to tackle the problem. We cannot simply arrest our way out of it.

Step forward Microsoft. At the same meeting they described new software they had developed. It's called PhotoDNA. Microsoft will give it away free to appropriate companies.

Every image stored on a computer is, by definition, digital: reduced to its essence, the picture is simply an assembly of 1s and 0s. Each assembly is unique and can be expressed as something called a "hash value", and various programmes exist to compute and compare these values. The trouble is that if anyone does anything as elementary or obvious as editing the picture, even by the tiniest fraction, it becomes a completely new file with a totally different hash. The system is flummoxed, the picture goes undetected, and the bad guy walks off into the sunset.
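To see why, here is a minimal sketch in Python using the standard library's hashing module (the byte string below merely stands in for a real image file):

```python
import hashlib

# Stand-in bytes for an image file (illustrative data, not a real picture).
original = bytes(range(256)) * 100

# "Edit" the picture by the tiniest fraction: flip a single bit of one byte.
edited = bytearray(original)
edited[0] ^= 1

# The two digests have nothing recognisable in common, so a lookup against
# a database of known hash values will simply miss the edited copy.
print(hashlib.sha256(original).hexdigest())
print(hashlib.sha256(bytes(edited)).hexdigest())
```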

By contrast, PhotoDNA looks at what makes the picture what it is, i.e. the actual content, not simply its digital representation. Thus, even if the picture changes format, shape, size or colour, within certain generous tolerances it will still be picked up. Using the parameters Microsoft recommends, the chances of the software making a mistake are reportedly around 1 in 10 billion.
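PhotoDNA's own algorithm is proprietary, but the family of techniques it belongs to, perceptual hashing, can be illustrated with a much simpler "average hash". The sketch below, in Python with the Pillow imaging library, shows the general idea only, not Microsoft's method; the file name and tolerance are invented:

```python
from PIL import Image  # Pillow imaging library

def average_hash(path: str, size: int = 8) -> int:
    """Toy perceptual hash: shrink the picture to an 8x8 greyscale grid,
    then record which cells are brighter than the average. Re-encoding,
    resizing or recolouring the image barely disturbs this pattern."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (p > mean)
    return bits

def distance(a: int, b: int) -> int:
    """Count of differing bits: small means 'almost certainly the same
    picture', large means unrelated images."""
    return bin(a ^ b).count("1")

# Hypothetical usage: match within a tolerance rather than demanding
# exact equality, so edited copies are still recognised.
# if distance(average_hash("copy.jpg"), known_hash) <= 5: ...
```

Real systems are far more sophisticated, tuning the signature and the matching threshold to keep false positives vanishingly rare.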

The vast majority of illegal images on the internet are constantly being recycled. The child in an image may already have been identified and rescued, and the abuser jailed. Alternatively, in some instances the children shown will now be in their 30s, 40s or even older. They deserve attention, but a different kind of attention.

If all these known pictures could be taken out of the equation, not only would that be a service to the children in them, but any new image uncovered would be likely to depict a recently committed crime, suggesting there is a child currently in danger. Police resources could then be focused more intensively on finding him or her, and the perpetrator. This is exactly where PhotoDNA could pay huge dividends: it stops historic material getting back on to the internet, and if it is already there it will root it out of hidden nooks and crannies.
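As a hypothetical sketch of that triage logic, building on the average_hash and distance helpers above (the database and tolerance are invented for illustration):

```python
# Hashes of already-identified images. In practice this would be a large
# curated database, such as the one mentioned below; here it is empty.
KNOWN_HASHES: set[int] = set()

def triage(upload_path: str, tolerance: int = 5) -> str:
    h = average_hash(upload_path)
    if any(distance(h, k) <= tolerance for k in KNOWN_HASHES):
        # Historic material: remove it, but no fresh investigation needed.
        return "known image"
    # No match against the archive: possibly evidence of a recent crime.
    return "possibly new image - escalate for urgent review"
```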

All that said, it seems likely there remains a substantial and important future for notice and take down. For one thing there is still a significant amateur trade through public places. These sites have not disappeared completely, more's the pity.

Moreover, it may be several years before we see how well PhotoDNA works at scale around the world. Using a database of illegal images supplied by the National Center for Missing and Exploited Children, Facebook is currently trialling it in an environment in which between 200 and 300 million new pictures go up every day. The early signs are good, and Microsoft is looking for partners in Europe to extend the trial.

Any images found in a post-PhotoDNA world must be new. We don't yet know how long it will take to get them into a PhotoDNA-readable database and efficiently distributed to all the right places. In the meantime, however, the child or children in the images can be reassured that, because of notice and take down, fewer people will be able to witness their humiliation: representations of it will be very quickly deleted at source or blocked.

PhotoDNA in the wrong hands might be a worry. It could be used to prevent publication of a cartoon of the King or the Archbishop. But then almost anything in the wrong hands can be misused; that is not a reason for refusing to act at all. I am sure Microsoft is fully aware of this and will be watching like a hawk how its licences are deployed.

In some cases brought, for example, against developers of peer-to-peer programmes, courts have made clear that where software has a primary or main purpose which is lawful, its use will not be disallowed or penalised simply because some people do not respect that. It's only where there is little or no evidence of lawful intent or use that you're at risk. Sounds like common sense.

PhotoDNA definitely has an overriding lawful, indeed highly virtuous, purpose. It is unlikely to be a silver bullet, but it is chipping away, making it more likely that fewer children will be sexually abused solely in order to make new images to sell or trade. Three cheers for that.
