Normalising Criminal Communities

"Once I'd opened my Tor browser, it took me two mouse clicks to arrive at the page advertising the link. If I had clicked again I would have committed an extremely serious crime. I can't think of another instance where doing something so bad is so easy." Jamie Bartlett, The Dark Net, 2014.

Anonymity and the ease with which obscene content can be accessed have lowered not only the physical but also the mental barriers to engaging in illegal activity. A click, an open browser, a saved file: each seems such a small action for such a devastating crime.

According to research by the Lucy Faithfull Foundation, nine out of ten Internet sex offenders did not intentionally seek out child images, but found them via pop-ups or progressive links whilst browsing adult pornography. This kind of statistic is obviously difficult to verify: few would confess to actively seeking out child sexual abuse content, and many might be seeking to distance themselves from their crime. Nevertheless, I believe the 'normalising' nature of easy access to legal and illegal content alike has undoubtedly led more people to CSA content than would otherwise have been attracted to it.

Twenty years ago, criminals and voyeurs were by definition operating alone. The communication was linear. Once the police picked up the trail, they could quickly shut down the operation and press charges. Case closed. With the rise of social media, however, the Internet has provided the means not only to clone and spread digital material far and wide, but also to bring individuals together, normalising their behaviour and changing their attitudes towards traditional societal values. As a result, people are drawn towards collective thinking and behaviour: sometimes for good, other times in illegal practices like child sexual exploitation.

Whilst blocking CSA content will not stop the creation and distribution of child sexual abuse material, it plays a critical role in deterring curious browsers from finding it. That deterrence helps slow the normalisation of this kind of material and stops new people being 'recruited' into that world.

On the whole, social networks and online search engines are doing a much better job of taking down explicit and illegal photos and videos posted online than they were just a few years ago. Facebook, for example, has recently revamped the community standards that underpin its takedown policy. They now include a separate section on "dangerous organisations" and give more detail about what types of nudity it allows to be posted. It actively encourages its members to report posts that they believe violate its rules. Google has also made efforts to change its algorithms for illegal search results and to develop technology that can identify children in illegal online videos, to aid police investigations.

Despite all this, there are people who would abuse children with or without the Internet, and blocking won't stop them. Today, as soon as one case is closed, another opens up. More child sexual abuse cases emerge every day, and each could contain new or unidentified victims. The challenges facing investigators today are far more widespread than we ever anticipated. We've doubled resources to the IWF to tackle the takedown of this kind of material, but the next goal is to step up law enforcement's ability in the same way.
