THE BLOG
04/04/2018 17:02 BST | Updated 05/04/2018 09:37 BST

Why Can’t We Stop Sexual Extortion Of Our Young People?

60% of teenagers have been asked for sexual images or videos


Emily had spent a few weeks chatting to Dave online. They’d never met in person, but it’d just been a bit of fun. She’d been put off when he’d asked her for underwear photos, but she’d gone along with it – her friends at school were always talking about doing it for their boyfriends. Emily’s parents were separated and her dad was angry all the time, so it was a relief to have someone like Dave to talk to.

But then it turned nasty. Dave said that if she didn’t do something for him, he’d put the pictures online and send them to everyone at her school. She picked up a parcel from a man in a hoodie by the betting shop and took it to another house in a grubby neighbourhood that her mum had told her never to go to. Emily was fifteen and no-one ever looked at her suspiciously. She ran all the way home.

Frankly, there is no need to invent scary stories about the impact of sexting and sextortion. Emily may be fictional, but these ‘county lines’ models of exploitation are all too common. In fact, in the last year alone there were 1,744 victims of sexual exploitation and 2,352 victims of drug smuggling in cases like this. We may typically think of young women when we hear the term sexting, but too many young men have killed themselves after being sexually blackmailed over explicit images.

The state of the nation

Let’s not mince words. Over half (60%) of teenagers have been asked for sexual images or videos – and a third of those who sext have sent images to someone they’d never met. Half of teens know someone who has been bullied or abused because their sexual or nude images have been passed around a school or college.

A child sex offence – of which this is one – is recorded every eight minutes in the UK alone. That term – child sex offence – is worth thinking about. Sending or receiving an explicit image or video of an under-18 is child sexual exploitation content. Possessing or distributing this content is a criminal offence; in fact, over a thousand children in Denmark were recently charged after distributing an explicit image of a fifteen-year-old.

This is a difficult discussion for parents to have with their children. Half of parents don’t even realise that sexting involving under-18s is illegal, and a fifth (19%) don’t intend to talk to their kids about it – ever. Considering the risks, this borders on negligence.

The challenge is that children’s phones are almost welded to them today. Taking away a child’s phone – the knee-jerk reaction to sexting – can lead to humiliation, isolation and a crash in self-esteem.

Beyond ‘the talk’

So, what can you do? Apart from having ‘the talk’, of course. And please, have the talk.

A major challenge is that having parents or other adults check this content is practically very difficult, potentially unethical and harmful for the adults in question. Children have a right to privacy and we shouldn’t spy on them.

We have machine learning today – primitive AI. Why isn’t it being pointed at the problem?

AI image recognition is difficult, but it’s getting better. It can block images before they are sent or seen. In the same way that phone cameras know what to focus on, software can disable the camera if it detects that a sexual image is being taken.

This can be complemented by learning ‘backwards’. As I’ve said before, the pattern of communications leading to sexting is predictable: it’s a quick-fire kind of conversation. Similarly, the way that a 40-year-old groomer speaks online is quantitatively different from the way that a typical 16-year-old speaks online. This can be detected – and then prevented.
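To make the idea concrete, here is a deliberately toy sketch of one such signal: how much of a conversation is rapid back-and-forth. This is an illustration only – the `Message` structure, the 60-second window and the flagging threshold are my own assumptions, not any real product’s logic, and a real system would combine many signals, not one.

```python
from dataclasses import dataclass

@dataclass
class Message:
    sender: str
    timestamp: float  # seconds since the conversation started (illustrative)
    text: str

def quick_fire_score(messages: list[Message], window: float = 60.0) -> float:
    """Return the fraction of messages sent within `window` seconds of the
    previous one. A high score captures the 'quick-fire' pattern described
    above; the window value is an assumption for illustration."""
    if len(messages) < 2:
        return 0.0
    quick = sum(
        1
        for prev, cur in zip(messages, messages[1:])
        if cur.timestamp - prev.timestamp < window
    )
    return quick / (len(messages) - 1)

# Hypothetical use: flag conversations where most replies arrive in seconds.
convo = [
    Message("a", 0.0, "hey"),
    Message("b", 5.0, "hi"),
    Message("a", 12.0, "send a pic?"),
]
if quick_fire_score(convo) > 0.8:
    print("flag conversation for on-device review")
```

In practice a signal like this would run on the device itself, so the messages never leave the phone – which is what makes the privacy-preserving claim below possible.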

This means that no human ever needs to see the content. Images can be blocked before the damage is done, and children can be warned each time they incur a ‘strike’, with their parents eventually notified.

This use of AI preserves privacy without spying. It can scan phone image libraries and check for potentially explicit images, quarantining them. Of course, children should have a right to correct the system – no AI is perfect, yet – and if they grant a parent the right to check the images, the parent can restore any incorrectly categorised images.

AI can learn conventional patterns of behaviour and point out the anomalous ones. It can keep children safe without breaking trust, allowing them to experience the best of the web while staying safe from the worst.

It means that incidents like that of Emily and Dave, where a simple request turns into blackmail, sextortion and drug smuggling, need never happen again.