MP Demands Deepfake Porn And 'Nudifying' Images Are Made Sex Crimes

An MP has called for non-consensual deepfake porn and nudification images to be made sex crimes, warning they are rapidly on the rise.

Maria Miller wants the government to ban the making and sharing of image-based “sexual abuse” under the Online Safety Bill.

She will lead an adjournment debate in the Commons on Thursday, in which she will outline the “devastating” impact such images have on victims.

Deepfakes are created when ordinary photos of women are taken without their consent and superimposed onto pornographic images or videos using AI software. They can also be merged with violent or illegal extreme material, such as depictions of rape.

Meanwhile, nudification software takes everyday images of women and generates a new image that makes them appear naked.

The Tory MP said the creation of such images without consent was a “highly sexualised act” and they were difficult to remove from the internet.

Calling for it to be a sex crime, she added: “Deepfake and nudification software are yet more ways women can suffer online sexual abuse.

“Women in this country have faced a growing problem of image-based sexual abuse over the past decade but the scale of the problem is increasing.”

She pointed to reports of online image-based abuse soaring by 87 per cent in 2020, adding: “Deepfakes and nudified images are another vivid form of violence against women online. Let’s be clear: this non-consensual use of technology is almost exclusively used against women.”

The MP for Basingstoke said the government has to put in place laws that recognise that technology and AI are being used to inflict sexual attacks and violence on women and girls.

Miller previously campaigned successfully to outlaw “revenge porn” in 2015, after she was contacted by a constituent who was a victim.

She is also running a campaign with Grazia UK calling for cyberflashing to be criminalised.

Tory MP Maria Miller
Yui Mok - PA Images via Getty Images

Deepfakes have developed rapidly since emerging in late 2017 and researchers warn that they are becoming increasingly realistic.

The AI research group Deeptrace found that 97 per cent of deepfakes are pornographic in nature and exclusively target women.

One victim, referred to as Helen, discovered in 2019 that non-sexual images of her had been uploaded to a porn website, where users were invited to merge her face with explicit and illegal sexual content.

The original images were taken from her social media, including photos from her pregnancy.

She said: “Obviously, the underlying feeling was shock and actually I initially felt quite ashamed, as if I’d done something wrong. That was quite a difficult thing to overcome. And then for a while I got incredibly anxious about even leaving the house.”

Helen alerted the police to the images but was told that no action could be taken.

HuffPost UK has approached the government for comment.
