What Are Deepfakes, Are They Illegal And Can They Be Stopped?

It’s not just politicians and celebrities at risk from sophisticated pornographic manipulations created using deepfake technology.

The debate around deepfakes has centred on high-profile women like Emma Watson or Daisy Ridley being clipped into pornography, or politicians and even Facebook CEO Mark Zuckerberg being made to say things they never said in doctored videos. But the general public is increasingly under threat.

Research from cyber-security company Deeptrace found the number of videos online produced using deepfake technology almost doubled in nine months: 14,698 videos, compared with 7,964 online in December 2018.

And not only are there more videos, they are overwhelmingly pornographic in nature, with 96% of deepfake clips featuring a computer-generated face swapped onto that of a porn performer.

Experts say the more prolific this technology becomes – and the more accessible and easy to use – the greater the threat to people outside the public eye.

Henry Ajder, head of research analysis at Deeptrace, told the BBC: “The debate is all about the politics or fraud and a near-term threat, but a lot of people are forgetting that deepfake pornography is a very real, very current phenomenon that is harming a lot of women.”

In 2018 a report by the Women and Equalities Committee said the government was struggling to keep up with technological abuse – including deepfakes, cyberflashing and revenge porn.

So what is deepfake pornography and what is being done to tackle it in the UK?

What Is Deepfake Pornography?

Deepfake is a technique that uses AI-powered face-swap technology to digitally manipulate pornography so it looks like other people are present in images or film.

Using photographs of celebrities and everyday people, a victim’s face is put into an existing pornographic photograph or film, replacing the original participant.

It began with many female celebrities including Taylor Swift, Emma Watson, Gal Gadot, Michelle Obama, Daisy Ridley, Meghan Markle and Kate Middleton.

But as the software becomes more accessible, it is happening to everyday people too.

Where Has It Come From?

Doctoring sexually explicit images is nothing new, but previously was a painstaking process that involved technical know-how. Not only did it take time and commitment, but the end result was unlikely to look convincing.

Since 2017 new software has been making it easier: all you need to do is gather a set of photos of one particular person (easy when celebrities post selfies on Instagram daily), choose a porn film, and feed both into the automated AI system.

It can take a long time to achieve (more than 24 hours even for a short clip), but open-source software has made it accessible to the masses. One commonly used program has been downloaded more than 100,000 times, according to its designer.

To get an idea of the pace at which this technology is developing, at the beginning of 2018, Motherboard predicted it would take another year to automate deepfake software. It took only a month.

Is Deepfake Pornography Illegal?

Currently, producers of deepfake material in the UK can be prosecuted for harassment, as was the case in May 2018 when 25-year-old Davide Buccheri was jailed for 16 weeks and ordered to pay £5,000 in compensation for photoshopping pictures of a female intern and uploading them to porn websites.

But there are now calls to make deepfakes a specific crime of their own. In October 2018, the Women and Equalities Committee called on the government to introduce a law against image-based abuse that would prohibit the non-consensual creation and distribution of sexual images. This would cover deepfake porn.

What Is Being Done About Deepfakes?

While we wait for the law to change, websites and social media giants are taking a stance on their own platforms.

PornHub, for example, has banned deepfake videos – although critics have said the ban isn’t being enforced. Twitter and GIF-hosting site Gfycat have introduced similar bans.

Reddit also closed down the primary subreddit hosting deepfake porn, which had 90,000 users at the time – a move deemed largely ineffective, as users simply migrated the material elsewhere.