Keeping An Ex's Nudes Is Not Illegal. But Should It Be?


“I hope you don’t mind me messaging you”, I type nervously. “But I found some pictures of you on Jack’s* laptop. I’ve deleted them for you.”

Sometime in 2009, my then-boyfriend Jack, 20, had a penchant for holding onto nude images of his exes, something I found out while rummaging through his laptop folders. I found her there: she looked young, teenage, in white cotton knickers, looking back over her shoulder. Then I found more, of other girls, varying in age and explicitness. Instinctively, I rounded them up and dragged them to the trash, along with any of myself he’d kept after promising to delete them.

But keeping intimate images of exes (who are over the age of 18), even after being asked to delete them, wasn’t illegal then and isn’t illegal now, however morally grey it is. In fact, according to image-abuse research conducted by Bolt Burdon Kemp, 1 in 10 Brits don’t intend to delete intimate photos of an ex-partner. Ever.

For those under the age of 18, the law is very different. Section 160 of the Criminal Justice Act 1988 criminalises possession of indecent images of children, carrying a maximum sentence of five years. Yet Ofcom’s 2022 report on video-sharing platform regulation found that self-generated sexual material, such as sexting or nudes (like the ones Jack had kept of me and his other teenage exes), was a significant driver of online harm where child sexual abuse material (CSAM) is concerned.

“When I was 12, or 13, at my first high school there was this girl who sent a picture of herself in a bra and underwear, and it became the phone background of every boy in her year and the year above. It was so normalised,” Hannah explains to me.

Five years later, Hannah would experience something similar within her own peer group.

“Unfortunately, it’s a very, very common tale, I think, especially amongst people when they were teenagers,” she says. “I was 17, my partner and I filmed and took photos often — including a video of me giving him oral sex. I found out he had shared it with our mutual friends and, being a kid at that time, I didn’t really understand what that meant.”

Hannah describes feeling objectified and ashamed, unable at the time to see that any blame lay with her partner for non-consensually sharing the video. She tells me, “It took a long time to recognise that actually, I had a lot of internalised misogyny and shame that was enforced by patriarchal standards of female sexuality. I felt punished for trying to act on that empowerment.”

Hannah experienced the sexual double standard firsthand: the hypocritical way we view female sexuality and punish women for taking agency and autonomy over their own, also known as slut-shaming. It is a phenomenon that can leave victims of sexual offences feeling too humiliated to come forward.

The truth is, intimate image abuse is pervasive, especially for women and young girls. A comparative analysis of data collected by the Revenge Porn Helpline and the Professional Online Safety Helpline in 2019 highlighted a shocking gendered disparity among victims of intimate image abuse, a gap that runs parallel to other sexual offences, such as rape, sexual assault, sexual harassment, childhood sexual assault, coercive control and domestic violence. Soberingly, though perhaps not surprisingly, the majority of the perpetrators of these crimes are heterosexual white men.

This gap, and the impact of systemic cultural misogyny, is best explained by the sexual violence continuum. Wherever on the spectrum women experience misogyny, be it something more ‘commonplace’ like prejudice or harassment, or something more ‘serious’ like stalking or rape, all women experience a violation that has the potential to limit lives, statuses and opportunities, erode self-worth, safety and trust, and violate bodily autonomy and integrity.

Public opinion and policy are beginning, albeit slowly, to listen and mobilise to protect women and girls from this continuum. But legally speaking, how protected are our intimate images? And are these protections enough?

Recently, news of the new Online Safety Bill has sparked discourse around online safety, specifically around intimate image abuse (more commonly known as ‘revenge porn’, though this term is inherently problematic). Worryingly, in a YouGov survey from March 2022, only 58% of respondents were strongly in favour of the Online Safety Bill, which promises to make the unsolicited sending of nude images (cyberflashing) a criminal offence, as well as to take a more nuanced approach to intimate image abuse and other online safety issues.

The bill includes recommendations from the Law Commission, following its Reform of the Communications Offences report, to protect people, particularly young people, online. Flagging how ill-suited the UK’s current laws are, these recommendations have been a welcome addition, particularly as the bill applies the same definition of consent online as that set out in the Sexual Offences Act 2003.

Though, as Dr Kirsty Welsh, Senior Lecturer in Law and Criminology at Nottingham Law School, tells me, it’s not without its shortfalls. When I ask her how effective the new Online Safety Bill will be in prosecuting perpetrators of these offences and bringing them to justice, she is blunt: “I don’t think it will be successful at all.”

But why? How can a law designed to protect people online make experts like Dr Welsh feel it’s destined to fail? Firstly, we should look at what laws are currently in place to protect our intimate images: namely, Section 33 of the Criminal Justice and Courts Act 2015.

Section 33 creates an offence of disclosing private sexual photographs or films without the consent of an individual who appears in them, and with intent to cause that individual distress. But it has one major flaw.

“Section 33 doesn’t define what consent is,” Dr Welsh explains. “It also falls under the remit of confidentiality and disclosure, meaning that being charged under it isn’t a sexual offence, even if you have disclosed intimate images.”

How Section 33 categorises intimate imagery is alarming too: it uses overtly objectifying and pornographic terms, excluding other forms of intimate images, like being pictured without a hijab, with an exposed breast or in underwear. “It’s about genitalia or ‘otherwise not seen in public’,” Dr Welsh explains. “It’s a very sexualised definition of what is considered intimate.”

But it’s not only Section 33’s off-putting definitions of intimate images that limit its protective powers.

Research carried out by Refuge in 2022 found that, despite a significant year-on-year rise since 2019 in the number of intimate image abuse cases reported to the police, only 4 per cent of all offences recorded across 24 police forces between January 2019 and July 2022 resulted in the alleged offender being charged or summonsed.

Refuge also reported that in 35 per cent of cases where the police had identified a suspect, victims were unwilling or unable to support a prosecution. And it’s understandable: under Section 33, victims aren’t guaranteed anonymity.

“The real difficulty that we’ve experienced is people are quite fearful about bringing these cases and reporting them because of the natural stigma that attaches to what they’ve done,” Ashley Fairbrother, Partner at Edmonds Marshall McMahon, explains. “I think that’s where a lot of the stumbling blocks are in legal cases: the fear, the shame, the retribution that the individual feels they will get.”

Fairbrother explains that, because the burden of proof lies solely on the prosecution, it’s highly likely that exhibits showing images of a victim on websites, in group chats and elsewhere will be shared with lawyers for both the prosecution and the defence. They may also be shown in court, and so to a jury, which can be incredibly traumatic and triggering for victims. So much so that cases are dropped.

But it’s not just that victims are forced to relive the trauma of image-based sexual abuse (IBSA). A study published in the journal Criminology & Criminal Justice in January 2023 found significant failures across current legislation and policing: police are failing to implement victims’ rights, while victims face insurmountable prosecution barriers “due to the plethora of limitations within the law which results in many contexts falling outside of the law’s remit”. In short, the justice system is fundamentally failing to provide justice for its victims.

Charlotte Hooper, Helpline Manager at Cyber Helpline, has worked with the police on multiple occasions, helping them to collect and collate evidence and signpost to resources that can support investigations. “There are a few different things that we’ll do if the police officer is receptive to us,” she says.

Hooper explains that Cyber Helpline has advised the police on ways to manage victim safety and how to track IP addresses and bypass VPNs, amongst other things. However, their help isn’t always met with openness, and important steps in identifying perpetrators or gathering evidence can be skipped if their importance isn’t reinforced. “There’s a real lack within the police of wanting to pursue digital information and intelligence because there is an assumption that the internet is anonymous and they’re not going to get a lot of data from it,” says Hooper.

The police aren’t always happy to have the support either. “I think it’s maybe a feeling that we’re trying to work against them. And that’s really not the case. We really want to support individuals as much as everybody else does,” Hooper tells me.

However, this isn’t always the case, Hooper says: “We have a lot of officers that are honest with us and say, I have no clue what’s going on here — can you help me out? And it’s really great to hear that honesty because no one can be an expert in everything.”

There is a pattern of ignorance surrounding what consent ‘is’ in the digital sphere too. If consent can be given and withdrawn at any stage of an in-person sexual encounter, can it be withdrawn, in the eyes of the law, in cybersex contexts such as sharing intimate images or videos?

Zahra Awaiz-Bilal, Senior Associate at Bolt Burdon Kemp, explains that the Criminal Justice and Courts Act 2015 protects against intimate image abuse by defining the offence as sharing, or threatening to share, intimate images without explicit consent. “So as long as you’re not disclosing it, or threatening to disclose it, you’re committing no crime,” she says.

Awaiz-Bilal continues, “And, where individuals are convicted, you have the option to then pursue the perpetrator directly in the civil courts as well for compensation.”

In January, one such case was decided: FGX v Gaunt [2023]. “FGX brought a civil claim against Gaunt for compensation,” says Awaiz-Bilal, “and was successful. She was awarded nearly £100,000 for her civil claim against her ex-boyfriend, which shows that judges are going to treat these as cases of significant value.”

But with so many obstacles in the way of justice, so many limitations within laws and policies, and the slow pace at which these changes come about, is there a better way for change to happen?

Gina Martin, a gender equality activist, writer and speaker, successfully campaigned to change the law through what was known colloquially as the Upskirting Bill, now the Voyeurism (Offences) Act 2019. She tells me, “The fact I had to do that myself shows how bad it really is.”

Martin, whose work includes campaigning with Nyome Nicholas-Williams to change global Instagram policy, grassroots sessions with Beyond Equality for teachers and parents on the rise of online misogyny, and books and articles on gender, misogyny and sexual violence, tells me that, in her experience, criminalisation isn’t always the best agent for change.

“I don’t believe that enough of the gatekeepers and policymakers in the halls of power have the kind of nuanced and compassionate understanding needed for bills to become laws to truly protect us,” she says.

And the outcome of this lack of real understanding is sobering. There is considerable distrust in the Met Police, and conviction rates for sexual violence against women and girls are consistently worsening, so much so that the conviction rate for rape sits at an abysmal 1.9 per cent. Under-18s are most at risk of intimate image abuse, while user-generated CSAM is a rising concern for the NSPCC, which reports that 1 in 8 secondary school children have been sent, or have seen, naked or semi-naked images from another young person, while 1 in 25 have been shown or sent naked imagery by an adult.

It’s clear that the system designed to protect us against sexual offences is broken from beginning to end. “I mean our legal system was, and is, constructed by systemising a particular type of thinking from a particular group of people; we know it was created in the image of white, male, upper-class society,” says Martin. “Our society is built with the thread of misogyny; so, too, is our legal system.”

She explains that, when it comes to gender equality, waiting for the legal system to catch up to what is happening is hopeless without meaningful change and disruption of the status quo at a grassroots level. “We have to start to look at what prevention could look like, what changing culture could look like. And we are society. We are culture.”

Her latest book, No Offence, But…, tackles how we can begin to have these kinds of conversations confidently. Released on 27 July 2023, it discusses how to challenge conversation stoppers on social justice and cultural issues, with chapters from Aja Barber, Ben Hurst, Cathy Reay, Charlie Craggs, Daze Aghaji, Ione Gamble, Koa Beck, Mariam Kemple Hardy and Azadeh Hosseini, Nova Reid and Salma El-Wardany.

*Names have been changed for privacy.
