1. It isn’t revenge and it isn’t porn
It’s not revenge, because the victim’s done nothing wrong. And it’s not porn, because viewing it is an act of violence, not of pleasure. The non-consensual sharing of someone’s intimate image is sexual abuse.
‘Image-based sexual abuse,’ to be precise. Start calling it that — maybe society will start treating it as that.
Reports of image-based sexual abuse in the UK have increased tenfold over the past few years. Women are five times more likely than men to be victims of intimate image abuse. And one in 14 adults in England and Wales, equivalent to 4.4 million people, has experienced threats to share their intimate images without consent. The scale of coverage hardly reflects this.

“There are some phenomenal journalists that really understand the issue,” argued Elena Michael, co-founder and director of #NotYourPorn, “but on the whole it’s this watered down, diluted version” — starting with the term ‘revenge porn’.
The term is not just "incredibly derogatory," Elena says; it also fails to capture the multiplicity of forms the crime can take, from resharing, to peeping, to selling, to publishing, to generating 'DeepFakes' and more.
“It’s limiting our understanding of who might be a survivor, but it’s also limiting our understanding of who might be a perpetrator,” she explained.
2. You can count on two hands the years since it’s been criminalised
The residual implication of the term 'revenge porn' is that victims are victims of their own actions, for example by taking an erotic picture in the first place. This stigma might explain why the non-consensual sharing of intimate images was only criminalised in 2015.
Make no mistake, image-based sexual abuse destroys lives. “To experience it is gut-wrenching,” recounted survivor Madelaine Thomas.
"At points, I thought my life was not worth living because of the shame and embarrassment that I felt seeing those images and being unable to take them down" – Madelaine Thomas
But Madelaine never reported what happened to her to the police because she was not even sure it was actually a crime. And even if it was, “I also felt like the shame that I was feeling would just be reflected back at me”.
3. It’s economic exploitation – as well as sexual exploitation
There’s another reason Madelaine is wary of telling her story. “I feel like I still need to caveat my experience,” she warned at the start of our interview. “I, um, had a very crap job actually. It was a crap job and, um, my husband and I had a child and my crap job didn’t pay for childcare.”
In short, Madelaine started selling intimate images for money. “Just some saucy nudes, nothing terribly explicit.” She said she did so “in the understanding that I was selling a consensual moment for [my clients] to enjoy those pictures”.
“It was a consensual moment of an exchange of trust, as well as exchange of financial remuneration.”
Six months later, she learned some of her images had made their way onto a porn site. “Of course,” was her first thought, “I should expect this to happen to me”. But then she thought again.
Strangers were making a profit using her body, without her consent, and without paying her a penny. Call it theft, call it trafficking — it is tantamount to forced labour.
Very often in cases of image-based abuse, sexual violence is overlaid with economic violence: in any case where money changes hands, and in every case where sex workers' labour is appropriated for free.
The issue is, whenever Madelaine shares this detail with journalists, it becomes the headline focus: “Whatever I do, it becomes ‘SEX WORKER DOES X’ — because we’re searching for clicks, and that’s saucy”.
But by sensationalising her trauma and that of other victims, journalists often end up economically exploiting them all over again.
“I get really frustrated with some of the journalists I’ve worked with and the way they treat survivors,” said Elena Michael. “You’re not entitled to their story, unpaid, just so that you can manipulate it and then not do it justice because you’re only going to give it a one-dimensional view.
“You are essentially replicating the harm [of image-based sexual abuse],” she went on. “It’s just another form of entitlement to exploit somebody’s story.”
4. One ‘DeepFake’ has many victims
The media also cannot seem to keep up with the nuances of ever more sophisticated technology, leading to a dearth of detail in reporting on 'DeepFake' content (AI-generated sexual material), which increased by 400% between 2022 and 2023. Elena sees this as another area of the media's "one-dimensional" grasp of the topic.
"We don't understand that there are often two victims of DeepFakes," she explained. There's the person (probably a celebrity) whose face has been used without consent, making that side of the story a well-covered headline.
"But then there's also the body of sex workers" it's superimposed onto. "Sex workers who are not seen as having rights to choose where their images or their videos or their work goes" – which is why that's probably not a news report you've seen.
5. This is a solvable problem
In cyberspace, legal frameworks often appear incapable of keeping up with the Hydra of technology. But tech can be our friend if used right, Madelaine insists.
“It’s not actually a tech issue, it’s a society issue,” she said. “Tech is where I’m trying to knock on doors and explain to people we can use this to make a solution.”
After her "humiliating" experience, Madelaine founded Image Angel, forensically approved image-tracking software that can be used to identify offenders.
Even in cases where they could not be convicted, they could still be flagged as dangerous users to dating sites down the line. It’s a deterrent logic — the knowledge of being watched. “Preventing harm from happening, not mopping it up once it’s happened, handing out tissues and going ‘we should probably do more’.”
And she isn’t reserved about her hopes for the software. “That real world impact is going to start a step-change in society. It’s going to start people being accountable for their actions online.”
Meanwhile, Elena’s #NotYourPorn movement focuses on policy. The survivor-led group has consulted on a number of positive developments in recent years, including the 2023 Online Safety Act, which made the sharing of AI-generated intimate images without consent illegal.
"It's a really ambitious piece of legislation," Elena said encouragingly. "It gives Ofcom the power to enforce duties against companies for hosting this kind of material."
But Ofcom is “reading that duty incredibly narrowly,” and the Labour Government is “muddying the waters” with “secondary legislation” that — as far as she’s concerned — is purely designed to give them credit for pre-existing legislation without enacting anything new. “I’m thinking, where’s the teeth? Other than a press release, where have you done any of the substantive work?”
As Elena points out, a far more robust framework already exists that could be used to immediately ramp up protections for victims of image-based sexual abuse: treat all image-based sexual abuse as seriously as we treat its equivalent where children are concerned.
We are a long way from there. This is the chapter in which offenders are handed back their phones with the abusive images still on them. Is it too much to ask that, in the 3% of cases where authorities actually delete the material, they delete it from the cloud and not just the device?
This is not rocket science. But it is the level at which we’re operating.
Media Storm's latest episode, Image-Based Sexual Abuse: Not your 'revenge porn', is out now.
If you or someone you know needs mental health support, the Samaritans are available 24/7, day or night, 365 days a year, and can be contacted for free on 116 123.