Media Storm - S5E8 Image-Based Sexual Abuse: Not your 'revenge porn'
Episode Date: April 4, 2025

Warning: this episode contains mentions of suicide. If you need support, contact the Samaritans on 116 123.

Reports of image-based sexual abuse in the UK have increased tenfold over the past few years. Women are five times more likely to be victims of intimate image abuse. The true scale of the problem is probably larger, as many victims do not come forward. But sensationalist headlines about so-called 'revenge porn' are doing a disservice to survivors. The term 'revenge' welcomes victim-blaming; the term 'porn' undermines the severity of the crime. Articles about apparent new protections for victims are written from Government press releases, with fact-checking thrown out the window, leaving tired survivors to take on crucial work as campaigners. Legal frameworks also can't seem to keep up with rising technology: AI-generated image-based sexual abuse, also known as 'deepfakes', increased 400% between 2022 and 2023. Is the media failing to point to a culture of misogyny behind this crime? What steps can we take to combat IBSA? And how can we put survivors first every step of the way?

Joining Media Storm this week is Elena Michael, co-founder and director of #NotYourPorn (you can sign their open letter to address the critical gaps in current legislation), and founder of anti-share technology Image Angel, Madelaine Thomas.

The episode is hosted and produced by Mathilda Mallinson (@mathildamall) and Helena Wadia (@helenawadia). The music is by @soundofsamfire.

Support us on Patreon! Follow us on Instagram, Bluesky, and TikTok.
Transcript
Reports of image-based sexual abuse in the UK have increased tenfold over the past few years.
Women are five times more likely to be victims of intimate image abuse.
But the true scale of the problem is probably larger, as many victims do not come forward.
Wait, are you talking about revenge porn?
Well, yes, I suppose I am, but I'm
deliberately trying not to use that term.
Oh, why?
Well, for a while, I've been wanting to do a media storm episode on the scale and range of
revenge porn.
I've been wanting to delve into how the media discusses it, what it entails, what we can
do about it.
And then the more I delved into it while researching, the more I realised
that the term revenge porn just doesn't even come close to describing the issue at hand.
In fact, it's almost downright
offensive, now that I think
about it. And as we
always say on Media Storm, if we
can't even use the correct language
we do a disservice
to the community experiencing the issue.
Okay, okay, I'm intrigued.
I think I sort of see the problem.
And I'm glad. And I'll let
our two brilliant upcoming guests
tell us more about what terms
we should or should not be using.
But first, let's lay out the
problem. What do you think
revenge porn, or as I'm now
referring to it, image-based sexual abuse, is?
The thing that springs to my mind immediately is like an ex-boyfriend sharing images online with
his friends of an ex-girlfriend that were maybe once sent with consent to that specific
person, but were not supposed to be viewed by others.
Yeah. And I would say that that's probably what most people think of when they hear the
term revenge porn. And that certainly happens and is a big part of image-based sexual abuse.
But there are unfortunately many different forms or aspects of non-consensual intimate image abuse.
There's the threat of sharing intimate pictures.
There's sharing pictures or videos without consent.
There's uploading images or videos onto porn websites, and money being involved, or the porn industry benefiting from it.
There's the culture behind it all.
For example, sometimes the pressure to send these
images in the first place. The End Violence Against Women Coalition found that 9% of girls
aged 13 to 16 said they felt pressure to share images of themselves that they're not comfortable
with. I mean, I remember how big an issue this was at school and how destructive it was
when, and it happened to a bunch of friends, girls in my year, you know, photos that they sent
guys were then predictably sent around the whole year and then the slut shaming and the cyber
bullying that came from it. But at least back then, you know, we were sending really grainy shit
quality photos on flip phones. Right. Today, every 13 year old, I don't know, young teenager
has a smartphone with an HD camera and access to the whole universe of the social media cloud
whatever's out there. Yeah, absolutely. I completely agree.
And actually, that rise in technology and mobile phone usage has played a part in this.
When we were all stuck in lockdown and on our phones 24-7,
the Revenge Porn Helpline said that the number of reports they received doubled in 2020,
reaching a record high number of cases.
And advancing tech has only worsened things for women in this case.
There has been a huge rise in AI-generated image-based sexual abuse.
also known as deepfakes, being thrown out into a world where legal frameworks cannot seem to keep up with this rising technology.
And bear in mind, it was only 10 years ago in 2015 that non-consensual intimate image abuse became illegal in the UK.
In many countries, it still isn't illegal.
Are you serious?
Our entire, like, school and uni experiences, it was legal to what?
We were not protected.
It seems that at every turn, not just in tech spheres,
the legal frameworks and the justice systems
are always one step behind.
What are we missing in our reporting?
What can we do about it?
And are we focusing so hard on isolated incidents
we're failing to see the bigger picture?
And that original definition in law of intimate abuse
is way too narrow.
It leaves many loopholes open.
The problem is prolific. The software searchable and oh so easy to find. The number of victims vast.
He was trying to make me out as, you know, a jealous attention seeker.
We can reveal so-called revenge porn is rising in the UK.
Welcome to MediaStorm, the news podcast that starts with the people who are normally asked last.
I'm Helena Wadia. And I'm Mathilda Mallinson.
This week's Media Storm. Image-based sexual abuse. Not your revenge porn.
Welcome to the Media Storm Studio.
Joining us today are two very special guests.
Our first guest is an award-winning campaigner and activist.
She is the co-founder and director of #NotYourPorn,
the movement fighting for policy changes
and protections for victims of image-based sexual abuse.
Welcome, tuning in from North Wales, Elena Michael.
Hello, thank you so much for having me.
Really excited.
Our second guest, like thousands of people, discovered intimate images of herself published online without her consent.
She couldn't trace who leaked them. The crime remains unpunished and the images remain online.
This experience spurred her to found Image Angel, a technology that can be used to identify where non-consensual images leaked from.
We'll hear much more about it later on in the episode, of course.
But for now, welcome Madelaine Thomas.
Hello, thank you for having me.
In those introductions, we use terms such as image-based sexual abuse
and non-consensual intimate images.
But in the media, these terms are often put under one label, revenge porn.
Let's begin with Elena.
What do you think of the term revenge porn?
I think the first thing to understand about any term that's coined by the media
is that it's only going to offer maybe like 5% of what actually happens to people who are subjected
to this kind of abusive behaviour.
So one of the biggest problems with terms like revenge porn
is that it's not always about revenge
when somebody shares or takes images non-consensually.
And it's certainly not porn.
So it's limiting our understanding of who might be a survivor,
but it's also limiting our understanding
of who might be a perpetrator.
And we want terms that the public understands,
so they understand an element of the crime.
At the same time, we need to move away from this language.
It's incredibly derogatory, and it does not encompass the full picture, spectrum, magnitude of the kind of harm that people experience.
Yeah, Madelaine, in some of our correspondence before this episode, you used the term image-based sexual violence.
Is that your preferred term for the crime?
For what happened to me, yes.
But there are any number of experiences that go along with this.
And it could be to do with stalking, it
could be to do with digital harassment.
So I tend to go with the term image misuse
because that is a bit more of a cover-all.
It doesn't denote the harm.
That's the problem.
But we can understand it a bit better
when we say image misuse.
It's funny because I had never even thought about the fact
that the term revenge porn was in any way
problematic or insufficient until we did this episode.
And then as soon as Helena said it,
I was like, hold up, yeah, porn.
And sorry, you are not in a porno that you don't consent to be in.
And porno implies something that's designed for pleasure.
That is not porn.
That was the thing that hit me straight away.
And then revenge implies that the victim did something wrong in order for somebody to seek revenge.
Yeah, it's problematic on so many levels.
When we talk about the term revenge porn being useful, perhaps it is somewhat useful in the way that people know exactly
what we're talking about when we use the term revenge porn.
For example, there is a helpline dedicated to victims and survivors called the
Revenge Porn Helpline. Is it in that way useful?
Well, it's a cyclical argument. So it's useful in the sense that it's a popularised term
that people understand and therefore we can catch survivors and we can help them. But in terms
of its overall use within media, within policy, it's really incredibly damaging. Like,
a lot of the work that Madelaine and I have been doing policy-wise has been limited, and has affected the understanding of law and policy makers, because of the use of this kind of term.
It's also similar with terms such as deepfakes.
There's such an inherent bias with the term deepfakes that we don't understand that there are often two victims of deepfakes.
There's the person who's been non-consensually put
into some sort of porn video that they've not consented to.
But then there's also the body of sex workers
that are not seen as having rights
to choose where their images
or where their work goes or their videos.
And these kinds of terms,
in terms of the wider landscape damage it's doing
with law and policy, it's huge.
Yeah. And editors have to recognize the power they have.
They can't just say,
'we speak to people in the language they
understand'. Well, no, people use the language that you use. You know, no one has more power to
define the vocabulary than the mainstream media. Elena, I think this is a good time now for you to
tell us about the #NotYourPorn campaign. What are its aims? What does it do? It was founded
in 2019 by Kate Isaacs, who was supporting a friend who had her images non-consensually
shared on Pornhub. And at the time, there was also a download function.
So by the time that Pornhub actually got round, I think it was between six months to a year later, to taking that non-consensual content down, the damage was already done.
She was trending in the UK and also Europe, I believe, in the top three videos, but she had obviously not consented to being on Pornhub.
So we are, you know, our roots are super grassroots.
Me, myself, I'm a survivor of intimate partner violence.
The focus was always pushing survivor stories into the
press, into policy spaces, because we learned very, very quickly, like, people like Madelaine
are the experts. They know how to fix this problem. And also, they can tell you in intimate
detail what is wrong with the system. So we've evolved since then and we've become more proactive
to help people campaign, but also not retraumatise them, because the burden on survivors
to fix this problem unpaid, while also managing their own trauma,
also managing dealing with the police and the CPS and the stigma
and all the rest of that shite that comes with it, basically, is huge.
There's so much expertise there and there's so much passion and fire,
but the system that we work in is that survivors' stories or narratives
are not necessarily given the weight that they absolutely should,
and we're fighting to change that,
so that survivors are leading the way, along with their partner organisations and, you know, academics.
Wow, you're just preaching the Media Storm mission over here.
And, you know, that survivor-led approach is particularly important when you dig a little deeper
into the effects that image-based sexual abuse can have on victims.
Research has shown that victims of this crime can suffer long-term psychological,
personal and social consequences, including severe emotional distress, anxiety and depression.
There can be a huge sense of powerlessness.
Madelaine, before we talk about how you reclaimed that power
by creating Image Angel, could you describe to us your experience
with image-based sexual violence?
I feel like I still need to caveat my experience
because when your listeners hear this, they'll go,
oh, well, that makes it different when I explain my story.
Perhaps it won't.
But in my experience, I'm still dealing with the shame
of how my images came to be.
I had a very crap job, actually.
It was a crap job.
And my husband and I had a child.
And my crap job didn't pay for childcare and for my life.
So I thought, I want something better than this.
And he was wonderfully supportive.
And he said, yes, well, you know, I will help you.
I'll do whatever we can.
So I got a job doing camming, which is like bizarre, it's strange.
But it was wonderful.
It was fun.
And I earned more
in that 20-minute call than I would in a day. And I'm having fun and I'm staying at home. I mean,
where is the harm? I'm having consensual conversations with people. And at one point I sold some
images to a group of people, you know, just some sort of saucy nudes, nothing terribly explicit,
in the understanding that I was selling a consensual moment for them to enjoy those pictures. And
I thought I looked bloody great in them, actually. And that was the last of it. I never thought
any more of it. It was a consensual moment of an exchange of trust, as well as an exchange of financial
remuneration. And then it wasn't until probably about six months after that that somebody said,
oh, you do realize you're on this website, don't you? And I'm like, what? No. Why? And I felt
stupid. Of course I should expect this to happen to me. Of course, it's par for the course.
And then this part of me was going, but why?
Why do we as society let somebody take your image just because?
And to write shameful things about you, and to hurt, shame, embarrass, humiliate, dox,
all of these things that other people have experienced and just the minutiae of that.
It's so complex.
But to experience it is gut-wrenching.
And I'd love to know how your audience is now feeling.
Are they kind of going, well, you know, what was she wearing?
essentially? Because it is the modern-day equivalent of 'what was she wearing'. Well, if you put
those images out there, of course that's going to happen. And I too felt that, and I still do. I'm
still dealing with that. But I do believe that it's a fundamental right that I should be able to
earn in a way that suits me, my family, my children, and still feel safe doing so, in the same way
an Uber driver has the ability to drive and be safe in their cab, or a hairdresser has basic rights and
labour law protection in their workplace.
This is a method of earning for me
and I should have the right to have those images removed
because I consented to A, but I did not consent to B.
And so it led me down a path of pain, hurt.
At points I thought my life is not worth living
because of the shame and embarrassment that I felt
seeing those images and being unable to take them down.
So I contemplated ending my life.
I contemplated moving to Russia, finding whoever's hosting this
and tearing it down myself, what can I do?
And I was faced with the solution which was lump it.
You've made your bed and now lie in it.
And I don't think that that's enough.
And I want to change the world now.
I don't want anyone else to experience that.
There shouldn't be a trade-off of rights.
There is this stigma around sex work: that because you put your body out there,
therefore everything that happens to you is your consequence. We live in a world where people's bodies,
predominantly women's, are commoditised, and yet a woman of her own volition making choices about her body
is seen as, well, you reap what you sow. Which is completely, you know, that is the narrative
we have to cut through.
Yeah, and I see ripples of it just looking in consensual relationships,
for example between partners, or perhaps, you know, someone's posted a nude to one individual.
They have maybe a tenuous relationship.
There's this level of entitlement to do whatever you want to somebody's body.
And we're part of the movement that says a woman's sexuality is her choice,
whether that's for profession, whether that's personal.
And if any fundamental foundation for law or policy is working on this idea of stigma and shame
and one survivor's rights is more important than another's, we've got a real big problem.
That's something that struck me,
Madelaine. You would think, oh, what are people going to think when I include the detail that I was
working as a sex worker? But what I hear, when you tell that story, in the intersection of being a
sex worker and being a survivor of image-based sexual abuse, is this added layer of exploitation,
on top of the sexual exploitation, is the economic exploitation. And indeed, this is faced by
anyone whose image is non-consensually put on a porn site, because someone is
profiting there. Someone is making money off that. And that is something you provided for a fee.
That is your labour, and you are not being paid for it, but someone is. And that amounts to
forced labour, modern slavery, trafficking. Absolutely. So, yeah, I mean, I see this as an added layer
of exploitation. Yeah. And also, there's this strange layer that I'm coming to terms with,
which is the VAWG sector, the violence against women and girls sector.
I feel like my story, my voice, is seen as problematic in violence against women and girls' sectors
because I'm seen as part of the problem because I commoditized my image.
So I'm walking a very, very fine line between acceptable and unacceptable.
But when you did share your story, it was filled with an emotion that is so real and relatable,
and frankly, uncontroversial. And we do not see that nearly enough in the coverage of this
topic. And I think that that brings us on to the media and how they report the subject of
image-based sexual abuse. Victims of this abuse are overwhelmingly female. Now, countless
studies have shown us that women make up between 80 and 90% of victims of this crime. And this is
not to do a disservice to male victims, who exist and who deserve their own
space in this conversation, but I say it to show that this crime, at its core, is gendered.
Elena, back to you. Do you think that the media does a good enough job at reporting on the
gendered aspect of image-based sexual abuse? I think that there are some phenomenal journalists that
really understand the issue and understand that it's a multifaceted approach. And it's interesting
to me, not always, but the majority of the journalists that are doing a good job are women.
You know, just to name a few, you've got Adele Walton, who's also a campaigner.
You've got Lydia Morris, you know, she's just phenomenal.
You've got Lucy Morgan from Glamour, who's been, you know,
Not Your Porn has been working with End Violence Against Women Coalition.
There are phenomenal people, but on the whole, no. I don't think that our audiences are completely one-dimensional,
unable to critically think for themselves
and need to have this really watered down, diluted version of the issues
because, oh, it's not nice with their morning coffee.
But you've also said, you know, glamour, women's magazine,
if it's a question of speaking to your audiences,
you know, should this really just be in women's magazines, to women's audiences?
Who are the audiences that really need to be educated about image-based sexual abuse,
if it is as gendered as the data tells us it is?
I think maybe a portion of the harm is coming from how we frame people who have
experienced this, because whenever I do something, at home we always go,
'sex worker does something', because it's such a headline.
Whatever I do: sex worker smashes plate, sex worker smashes plate.
It becomes so sensationalized because we're searching for clicks.
So as soon as a sex worker does something, it becomes this sensational thing:
I should click on that and find out what's going on, because that's saucy.
Absolutely.
Yeah.
So I often talk on this podcast, slash, every day of my life, about how in cases of fatal domestic abuse and violence against women and girls, they're reported on in the media as isolated incidents, and we'll hear a lot about certain specific cases, but we'll rarely hear about the wider context.
In the case of fatal domestic abuse, the wider context is that three
women a week in the UK are killed by a man.
In the case of image-based sexual abuse,
the wider context is that nearly a fifth of women in the UK,
17%, have experienced image-based sexual abuse.
That stat is from the End Violence Against Women Coalition.
Madelaine, do you think the media does enough
to explain the culture of misogyny behind image-based sexual abuse
and to point to that wider context?
No, they don't because it's not sexy, is it?
It's not clickable and it's not going to get you the ad revenue.
So no, they don't.
You know, what would be the benefits?
It's very difficult to see really hard-hitting conversation happening
without there being some kind of showbiz angle or, you know, a documentary that pulls a couple of strings
and it kind of ruffles feathers because that's what gets the viewers.
Yeah.
There is a place for a showbiz angle, but a showbiz angle
can't be the only narrative. There are so many facets of how this behaviour exists, how this
abusive behaviour plays out, how it becomes a part of a pattern of behaviour and attitudes towards
women and girls. And those nuances are crucial to being able to tackle this problem. And I think
that that's part of the wider conversation that we need to be having, in which the press play a
fundamental role. And I also really get frustrated with some of the journalists I've worked with
over the last several years
of the way that they treat survivors.
You know, this is work, what they're doing.
You know, you're not entitled to their story, unpaid,
that you can manipulate and say whatever you want
and then not do it justice
because you're only going to give a one-dimensional view.
That's really deeply problematic
because you're essentially replicating their harm.
It's just another form of entitlement
to exploit somebody's story
in order to put out whatever it is you want to put out.
It's a huge sort of sticky issue in journalism. A lot of journalists, documentary makers would say,
oh, you can't pay someone for their story. It incentivizes exaggeration. But when you are
consulting experts by experience about these topics, it's a very different case. I mean,
at Media Storm, it is really important to us and has been from the beginning that we pay our guests
to share their lived experience and expertise. And in some ways, that makes us actually really
quite unusual in the media. And the people who do get paid for their stories, by the way, are sort of,
you know, the friends of celebrities who will spill the beans, and actually, you know, often do just
exaggerate and make up stories for those cash cheques. But it is not the people who are exposing themselves
and sharing intimate stories in order to push much-needed policy change. So to provide that
context that is missing from the media: one in 14 adults, which is equivalent to 4.4 million
people in England and Wales, have experienced threats to share their intimate images without their
consent. That stat is from Refuge. Given that stat, our next question would be, you know,
does this topic get enough coverage in the media? We have an example we want to talk about. There was
recently a very, very important story on this topic that was broken by the Observer. They analyzed
court records and put in freedom of information requests. And they found, this shocked me,
that perpetrators of image-based sexual abuse are being allowed to keep the explicit images of their victims on their devices after the investigation.
And this is because of a failure by prosecutors to obtain orders requiring their deletion.
This story broke in late February and of 98 cases concluded in the magistrates court in England and Wales in the previous six months, that was 98 cases, just three resulted in a deletion order.
This is a huge story and a shout out to The Observer and the Guardian who really are the outlets that seem to report on image-based sexual abuse and online abuse far more than any other news outlet.
This story is an example of what the media is for, right?
The Observer using their truth-seeking power to call out these loopholes that affect marginalised people.
And as I said, by all accounts, this is a huge story.
But it didn't make a splash really elsewhere in the media.
One month later, on the 23rd of March, the Observer provided an update to the story,
stating that the Crown Prosecution Service are now going to update their guidance on image-based abuse crimes
to stop perpetrators being allowed to keep these explicit photos of their victims.
Now, this update was picked up in one other place, the Telegraph,
who reported on the story by praising the new CPS guidelines
and missing out the context provided by the Observer,
in the first place, that only three of 98 cases in the last six months resulted in one of these
deletion orders. In fact, there's no mention in the whole Telegraph article as to why the CPS is
suddenly updating their guidelines. And instead, the article is mostly a direct quote from the
CPS, patting itself on the back for this update. And this is just one example of the lack of
traction an important story about image-based sexual abuse can get. And it's also an
example of the erasure of victims. Elena, why do you think this happens in our media?
Well, first thing to say is huge shout out to Shanti Das for doing both those stories.
That is an example of comprehensive, nuanced journalism. Why do I think it happens?
It's the lack of connection of understanding how these moving parts fit together. For example,
what I would have loved to have seen in the Telegraph article is the criticism that deprivation
orders for devices should, from what we can tell of the guidance, only apply to one device.
But we know that perpetrators tend to have multiple phones, multiple devices.
Wait, what? Surely it's not stored on the device?
We all have multiple devices.
The cloud, yeah.
I know. It's frustrated me beyond belief.
The other thing that really irritates me: if you even get to the point where the CPS is going
to prosecute, which let me tell you is an absolute nightmare, there's the other issue that,
in the course of the investigation, police officers routinely do not confiscate all devices.
Right.
And then, you know, even if you go through the investigation and you get to the court, the charge rate for trials is incredibly, incredibly low.
Data obtained by Refuge shows only about 4% of intimate image abuse cases reported to the police currently result in the perpetrator being charged.
Refuge also noted that we urgently need trauma-informed training
for all sections of the justice system
from the police to the CPS to the judiciary.
Did you report what happened to you?
No.
Why not?
I felt that I don't know who has done this,
so how can I chase?
I also felt like the shame that I was feeling
would just be reflected back at me.
And I'm still struggling with it.
I still feel icky.
and I really don't want to
but it's so deeply embedded in me
to feel shame about that
but I'm pretty sure if I had gone to the police,
and I have heard from many people who have,
they're given:
well, you probably shouldn't have done that in the first place,
there's not really much we can do,
have you tried doing this,
you know, have you tried getting it taken down for copyright?
Which is not successful.
So unfortunately, the frameworks don't exist
to support people who are going through this.
And it doesn't mean to say that there aren't
phenomenal police officers out there, I've worked with many, but an overwhelming proportion of
the people we've worked with have had negative experiences with the police. And I can't tell
you how many times we've seen cases where police officers have one, you know, run out of time
for pursuing a particular case. Two, don't understand the nature of the offence. And so the
survivor is doing all of the legwork. Three, don't investigate properly, aren't able to get particular
sites up on their computers, so the survivor is having to go through it themselves. It becomes a
full-time job. And, you know, I've also sat in court cases and listened to, or read through, transcripts
of court cases where I'm listening to the questions of defence barristers and I'm thinking, how on earth
are you still practising? And, you know, there are times where I have to really bite my tongue
and be like, just because you are an expert in criminal law, you have completely misunderstood the
nature of the offence and what the evidential threshold is and what the components of the
offence are. Yeah. And our media really are not helpful in this case because one of the things
that I find very confusing when reading about this topic in the media is the legality and the
laws. I'm never really entirely sure in what ways victims are protected or not protected
when I read articles about image-based sexual abuse. You know, if I were to go by what
I have been reading recently online on the topic.
I've read that the Online Safety Act is prioritising the seriousness of image-based abuse.
Elena, maybe you can tell us: what is the Online Safety Act, and is that reading accurate?
So the Online Safety Act is a really ambitious piece of legislation.
It's huge.
And there were some things added onto it, such as the amendments to image-based abuse laws,
but primarily it gives Ofcom the power to enforce duties against companies
for hosting this kind of material, and it is allegedly holding them to account.
The way that Ofcom is reading that duty is incredibly narrow
when it comes to IBSA and ending violence against all women and girls,
regardless of their profession, online.
This is where government comms plays a huge role
in confusing and muddying the waters of what's
actually happening. When the Labour government came in, they announced they were basically
putting in place a statutory instrument, which is a secondary piece of legislation, to include
IBSA as a priority offence, which basically means that it's a really important offence. It's on
the level of some of our worst crimes, basically. However, the piece of legislation had already
done that. Government comms decided to make this like, look at what we're doing, convincing the public
that you're apparently tackling violence against women and girls,
but that's not what you've done from the start.
And so this is one of the things that really worries me
about the messaging that people are getting,
the general public, about what their protections are,
because half the time, I don't even know what's going on,
but I'm thinking, one, where's the teeth?
And two, other than a press release,
where have you done any of the substantive work?
Now, something else the Online Safety Act does is make the sharing
of AI-generated intimate images without consent illegal.
This is bringing us on to the topic of technology.
As technology advances and rapidly evolves,
so does the nature of image-based sexual abuse, or IBSA,
leading to a rise in AI-generated intimate images,
widely referred to as 'deepfakes'.
Now, according to online security experts at Security Hero,
between 2022 and 2023,
deepfake sexual content increased by over 400%.
Madelaine, you work in this area of tech-based protections.
Are the measures in place that we're seeing going far enough?
Can you also tell us about your solution at Image Angel: what it is,
and how the technology you propose could help victims?
It's not actually a tech issue.
It's a society issue that we can change through tech.
That's what I'm trying to explain to people as I knock on doors:
this is how we can make this solution.
But what we have right now is the Online Safety Act
asking the platforms that host content
to make sure that they're protecting their users from harm.
So they're making sure that people who are accessing these sites
where harmful content may be aren't children.
Is that enough?
I don't know.
The tech I have made is primarily a deterrent API.
So it is software that a platform can install,
and it gives a nudge to your average user to say: this piece of content is protected.
It is assigned to you.
You are the guardian angel of this image.
You don't own it.
But it is yours to look after.
And should you share this somewhere, we'll be able to find out who you are.
If you don't share it, nothing happens.
No problem.
You keep it on your phone, fine.
If you keep it within your private space, that's fine.
But this moment of trust that you have with somebody on a dating app, for example: you send a
picture, a saucy nude, they send one back, and both of you are responsible for and entrusted with that
private moment. If you share that, you are committing a criminal act, and Image Angel will be able
to find out your name and address and can knock on your door. Not personally, but we can give it to the
authorities, who can knock on your door. It would be forensically possible to prove that that
person shared it non-consensually; knowing that, they wouldn't have done it, or they would
have been less likely to do so. And that real-world impact is going to start a step change
in society. It's going to start people being accountable for their actions online.
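The "assigned to you" mechanism described here resembles forensic watermarking: each recipient gets their own invisibly marked copy, so a leaked copy can be traced back to the account it was sent to. Here is a minimal sketch of that general technique (least-significant-bit embedding); the function names and toy image are invented for illustration, and this is not a description of Image Angel's actual implementation.

```python
# Hypothetical sketch of forensic image assignment: embed a recipient's
# account ID invisibly in the pixels of their copy of an image, so a
# leaked copy can be traced back to them. Illustrative only; this is
# NOT Image Angel's actual method.

def embed_recipient_id(pixels: list[int], recipient_id: int, id_bits: int = 32) -> list[int]:
    """Hide recipient_id in the least significant bit of the first id_bits pixels."""
    if len(pixels) < id_bits:
        raise ValueError("image too small to hold the ID")
    marked = pixels.copy()
    for i in range(id_bits):
        bit = (recipient_id >> i) & 1
        marked[i] = (marked[i] & ~1) | bit  # overwrite only the LSB: an invisible change
    return marked

def extract_recipient_id(pixels: list[int], id_bits: int = 32) -> int:
    """Recover the hidden ID from a (possibly leaked) copy of the image."""
    recipient_id = 0
    for i in range(id_bits):
        recipient_id |= (pixels[i] & 1) << i
    return recipient_id

# Each user who receives the image gets their own marked copy.
image = [128, 64, 200, 17] * 16  # toy 64-pixel grayscale "image"
marked = embed_recipient_id(image, recipient_id=123456)
assert extract_recipient_id(marked) == 123456
# Pixel values change by at most 1, so the mark is imperceptible.
assert all(abs(a - b) <= 1 for a, b in zip(image, marked))
```

A production system would use far more robust embedding (spread across the image, resistant to cropping and re-compression), but the deterrent logic is the same: the copy itself identifies who shared it.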
So you really want the platforms, such as dating apps and social media platforms, to sign up to
Image Angel? Absolutely. And I don't see why they shouldn't. This is
protecting people on their platforms, and that is their duty of care under the Online Safety Act.
This type of preventative technology, preventing harm from happening, not mopping it up once
it's happened, not handing out tissues and going, we should probably do more.
This is stopping it before it even happens.
It's stopping that user in their tracks and saying: you probably don't want to do that, because
we're going to find you.
Finally, to wrap up, let's look forward.
How do our laws surrounding image-based sexual abuse
compare to other countries,
and are there any immediate steps
that we can take from other countries, perhaps, Elena?
So the law that we're fighting for at the moment
for a comprehensive image-based abuse law
would plug all of the gaps that we have in our current system.
So there are five main asks
which are things like improving the criminal laws,
classifying this kind of content as illegal to kind of put it on par with child sexual exploitation
material. But we also need to improve civil laws. You know, we need to be able to go into an online
portal, submit a takedown notice, and within a few hours have that image taken down because
it's legally mandated. It needs to be accessible. Then thirdly, we need to improve sex education.
We need to be having conversations not just at schools, but with every key stakeholder. The
NHS needs to understand that this kind of abuse is abuse and therefore warrants psychological
support. We also need to have a sort of commissioner type role like we see in other countries.
And finally, we need support for specialist services. A recent report by the Office for
National Statistics showed that actually the government had considerably underspent its VAWG
(violence against women and girls) budget, yet they're telling us that they are trying to tackle VAWG.
How is that possible?
All of these harms and communities and behaviours are operating at such a fast rate.
We are already 10 years behind.
So we need to start getting ahead of it and stop, you know,
thinking about the politics of it and legacy building.
We need to see the teeth of these policies.
I want to see action because it's only going faster.
And the more content is out there, the more we feed
into these massive monoliths of data and content, and we are contributing to a society
that just sees it as throwaway, as offhand.
When, really, what you're looking at is private, is protected.
It's a special moment.
So speaking of action, let's give our listeners calls to action.
Elena, let's start with you.
Can you tell listeners where they can follow you and your work and if you have anything to
plug?
Yeah, absolutely.
So you can follow us
on Instagram at the Not Your Porn handle,
where the O in 'Porn' has an X, you know, for sensitivity filters.
And also on X, formerly Twitter, where Not Your Porn is spelt normally.
The thing that you can do to support us at the moment:
we currently have a petition, headed by a survivor campaigner,
Jody Campaigns, for an image-based abuse law, which has over 70,000 signatures.
If we can get it to 100,000, we can pressure the government even further
to take this seriously.
So please, I urge you all to sign it,
and obviously I'll give you the link.
And Madelaine, same question.
Where can people follow you?
And do you have anything to plug?
My plug would be: demand more.
Demand protection and rights.
And where can people find me?
They can find me at imageangel.com.
We also have a tool where you can type in the platform that you are working on,
playing on, dating on, and you can send them a direct mail that says:
hey you, install this for my protection.
So I would request that if you care, if you want to be safe, make some noise.
Thank you for listening.
If you want to support Media Storm, you can do so on Patreon for less than a cup of coffee a month.
The link is in the show notes and a special shout-outs to everyone in our Patreon community already.
We appreciate you so much.
And if you enjoyed this episode, please
send it to someone. Word of mouth is still the best way to grow a podcast, so please do tell
your friends. You can follow us on social media at @mathildamall and @helenawadia, and follow the show
at @MediaStormPod. Media Storm is an award-winning podcast produced by Helena Wadia and
Mathilda Mallinson. The music is by Samfire.
