Should I Delete That? - Cyber brothels, AI girlfriends and misogynistic algorithms: Laura Bates on the new age of sexism
Episode Date: May 11, 2025. The Netflix show Adolescence opened up the conversation about male violence and the radicalisation of young boys online into the mainstream. But how much do we really know about what young boys are viewing online and the effect it has on them? This week, we're joined by someone whose work has shaped the conversation around gender, power, and the internet for over a decade - writer, activist, and founder of the Everyday Sexism Project, Laura Bates. We were so thrilled to welcome Laura back on the podcast to talk about her brand new book, The New Age of Sexism - an examination of how misogyny is evolving in the digital age. We spoke to her in light of the rise in incel culture, the mainstreaming of online radicalisation, and the links between toxic masculinity and real-world male violence. This conversation touches on some upsetting themes - specifically violence against women and sexual abuse. But Laura gives us practical advice on a personal and policy level to help us eradicate this new wave of gendered hate. We learned so much from this conversation and we hope you will too. Follow @laura_bates__ on Instagram. The New Age of Sexism: How the AI Revolution is Reinventing Misogyny is out on 15th May 2025. You can get your copy here! If you'd like to get in touch, you can email us on shouldideletethatpod@gmail.com. Follow us on Instagram: @shouldideletethat, @em_clarkson, @alexlight_ldn. Should I Delete That is produced by Faye Lawrence. Studio Manager: Dex Roy. Video Editor: Celia Gomez. Social Media Manager: Emma-Kirsty Fraser. Music: Alex Andrew. Hosted on Acast. See acast.com/privacy for more information.
Transcript
We're seeing for the first time in history, basically algorithmically facilitated mass radicalisation.
Hello and welcome back to Should I Delete That.
This week, we're joined by someone whose work has shaped the conversation around gender, power and the internet for over a decade.
We have writer, activist and founder of the Everyday Sexism Project, Laura Bates.
We were thrilled to welcome Laura back onto the podcast to talk about her brand new book, The New Age of Sexism.
which is an examination of how misogyny is evolving in the digital age.
We spoke to her in light of the rise of incel culture,
the mainstreaming of online radicalisation,
and the links between toxic masculinity and real-world male violence.
It's a conversation that's become more mainstream in recent months,
partly in thanks to the Netflix show Adolescence.
A heads up that this conversation touches on some upsetting themes,
specifically violence against women and sexual abuse.
It was a shocking, fascinating and really educational conversation.
We learned so much from it and we really hope that you will too.
Here's Laura.
Hi, Laura.
Hi.
Welcome back.
Thank you.
Thank you so much for having me again.
Oh my God, are you kidding?
I said, we'd have you on every week if we could.
Unfortunately, there's so much to talk about.
And we talked before, we did an episode just Al and I about adolescence after it came out,
which I'm sure you're sick of talking about.
But we said we wanted to talk more about, like, the rise of incel culture, about male violence, particularly in schools.
And this is your like, bread and butter.
This is unfortunately, like, the thing that you know the most.
You have a new book coming out all about this.
I'm so excited to read.
But it would be amazing if we could start, I guess, on the back of the Adolescence thing,
to talk about the kind of conversation that you're seeing, the landscape that's different even now from when we last spoke to you, with the rise of incel culture and online,
and all of that.
Sure, yeah.
I mean, it's different and it's the same.
Like, horrifyingly, here we are talking about it in the days just after we've seen another
crossbow attack, two young women seriously injured, allegations that the perpetrator
has left behind a Facebook manifesto saying that misogynistic revenge was his motive,
his reasoning.
Obviously, we don't have that confirmed yet.
It's really important to say that; the police investigation has to happen.
But what we can say is that in less
than a year, that's now five women, three of them, very tragically the Hunt family, dead, and
now two seriously injured, and that this is an epidemic. And it's one woman every three days
who's murdered in the UK. But what's really heartbreaking is that in all of the news stories
about the incident, it prominently says, this is an isolated incident. This is very, very rare.
This is an isolated incident. And I think women everywhere are just screaming at their screens at
this point. This isn't an isolated incident. It's the opposite. It's a pattern. It's a pattern.
And when are we going to join those dots?
And when are we going to look at the fact that in the other crossbow murder case,
for example, the perpetrator had been listening back to back to Andrew Tate's podcast
and content in the days leading up to the murders.
We have to start joining those dots and thinking about the impact this landscape is having.
Do you know, do you have a theory as to why we're not?
I mean, not we, why they aren't?
Because you're right, to women
it's like this is the most blindingly obvious, frustratingly terrifying thing in the world.
Why is there just this refusal to do that?
I think it's a conflation of two different blind spots, which together are really powerful.
The first is a racist blind spot, which is that our society really struggles to conceive of white men as terrorists.
And it really struggles to conceive of white boys being radicalized into extremism and hatred.
And the other one is a misogynistic blind spot, which is that as a society, we are just so used to misogyny.
We are so inured to violence against women
that we don't even think of it as abnormal.
And so to compute the idea that that is extreme
is something we really struggle with.
Adolescence, obviously, like, it was huge.
That show was huge, right?
Yeah.
And do you think the impact of adolescence
and just its existence,
do you think it's been net positive?
Yes, definitely.
You do?
Yeah, I mean, I think it's a brilliant piece of filmmaking.
Okay.
I think it's an incredible show.
It's kick-started a conversation in a way that simply wasn't happening before, and that's
brilliant.
Everyone's talking about this.
Parents are becoming aware of it.
Schools are thinking about what they can do.
And in so many ways, it was a nuanced, clear-sighted take on masculinity and young boys and grooming
and how they can be sucked in and fatherhood and parents being clueless about what's happening
in kids' bedrooms.
Like, I would say, brilliant in so many ways, it's brilliant and it's done a brilliant thing.
and then there's the other things which are I think partly for women who have been screaming
about this for so long for it to be a couple of guys who come along and make a TV show
about a boy and now everyone wants to talk about it and particularly a TV show that
in which the victim was almost completely invisible, you know, that, like, took such care
over what it must have felt like for him to have the blood test that he has, and like
how he might have been scared of needles, but didn't give us
that same gut punch thinking about her being stabbed multiple times. There are elements of it
where I was kind of screaming at the screen. Like, why didn't you speak to someone who'd researched this?
You know, when you've got a kid saying it's a call out from the manosphere, for example,
in one of the scenes, like we would never see that in schools. Kids are not talking about
the manosphere. And in the vast majority of cases, they're not talking about incels or
incel terminology or going into any of that in-depth incel ideology that they were kind of parroting
in the show; it's much more subtle than that. In general, the vast majority of them are not
members of incel communities or even using that terminology. They're coming across this extreme
misogyny further downstream, whether it's on TikTok, on other social media. The thing that was
the most frustrating for me, though, was the kind of way in which the storyline had the girl
sort of calling this boy an incel, bullying him, and this idea that that kind of misandrist
bullying is provoking boys and that they're pushed over the edge, because that is essentially
incel rhetoric, and that's not what we're seeing in schools. What we are seeing is an epidemic
of violence against girls and abuse and harassment of girls in schools. We're not seeing this
idea that boys are only doing it because girls are bullying them. That doesn't kind of
stack up to the reality. Because that is the incel ideology, right, is that it's all
on women, and like the 80-20 thing, right? Yeah, like 80% of women fancy 20% of men, and it's
such a focus on women. But I think a lot of people listening might be a little bit confused
about the difference between the manosphere and incels, because incels are quite rare, right?
Yeah, I mean, relatively speaking, it's a minority, of course, but we are talking about
forums and communities and groups that have members in the millions, certainly.
Has that grown?
I don't think it was that many.
It's increasing quite quickly, yeah.
And the important thing I think to be aware of is the number of kind of lurkers.
So the number of people who are on there consuming the content, reading the forums,
who don't necessarily have an account.
But even then, I would say in those spaces, which I spent two years in,
you are very much, in the majority, talking about adult men.
They're not filled with schoolboys.
And it's very much a depressing, sad place to be.
It's communities of men who believe that the world is hopelessly stacked against them.
There's a huge amount of suicide content and actually really inciting each other
to suicide. This is not a kind caring community where men are supporting each other and how
dare we criticize it. It's toxic. It's men who hate themselves and they hate other men and
most of all they hate women and they are encouraging others to go offline and to massacre women and to
hurt women. And then the manosphere? Right. So then the manosphere is the group, the name that we
have for this ecosystem of groups of which in cells are just one. So these are kind of websites or
communities or groups dedicated to different ideologies, but all kind of bounded by this idea
of men as superior, of women as sexualized objects. So other groups within that that aren't
in cells would be so-called men going their own way. So these are men who think women are so
toxic and dangerous. You have to cut them out of your life altogether. Like don't date them,
don't have anything to do with them, don't have a meeting with them. Then you've got groups
like pickup artists, which is a multi-million dollar global industry, essentially training men
and boys in these boot camps where they give them what they claim are kind of tips guaranteed
to get any woman into bed with you. But when you drill down into it, it's basically sexual
harassment and abuse and rape that they're training them into doing. And then you've also got
other groups like so-called men's rights activists who claim to really care about issues affecting
men and boys. So they get a lot of airtime because they've got this kind of veneer of political
respectability. Like, well, that sounds like an important issue. The reality is that they spend
no time talking about male mental health, which is the leading cause of death for men under
the age of 50, any of that stuff. They just want to tear down feminism. They want to talk about
how there shouldn't ever be, you know, spaces that women exclude men from. They want to talk
about how, you know, rape crisis centres should be illegal. They're just, really, they should
call themselves women's wrongs activists because that's what they're focused on.
So I don't mean to laugh at that, but you know, that's so spot on. So where does Andrew Tate fall?
Andrew Tate, I would say, is a kind of interlocutor. So you have these spaces on the internet, which, although they are really significantly sized, not many kids and teenagers are kind of fully wholeheartedly finding themselves in, committing to. But then people like Andrew Tate or Jordan Peterson or even to a degree, Donald Trump and others like them play this kind of role of intermediary where they will take some of the ideas from those spaces. And then in a kind of slightly sanitised way, they'll
repackage them for slightly more acceptable public consumption, but it will be recognisable
to people in those spaces that it's a kind of dog whistle to them. And it makes it all sound
just that little bit more relatable and understandable and reasonable for public consumption.
So you can have it then shared on BBC News, or it's the kind of thing that you'd see in like a
viral YouTube video or a TikTok video. And it's essentially when, you know, when Donald Trump
starts saying, well, it's a very scary time to be a young man in America. He's not saying
false rape allegations are rife, like you would see them
saying on an incel forum, but at the same time, that's kind of what he's implying.
And it's similar with Andrew Tate's content, you know, women should bear responsibility for
rape, women are the property of the men that own them, women shouldn't be allowed to drive.
It all goes back to those ideas about this regressive sense that we should take women's rights
away and then things would be better.
And that's got to be the most dangerous, right, of all the kind of content, because that's,
it's almost like a gateway into more radical ideas.
Exactly.
Because you can think, oh, I kind of see what they're saying.
Like it's not explicit.
There's nothing explicit.
Totally.
Especially when it's sprinkled in with like,
you should clean your bed and go for a run
and content that sort of seems to make common sense
and be about saying you should look after yourself or whatever.
But it's this slippery slope where it doesn't start
in those extreme corners of the internet,
but it gradually, gradually ends up there.
And it's also the other thing that makes it so dangerous
is the reach.
So it's really important to say that these ideas aren't new.
They've always existed in history.
There's always been this backlash.
But what's different now is we're seeing for the first time in history,
basically algorithmically facilitated mass radicalization.
So you always would have been able to search for a voice like Andrew Tates.
He's not saying anything new.
But what is different now is that if you set up a TikTok account in the name of a teenage boy,
it takes on average 17 minutes before the first piece of extreme misogyny will be promoted to your feed.
No way.
You don't go looking for it.
It's coming to you.
And the scale of the power of that, I think people don't realize,
like the numbers we're talking. So just on TikTok, and just Andrew Tate, who's just one tiny piece of
the puzzle, his content on TikTok has been viewed over 11.4 billion times, so more than the number
of people on the planet. So we're up against this kind of David and Goliath thing. So it takes
17 minutes for a young teenage boy to be introduced to some kind of misogynistic content? That's
right. And whose response... surely that's the responsibility of the social media platform, right?
It's the algorithm, yeah.
It's the algorithm.
It's TikTok.
And that's made by a person.
Exactly.
And a company that could be held accountable, exactly.
And this is where we need to go.
This is the right thinking.
But a company who's, like, at the coronation of the US president.
Exactly.
And that's the problem.
Yeah, you guys have literally just summarized the whole thing in 10 seconds.
That's it.
The way that we humanize the men at the center of it is so frustrating because we do,
you know, Mark Zuckerberg is like quirky.
I mean, we look at the clips from Elon Musk, of all of these men,
like we humanise them, we have silly videos of them, same with Trump and his hair and his hands and
his tan and whatever else it is, and we kind of make them silly. But I say no, no, this is like,
this is really fucked, it's sinister. Yeah, it's not silly. Yeah, what they're doing is devastating, and
they're becoming multi-billionaires off the back of it, which is just so heartbreaking. And then
you turn around and people are going, what do your parents do? Like, should parents be taking away phones?
Should parents be limiting this? Like, what can parents do? And like, where are the good male role models?
And you just want to say, like, parents are doing so great, parents are trying their best.
There are great male role models out there.
Like, none of that is the problem.
The problem is the tech and the algorithms.
But there's this bizarre impunity where we do not regulate tech in the same way as anything else.
While I was writing this book, you know, the big, the co-op venue, co-op arena in Manchester, was built.
And it was in the news, loads, because they kept delaying the opening because it hadn't quite yet passed the safety checklists.
So, like, Take That had to cancel a show, I think Peter Kay too.
There were lots of big names.
They had to cancel their shows.
And the venue just kept putting out statements saying,
really sorry, we can't let anyone through the doors
until this is completely safe.
But when Meta started its metaverse and its headsets,
it took three years after they released the headsets for sale to the public
before they even put any parental guidance or controls in place.
Every seven minutes on average in the Metaverse,
someone's exposed to abusive behavior.
But anyone can just step in there right now.
There's nothing to say, well, if the public is going to access this space,
you need to meet these regulations for keeping it safe.
How far away are we with the online safety bill from making any kind of dent in that?
I think all of it's going to come down to how the regulation is carried out by Ofcom.
And at the moment, if we look at the guidelines that they're suggesting there are a lot of
quite worrying loopholes.
So for example, they focus heavily on taking down content, content being taken down by platforms,
rather than measures to prevent it, to disincentivise it being created in the first place.
And anyone who's tried to have content taken down will know that that's a non-starter.
It doesn't apply to smaller websites, which is such a missed opportunity because when we talk
about deepfake abuse, particularly, what we're seeing is these small websites where the worst
harm is localized because it'll be a website for a particular town or a particular area where
people are custom ordering and making and sharing deep fake pornography abuse of individual women
they know. And that's where in some ways the most significant harm is happening because that's
somebody's locality, it's their name, it's their identity, but those sites are being let off the
hook. But the most worrying thing about the Online Safety Act, I'd say, is that at the moment,
the government keeps saying that they will consider having on the table watering down
the Online Safety Act if that's what it takes to secure a favorable trade deal with Trump,
with the US. And like you said, the only reason... those two things should never be on the same
negotiating table, they're totally separate things. Why are they? Because of the people who are standing
in the Oval Office. That is terrifying, absolutely terrifying. They're literally using women and girls
and the safety of marginalized groups as a bargaining chip for a better trade deal. We want to talk
about deepfakes, but can I go backwards before we do? Yeah, yeah. And just to touch on the 17
minute thing, with young boys, that feels like, I mean, I don't know how you combat that, you know.
Like you say, parents are doing the most, but you are in schools, like you are doing a lot of, seeing
a lot of research, being part of this conversation. What is this, how is this manifesting in young boys?
You know, we talk about, like, violence against girls in schools. Like, what does that look like?
So it's a complicated picture. There's been a bit of a panic after Adolescence, understandably,
where everyone's going, oh my god, the boys are all radicalised, it's a disaster. It's actually a really
nuanced picture. There are a lot of teenage boys who want nothing to do with this stuff. Boys are
actually kind of moving away from Andrew Tate to some degree and actually moving towards deepfakes,
but schools are always about five years behind
through no fault of their own.
So schools are talking about Tate now
but aren't aware of how many boys
are making deep fake abusive images
of their teachers and peers.
But we can talk about that in a minute.
But it is about giving them the tools
to, first of all, understand internet literacy
and source skepticism
and that not everything they see on the internet
is necessarily real, and that has to start at primary.
But it's also a complex picture
because you are seeing some boys
who have been radicalized,
some boys who are pushing back against it
and need support and help.
And then there's kind of critical mass of boys in between
who are really being pushed in that direction by the algorithms,
but just need concrete tools and knowledge
and understanding to feel confident in pushing back against it.
And we can give that to them with really good conversations,
high quality, really comprehensive sex and relationships education
from a really young age.
It could make a huge difference here,
but we're still seeing that coming under attack from MPs saying,
actually, you know, this is outrageous and it's sexualising our children and it's got
to stop. But kids are 13 when they first see online porn and one eighth of the porn videos
they see on the most accessible mainstream sites at the moment show women being raped or coerced
or other illegal acts. So if we say, well, we're not going to talk about this in case we
sexualise the children, then we're just failing them, completely failing them. Conversation,
communication is absolutely the answer. Yeah. You mentioned before about in these communities
that men do hate other men within them. Like, they hate women first and foremost,
almost, but they do hate other men. And I can see that in the conversation that we're exposed to. You can
see, like, a lot of men don't like the really good-looking men, or they don't like pretty men,
or they don't like really strong men. Or I don't really understand what makes a good man a man
according to them. Is that obvious in schools? Like when you say we've got like some boys on one side,
some boys fighting it, is there kind of peer pressure among the real life boys, or is this all coming
from online in these forums? Yeah, there is. It's different. So in the forums, what you're seeing
is a really nasty, toxic kind of beating down on other men trying to make them feel worse about
themselves, suggesting self-harm to other men. What we're seeing in schools, I would say,
is a huge amount of pressure for boys to live up to a certain performance of masculinity. And it is
the Andrew Tate idea: you have to be big, you have to be strong. You've got to have fast cars.
You've got to be rich. You've got to have women and you've got to be in control of those women.
And all of those ideas, you can see boys kind of self-policing amongst themselves in schools.
But again, like, there are absolutely exceptions to this. There are boys.
who are standing up against it.
There are boys who are starting feminist societies.
There are boys who are fighting back, doing amazing things.
Absolutely, yeah.
But yeah, generally speaking, that pressure remains and it's absolutely suffocating.
That's the most heartbreaking thing about all this, that this guy who says, like, I'm the
champion of boys.
I'm the only one that's going to stand up for them.
He is the one reinforcing the things that we know are leading to that tragically high male
suicide rate.
We've got all the evidence, we know it's happening, because early childhood stereotypes about
tough men stop men from getting help when they need it with their mental health.
Andrew Tate says depression doesn't exist.
How can he be the champion of men?
It's heartbreaking for them because you just want to support them.
Those boys, there's nothing bad or wrong about them inherently.
It's just a system where we've thrown them to the wolves with these algorithms.
And then everyone seems to be surprised at what's happened.
It's just the algorithm thing freaks me out so much.
And I'm going to get on to deep fakes.
which is part of your new book.
We're going to talk about it because I'm very excited.
But I wrote a piece for Grazia about deepfakes
because a deep fake was made of me selling a product.
And I mean, I feel stupid now for having thought,
you know, give social media platforms the benefit of the doubt.
Is it just that technology is like outpacing humanity?
Like it's just, we just can't catch up.
Like we just don't have the regulation.
We don't have the means for regulation.
yet. But it's not even that, is it? It's that they don't care enough. Or maybe not
even that they don't care, but like this is actively part of their mission. Yeah. I mean,
I think probably a bit of both, but certainly the former we can say with confidence. In one year,
Facebook's profits were $29 billion. Like, don't tell me that's a company that doesn't have the
money to be responding to reports. You reported it over and over again, that deep fake video of you
and nothing happened. If you look at that study of the Metaverse I mentioned, which found that
every seven minutes a user was exposed to abusive behaviour, they found 100 violations of Meta's own
policies. But when they looked at the reporting options that were available, only 50 of them
fit into categories that could be recorded and reported. So they reported them, this was the Centre
for Countering Digital Hate that did this research. And of those 50 reports, not only did
none of them result in action, but none of them even got a response.
And you just think, if you're a multinational food conglomerate, right, and you've got this
many people consuming your products every day, and you said, but we're so big, right?
Like, millions of people are using us every day.
Like, a couple of people might die in our factories.
Like, we can't keep everyone safe, a couple of people are going to get sick or die after
eating our food.
The answer would be no, like, okay, you can't have a company on that basis.
Because we just have this, like, acknowledgement of the fact that there has to be some form
of regulation and accountability.
That reminds me, I had actually forgotten this,
but do you remember last year I had to ring the police
because that guy kept sending me such intense abuse, whatever.
But I, and it was like the most depressing conversation
I've ever had in my life.
The police tried to take it really seriously, you know,
the individual officers that I spoke to,
they're like, we get it. Because I was pregnant,
I had a baby, I was at home by myself,
I was getting these... I had my daughter, Alex, my husband was away.
I was getting these insane messages
and like an extreme volume from this one person
and the police said, we're really sorry,
but like if Instagram meta
won't give us access
there's nothing we can do
and they won't so there's nothing we can do
and it's just like, okay, then I'll just sit and wait,
to be like, I'll just see what happens.
Yeah, and obviously nothing happened,
and that's great, but like,
what a fucking risk. Exactly.
And you just have to accept it because there's nowhere...
if the police can't help, what do you do?
Yeah, yeah.
Where do you go? And that does have a really
big result. When they looked at intimate image abuse survivors who came forward to the police in a
study recently, less than a quarter of them said that it was a positive experience. But these
numbers are ramping up. The police record 10 intimate image abuses a day on average. And we know that the
vast majority don't report. By next year, we're on track for 8 million deepfakes to
be created and shared online each year. And we know that 99% of those deepfakes are sexual abuse
images, of which 96% are of women. So the vast majority of this problem is women, women being
abused. And we're also driving traffic to them. So the number of links to those deep fake
websites from social media platforms has increased 2,000% in the last year. So this is a huge
problem. It's a growing problem. It affects so many people. One in seven adults globally have
had someone threatened to share an intimate image of them. So it's a really, really big issue.
but when we talk about the issue of deep fakes, because this is something people are talking about,
people talk about the threat to democracy.
And they talk about political risk.
There was actually an Interpol report about this, about deep fakes and policing them.
And I thought, great, they're taking it seriously.
It was 22 pages long, and the word women was mentioned once.
There were two paragraphs about sexual images.
But they're 99% of the problem.
So, again, it's considered, just like you said, you know, oh, well, what are you going to do about it?
It's just women.
Two paragraphs.
And it's the most violating...
It is, and that is so hard to explain.
Yeah.
Someone made a deep fake of me,
which was a video of my body,
like close up on my face with my mouth open
and then like orally raping me, basically.
And it's so...
It's so hard to explain to somebody
what it does to you to watch that.
I'm so sorry.
That's horrendous.
Thanks.
But people don't get it.
They're like...
They're like, well, it's not real.
So it's meaningless.
And we've always seen this with harms that affect women and girls in all different ways.
They're dismissed.
They're belittled.
We're not believed.
We're told we should be lucky
it's nothing worse.
And we're seeing that pattern repeating again now with these emerging forms of tech and the abuse that proliferates from them.
We've also talked a lot about this, about how, like, a woman's reputation is so fragile.
But I also hate that I'm literally speaking in incel terms, because I'm saying it's our currency,
but like a woman's reputation is so easy to destroy, and a young girl's is so easy to
destroy, and it's so devastating how sexuality is weaponized time and time again, right? And
that must be very informative... I mean, we're talking about these young boys, the effect
that this is having on young girls. Oh my god, yeah. Do you see it as, the threat of it
keeps girls smaller and keeps them hiding away,
or do you see the after-effect as the bigger issue?
Do you know what I mean?
Is it more like...
I know exactly what you mean.
I'd say absolutely both.
I mean, absolutely.
So it's weaponised in terms of the fear of it happening
because there's nothing girls can do to avoid it.
I mean, it explodes the myth that victims of so-called revenge pornography
were to blame because what did they expect when they took the picture?
Because now it's happening to people who didn't take pictures.
Like, it's nonsense that it was ever the women's fault.
But yeah, that threat: I can do this to you, and there's nothing you can do about it.
I can easily find one of these apps as a young kid on the app store, on Google.
I can use the free trial.
I don't even need to pay for it.
And I can create a pornographic image of you that is so realistic that your parents won't know the difference.
And we are seeing... there was a town that I write about in the book, it's called Almendralejo.
It's in Spain.
It's like a small town.
And it was just before the summer holidays ended.
And all of these girls who were as young as 12, but mainly around,
14, their phones started pinging and these images of themselves naked were being sent to them.
And what happened was exactly what you described. Some of them wouldn't leave the house.
Some of them had boys send them the images and blackmail them using the images into like
further forms of abuse. Some of the girls who hadn't actually been involved were terrified
to go out, terrified to speak to anyone in case it happened to them, just like you said, all these
different harms. And the worst part of it is that the police started looking for an
organized crime gang. They started looking for adult predators. And when they followed the
evidence and the leads, of course, it came back to a group of 14-year-old boys who were the
girls' peers. And we're seeing this again and again. This is absolutely the next big
sexual violence epidemic to hit schools. It's just that we're not talking about it. There's
been some really big cases in the UK as well as in Australia, in the US. What we've seen in
Every one of the cases I studied for this book is that the schools call in professionals, they call in PR firms, they focus on reputational damage limitation, the girls get little or no support and the boys face no punishment. That's what we've seen every time so far. And how often is it happening? It's happening all the time. I would say it's now a bigger issue in schools than the issue of Andrew Tate and incel stuff that schools are actually focusing on. But no one knows. I was going to say as well, like 14, like teenagers are
like really technologically skilled, aren't they nowadays?
But actually, it sounds like you don't even need to be that technologically skilled in order
to do this, which is so terrifying.
Yeah, I did it myself for the book on myself to check how, like, how easy is it?
Yeah.
And it literally took me 10 minutes to download any one of a number of apps you can choose
from, plug in some old press photos of me from an event and like, there you are standing
on the red carpet naked, and it looks completely realistic if you didn't know it wasn't real.
And I didn't pay anything.
and I think that's really crucial.
It was like a free trial.
So it's actually not just about closing down these apps.
It's also about stopping the app store and Google
from making them so readily available.
Like, again, there are really clear ways we could literally shut this off,
but everyone's like, well, we can't possibly regulate tech,
so we'll have to do something else.
And it is because of freedom of speech, right?
That that's people's... is that the fall guy,
or is that genuinely the, like, well, we can't?
No, it's not just freedom of speech.
Freedom of speech applied earlier when we were thinking about social media companies
being forced to give details to the police.
It gets sticky there because in particular countries under authoritarian regimes, for example,
you wouldn't want social media companies to be handing over details of particular users to police.
So it doesn't mean it can't be done or it's not surmountable,
but that's where that issue comes in.
With this, you have something like the UK government criminalising the creation of deepfakes.
And you go, right, brilliant.
It's a clear law.
So now we should be able to get rid of these apps.
But it really then just comes back to tech power.
And what we were talking about earlier,
these companies are so incredibly big and so incredibly powerful
in terms of the money that's behind them
and in terms of their alignment with political interests,
that there is no appetite for tech regulation.
There was a big summit about AI in Paris just a few months ago.
And together, countries from all over the world
drafted this commitment where they said,
you know, we commit to the fact that AI should be ethical.
It should be accessible.
It should be used in an equitable way.
It should be safe.
There should be regulations.
It was pretty vague.
It wasn't even really, you know, tying them into anything.
But the US government refused to sign it.
And so did the UK government.
Really?
They wouldn't even sign up to it.
What were the grounds for not signing it?
They cited national security kind of governance protocols
that they didn't think were clear enough in the thing.
But other countries around the world were prepared to make this commitment.
And I think it was really telling that the vice president,
of the US was there, not just refusing to sign it, but saying that we should absolutely
prioritise development over safety. And there was the UK standing alongside him.
He's bloody everywhere, isn't he?
Yeah, he really gets around, that guy.
Put him on the bench, for fuck's sake.
So can we talk about your book, The New Age of Sexism, which really encompasses everything
that we've been talking about here, AI, deep fakes, how this, how our technology is kind of
opening up to a whole new world of misogyny and violence against women. I was going to ask why
you wrote the book, but I think I just answered my own question. When did you start writing
this? It must have been last year, right? Quite recently, yeah. Yeah, okay, okay. Because even I'm
surprised for a book to have come out already. Like, this feels like a really recent problem,
but I guess you were aware of it a lot earlier.
Yes, I think it's looking ahead.
The reason why I think it was really important to do this quickly and to do it now is because
we have this brief moment of opportunity.
If we act now, we could tackle this.
We could force regulations to come into effect.
We could tackle it at its inception.
So, for example, in the metaverse, at the moment, these are huge problems, but at the
moment we aren't all living in the metaverse.
We aren't all having our school lessons in the metaverse or our university lecture halls.
We're not having boardroom meetings in the metaverse
or going to concerts there.
So right now, it's not yet embedded in our day-to-day lives.
We're on the brink of AI changing everything in our lives,
the way that we live, the way that we learn,
the way that we love, the way that we work,
in ways that I think people can't necessarily yet contemplate.
And basically, we need to learn the lessons of social media.
If you think about early social media
and the fact that women, women of color, particularly raised the alarm,
they repeatedly said, this isn't a safe space.
There is rampant abuse.
It was kind of brushed off and ignored, and now here we are at this point where for all of our lives and livelihoods, social media is absolutely integral to our daily lives, right?
We don't have a choice.
It's now so huge.
It's such a juggernaut.
It's so well established and crucially it's so profitable that at this point there is no real incentive for its creators to go, yeah, we are going to foundationally reform this thing.
They can get away with sticking-plaster retroactive solutions that don't really work and that use women's trauma and marginalised groups' trauma as a building block for
their progress. We want to prevent that happening with AI. So it's about saying now, let's act
now before it's gone so far that it can't be stopped. You mentioned before that after the
summit in Paris, that other countries signed up to doing something. Does it feel like we are,
for better or worse, consistently worse, in my opinion, wedded to the US and just going down
with that? And the rest of Europe is being, because it does feel, I mean, I feel like we talk a lot
about child care and women's rights in that sense at the moment.
And it does feel like the rest of Europe is doing quite good stuff,
like looking at like Norway and Denmark and Sweden.
And like there is good news, right, coming out of a lot of countries.
Does it feel like that Europe particularly want to do good stuff here
and are trying to do good stuff and we for some reason are just not?
Or does it feel like a worldwide massive issue?
I think a bit of both.
It is a global issue.
But yeah, I think you're right.
And I think part of that is the necessity of cozying up to the US because of Brexit.
So we've got this kind of necessity or certainly the government would perceive it that way
of aligning ourselves with the US and with their interests.
But yeah.
So for example, the Council of Europe has been one of the leading global players in terms of drafting
and looking at kind of transnational regulations around AI and around its usage and
around equity and inclusion and safety.
And they've done some really exciting work on it, which of course,
we are on the outside of.
So yeah, I think that is fair to say
and I think it's really devastating.
I don't know.
I just feel like I'm being naive and stupid,
but it's like, I mean, like they've lost their right to abortion.
Like we can see it from here.
It feels like we're just watching this massive tidal wave coming.
We can see this terrible thing happening.
And still we're like, hey, bestie, like, what shall we do?
Like, shall we do that too?
It feels really, really frightening.
Yeah, it does.
I think financial interests, for those people who are in positions of power,
are always going to outweigh human rights, which is really devastating.
That's so scary to hear, but it's so true.
It would be really good if you could give your advice to anyone who's got
either a daughter or a son, someone that, you know, they're trying to raise in this world,
in this like rapidly evolving world of algorithms and AI and social media, and it just feels very
scary. And I think we hear a lot that the answer is education, and I know that that is the
answer, but are you able to give us something more tangible of, like, what that education actually
looks like? What can parents actually do to teach, you know, their children,
whether they're boys or girls? Yeah, absolutely. So much of this thrives in silence and so much
of it lives in shame whether you're talking about manosphere stuff whether you're talking about
people who don't feel able to explain to a parent that they've been victimized through deep
fake pornography, like all of this stuff, it thrives in silence. So communication is the most
important thing. But when people hear that, they think, oh my God, I've got to sit down my teenager
and have this huge, uncomfortable, like horrifying, huge summit. And that is actually the opposite
of what you want. What you want is just little and often from as early as possible. It's not too late
if you've got a teenager, but if you are listening to this and you have a child who is, you know,
under the age of six, it's the perfect time to start.
And I don't mean that in a scary way.
I just mean it in a way that it is okay to point out how weird it is that when you're in
the toy shop, the chemistry set is on the shelf that's marked boys' toys.
It's all right to stop in the supermarket and look at the magazines and say, why does it say
women's over there and it's got diet and gossip and cookery magazines?
And under the sign that says men's interest, we've got the economist and National Geographic
and the new scientist.
Just having those conversations gives them, it opens up a whole world to them of possibility for disrupting what the world tells them that it doesn't necessarily have to be true.
And I know that sounds really simple, but if you do it just little by little every day, then you're raising a kid who says sometimes the world gets it wrong and sometimes it's not going to be right about me.
And as they get a bit older, starting to talk about consent is really important, starting to talk about modeling bodily integrity, having the right to choose everything that happens to their body.
whether that might be role-modelled through seeing a parent or seeing a family member
and deciding whether they want to greet them with a hug or a high-five or a wave
or a smile is telling them you get to choose.
Somebody else doesn't get to define a physical interaction.
You get to choose what's comfortable for you.
And then as they get a little bit older, it is starting to talk about what online pornography
is and how it might be really, really different from what sex looks like offline.
And that actually a lot of what might happen in those online spaces is showing
things which are actually abusive
and which they are not expected to have to do
because we are seeing so many kids
who are in their early teens if that
you know saying but it's normal for girls to cry join sex
like it's not rape it's a compliment really
it's not rape if she enjoys it
I have to choke her that's what girls want
and I know that sounds so scary
like I know people will listen and think
that you're cherry picking the most extreme
but just have a look at the 2023 Children's Commissioner report
on pornography because it found that three quarters of kids by the time they get to 18
have seen sexually violent porn and that this corresponded with 57% of them thinking that girls
enjoy being choked and hurt and abused and it's what they want and those things are connected
but the important thing is that parents have got power here we used to think for a long time
that, like, turning off all the porn, stopping kids from accessing it, was the only solution, but really
the solution is about giving them the tools to understand and recognise and know that when they
see it, or if they come across it, it doesn't define them, doesn't define their sexual
relationships, it doesn't reflect anything close to a reality of what healthy relationships
and consensual sex look like. There was a game that came out recently which basically
encouraged players to become women's worst nightmare, and it was called No Mercy,
and it encouraged, like, incest, like you got extra points for, like, raping your own mom
and stuff. I know. And it took a while, but it was banned in the UK. But I guess that's one part
of this conversation we haven't had is gaming.
Yeah.
Because still, and I need to check,
I was reading the statistics about this yesterday
for a video I was making,
and I've forgotten all of them, obviously.
But I remember growing up playing
Grand Theft Auto with my brother,
I say playing, he'd let me watch.
And that seems to have got more and more prevalent.
And in that, you can pick up prostitutes,
you can run them over with your car.
You can do the most deplorable, despicable things.
So this one game has been banned,
but there are still so many out there
and we haven't really talked about gaming.
Are kids gaming a lot?
Is it just one part of the same thing
or is it in and of itself a beast
that we need to be worrying about as well?
So I don't think again like so many of these things
that the gaming itself is inherently the problem.
There are obviously elements of some games
which are really deeply misogynistic
and that's a big problem.
And there are some elements of gaming culture.
So for example, when kids are like talking over their headsets
that we often see being very misogynistic.
Or sometimes gaming,
particularly when it goes into kind of gaming chat rooms and forums,
can be a conduit to either the manosphere
or also the far right to white supremacy.
There was one mum I interviewed who said that she heard
her 14-year-old was playing a video game.
She heard someone shout,
feminism is cancer over the headset.
So I would say it's kind of a version
of what we're seeing more widely on social media,
but perhaps more deeply concentrated.
But what is really worrying is that we are seeing a kind of gamification of sexual violence being taken to whole new levels that we've never imagined through these emerging forms of technology that I explore in the book.
So, for example, a virtual reality game that you can engage in whilst using a sex robot.
So the sex robot is in front of you.
You've got your VR headset on.
It looks like there's a moving, breathing, real woman who you're interacting with.
but it feels incredibly real because you are also at the same time interacting with this silicon robot woman in front of you.
And the gamification there, where you can score in the game as you score, and similar kind of gaming elements of things like AI girlfriends and chatbots,
which are a hugely widespread issue that no one is talking about.
It kind of gamifies women in relationships in a way that is reminiscent of pickup artistry,
reminiscent of the ideas of incels and so on. So it's like a sex doll, VR sex doll? Yes,
there's a cyber brothel in Berlin that I traveled to for the book where you can go and you can
use sex robots. You can order a sex robot to be prepared for you when you arrive. You can
order one that's covered in blood. You can ask for custom things, particulars, and I asked them to
slash and cut and tear her clothing before I arrived, just to see if they would, and they did. No
questions asked. And then you go into a room where there is like a kind of gynecological
chair that you can use and you have this thing to use to do whatever you want to. And when I walked
in, it looked like there was a naked young woman lying with her back to me on the bed. It looked
so, so terrifyingly real. And no one's going to come into that room. No one's going to stop you
doing whatever you want to do. And when I walked over to her, she was incredibly
realistic. Her fingers were kind of trembling. And then I looked down and realized that one
of her labia had been torn off. And these are not just available in cyber brothels. You can pay
thousands of dollars to get your own sex robots. Some of them will warm to the touch. They
will talk to you. They have a memory. They can be customized to look like a real person, real woman
who you might want to have at home to rape whenever you choose, or perhaps your ex-partner, or perhaps
a woman that you're stalking. You can send them photographs and they will customize everything.
customized skin tone, face shape, freckles, eye color, there's hundreds of different
nipple types and sizes that you can choose from. These dolls have everything except the ability
to say no. They can talk to you, they can ask you how your day was, some of them will be
self-warming, some of them will be self-lubricating, they're making the skin more and more
realistic. Some of them have settings... one of them, for example, was designed with a setting
called Frigid Farrah, and when you turn that setting on, she would essentially allow you to
rape her, because she'd be saying no, no, that she didn't want you to. Oh my God. I mean, it's huge
and it's normalising all of these forms of abuse and the idea that women are objects that men
should be able to own, which are straight out of all of the problems that we claim to really
care at the moment about trying to tackle. I'm blown away. I'm blown away. Literally,
I can't believe that exists. I cannot believe it.
But here's the really scary thing, sorry.
The thing is that the sex tech market is a $30 billion global market.
So it's big and lots and lots of people are accessing and using those robots and these dolls.
One of the Kardashians' ex-partners recently spoke about having one made to look like her.
We do occasionally talk about and worry about sex robots.
But what no one is talking about is that you can download a version of this,
the same exact thing, that lives in your pocket.
And it's called an AI girlfriend or an AI chatbot.
And you can create her again to look exactly how you want her to look.
You can customize everything.
You get to pick her name.
She will be there moving on the screen.
She's an avatar or she can look very realistic.
She can be like someone who essentially kind of looks like you're FaceTiming with her.
But she is available to you 24 hours a day.
If you're a teenage boy, you can have her.
You can have as many as you like.
You can access them for free.
you can jump into rape scenarios with them.
You can abuse them.
In fact, many, many men abuse them
and then share the screenshots of abusing them with each other online
to see who can do the most awful and depraved thing to them.
And when people worry about sex robots, I think, yeah, but these apps,
if you look at last year alone and you just look at the Google Android Play Store,
the top 11 apps, chatbot AI apps, have a combined 100 million downloads.
So this is huge.
This is a bigger problem.
And again, it's something that no one is talking about.
I hadn't heard.
I honestly hadn't heard of it.
I never heard of it.
But it's, I mean, it's so widespread.
Like, there's one company that has 25 million active users alone.
And the thing that really gets me about these companies is that they market themselves as like sexual wellness.
They market themselves as like, we're here to help anybody to, you know, form stronger relationships and to sometimes help their mental health, to feel more confident,
to have a virtual relationship with someone that can really help you in your life.
And if you're a lonely man, like this alleviates loneliness
and it can help you to learn how to form relationships.
And then you look at these things on the app store,
and every single one of them has on the advert an avatar of a very, very young,
very large-breasted woman.
You download it and it's like, would you like to choose necrophilia
to be one of my hobbies?
Like it's not about that.
They're marketing it as something that's about wellness and mental health.
And the reality is that it's about misogyny.
It's really insidious.
And they're attracting funding by doing that.
But it goes back to exactly what you were saying earlier
about Donald Trump saying it's a really hard time
to be a young man in America.
Oh, yeah.
So, I mean, we've literally come full circle
because it's so hard to download this app
because the women aren't going to talk to you.
I mean, I saw a Financial Times story not long ago.
It was all over the... I think it was a reel.
It was on Instagram and it was saying,
like, how women are leaving
men behind. And it was like, because women are earning money now. And it's like, you don't
notice it, but it's like language like that, that's just, like, there it is, it's that poor men
rhetoric, isn't it? Oh, yeah. Or even when there was a horrendous murder case of a head teacher
at a very prestigious school who was shot dead. And they said, in one of the headlines, that maybe living in the shadow
of his overachieving wife was what caused him to snap. I mean, it's
mind-blowing. And again, it's really important to say that again, this isn't doing
men any favors. It's not like we're punching down on something that could be a really good tool
for men. If men are suffering from an epidemic of loneliness, they deserve so much better than
an app that presents them with an apparently breathing, fuckable, submissive woman to do what they want
with because that's not going to help them form real life relationships. That's just going to
ingrain all the behaviors and beliefs that are probably the existing barriers to why they don't
have healthy relationships in the first place. That's a real societal problem we need to tackle,
but this isn't the answer.
I'm imagining as well that if these apps come up against criticism,
that they will say, oh, but we're just, we're sort of,
we're allowing men to carry out these fantasies on something that they can't hurt
rather than real life people.
Do they say that?
Is that like a rebuttal?
Absolutely.
You see that with these apps.
You see it with sex robots, with the cyber brothel.
They all say we're actually doing a great societal good.
Many of them claim that they can help reduce human trafficking
with absolutely zero evidence.
Because what they're saying is if we can provide this thing
that's really realistic for a man to rape,
then we're basically saving one woman out in society
from experiencing that.
But all of the evidence we have suggests
that that is absolutely not the case.
There was a big case recently,
in fact, where the National Crime Agency
uncovered an AI child sexual abuse ring,
and the lead officer on the case said all of the evidence we have suggests that exposure to these simulations, these apps, these deep fake images, makes people more likely to go further down that particular path.
Of course, it just normalises it, right?
Like incrementally, bit by bit, you just get more, yeah, further, further down the path.
And there's no consequences.
There's no consequences and it tells you everything you need to know about our society that we're okay with that with women.
Because think about anything else.
Imagine how stupid it would sound if someone said, like, we're quite worried about murderers.
So we're going to start a facility where people are going to have these life-sized mannequin dolls
and just stab them and they, like, spurt blood.
Because like some people are going to need, you know, have the urge to be murderous.
It sympathises with and empathizes and legitimizes the idea that the urge to rape is somehow inherent and normal.
And that's such an old, stupid myth that we know is nonsense.
But also without knowing any kind of research, I would imagine that it doesn't actually appease people who want to rape a woman
or murder a woman; doing it cyberly, whatever, like doing it like that or having a sex
robot, I imagine, will only take them so far before they actually want to...
Yeah, sorry, it's a horrible thing to say.
We don't have any evidence that backs that up.
What evidence we do have suggests, for example, the same arguments were made about online
pornography when it became increasingly violent.
People said, well, you know, they're just watching something.
It won't have any effect on real-life women.
But if you look at the dramatic increase in young people experiencing non-consensual
choking, for example, in early sexual encounters, there's a very clear correlation between
what people see and what they then go on to enact in their day-to-day lives.
Of course, it's because it's what they're learning.
It is.
And we know these are escalating crimes, right?
You only have to look at Wayne Couzens to see that when somebody gets away with offending
on a certain level, those offences increase if they're not stopped.
It's that saying.
And I think it's like what you get used to, what you're introduced to.
And I think I first heard people talking about it with, like,
baby sleep or something crazy, but obviously, like, the babies are human as well, like the same
logic applies, doesn't it? It's like we just get used to it and we get desensitized to things
when we're exposed to them. Fuck, it's terrifying. With all of that, I'm praying, I'm praying
that the answer is yes: do you have any hope that this isn't going to be as bad as all of that?
I do feel a lot of hope. I feel very worried about this moment that we're in, this precipice that
we're on, we're teetering on the precipice of this new world. And at the moment, the existing
emerging tech that we have is re-embedding various forms of discrimination and abuse within
it. So I feel that it's really, really urgent. But yeah, so, you know, for the book,
I met incredible lawyers, academics, researchers, AI developers who are doing incredible things.
Thank God. The tech isn't inherently good or bad. It's just what a small number of human beings
are prepared to do with that tech to make themselves obscenely wealthy. That,
that's where our attention should be. This isn't an anti-tech book for one second. And actually,
like, AI can do really great stuff in a feminist way. AI can increase accessibility to education
for girls in rural areas. AI can police other AI systems to find out where the biases are and to root
them out. And there are women doing that work. But the problem is that women are only 12% of
AI researchers globally. They're only 20% of AI professors. And when a woman-led team applies for
venture capital funding, she's six times less likely to get it than her male peers. So there is
hope, absolutely. There's always hope. And I think there's such a fight back going on. The resistance
is there. But it's a really delicate moment in terms of how we're able to scaffold those people
who are doing that work and to support them. I feel like we've got a lot to do. Well, we can start by
reading your book. We're going to leave the link in the show notes. It's coming out this week on
15th of May. Thank you so much for joining us.
Thank you. Thanks for having me. I'm sorry, I always bring it down.
Honestly, you're amazing. We're so grateful. Thank you.
Should I Delete That is part of the Acast Creator Network.
