Sara & Cariad's Weirdos Book Club - The New Age of Sexism by Laura Bates with Laura Bates

Episode Date: June 12, 2025

This week's book guest is The New Age of Sexism: How the AI Revolution is Reinventing Misogyny by Laura Bates. Sara and Cariad are joined by activist, best-selling writer, speaker, journalist and founder of the Everyday Sexism Project Laura Bates. In this episode they discuss sex dolls, masturbation, AI, Meta and Take That. Trigger warning: This book and discussion covers a range of non-consensual sex acts, abuse by deep fake technology and pornography. Thank you for reading with us. We like reading with you! The New Age of Sexism by Laura Bates is available to buy here. You can find Laura on Instagram @laura_bates__ Tickets for Sara's tour show I Am A Strange Gloop are available to buy from sarapascoe.co.uk Cariad's children's book Where Did She Go? is available to buy now. Sara's debut novel Weirdo is published by Faber & Faber and is available to buy here. Cariad's book You Are Not Alone is published by Bloomsbury and is available to buy here. Follow Sara & Cariad's Weirdos Book Club on Instagram @saraandcariadsweirdosbookclub and Twitter @weirdosbookclub. Recorded and edited by Naomi Parnell for Plosive. Artwork by Welcome Studio. Hosted on Acast. See acast.com/privacy for more information.

Transcript
Starting point is 00:00:00 Sara Pascoe. And I'm Cariad Lloyd. And we're weird about books. We love to read. We read too much. We talk too much. About the too much that we've read. Which is why we created the Weirdos Book Club. A space for the lonely outsider to feel accepted and appreciated. Each week we're joined by amazing comedian guests and writer guests to discuss some wonderfully and crucially weird books, writing, reading and just generally being a weirdo. You don't even need to have read the books to join in. It will be a really interesting, wide-ranging conversation and maybe you'll want to read the book afterwards. We will share all the upcoming books we're going to be discussing on our Instagram, Sara and Cariad's Weirdos Book Club. Thank you for reading with us. We like reading with you. This week's book guest is The New Age of Sexism: How the AI Revolution is Reinventing Misogyny by Laura Bates. What's it about? It's a deep dive into the ever-evolving world of AI and technology and how it's currently shaping our world. What qualifies it for the Weirdos Book Club? Well, with ChatGPT, AI and sex robot brothels, it's going to be a weird new world. In this episode, we discuss sex dolls, masturbation, AI, Meta.
Starting point is 00:01:06 And don't worry, Take That pops up as well. Joining us this week is Laura Bates. Laura is a feminist activist, New York Times and Sunday Times bestselling author, and the founder of the Everyday Sexism Project. Trigger warning: in this episode, we discuss quite a wide variety of non-consensual sex acts, and we also discuss how people are abused using deepfake technology. Hello, welcome to Laura Bates. Welcome to Laura Bates.
Starting point is 00:01:35 Welcome to Laura Bates. I'm doing this like Question Time. I see. Welcome to Laura Bates, who's representing Runcorn today. Welcome to Sara Pascoe, who's representing Romford. And welcome to Cariad Lloyd, representing Barnet.
Starting point is 00:01:51 Big up Barnet. Big up the Barnet. No, we've gone off topic. Hello, Laura. Hello. Thank you, Laura. Thank you so much for coming in. The first thing I want to say, firstly for anyone listening,
Starting point is 00:02:00 I'm not saying don't listen to the episode ever, but really know that a lot of these subjects, reading about them, I'll use the words, fucking depressing. Yeah, there's a lot of stuff that's happening in the world now. Yeah. It's very difficult to face because it's so awful. So if you're planning like picking flowers in the park right now, do some crafting.
Starting point is 00:02:20 I think you should listen while doing a good thing. Oh, really? Yeah, you don't want to be listening to us talking about this book like that's helpful. Okay. Do it while you're picking flowers and you remember nature exists. Whatever you're doing, go out and pick some flowers in a neighbour's garden. Nature exists.
Starting point is 00:02:31 The world's always been here. Whatever happens. I like that. Yeah. I'll put that on the paperback, thanks. It's fucking depressing. Well, I actually wanted to start by thanking you. I wanted to tell people listening that there's going to be serious topics.
Starting point is 00:02:41 And I wanted to thank you, Laura, because, me reading it, there were a couple of times where I had to put it down, because of how much it was changing my mood. And I thought, how must it have been for you to stick with it, to research it, and not to go, actually, I quit feminism, I'm going to write about, you know, netball teams. We're talking about The New Age of Sexism, which is your brand new book, which is out now, which is just incredible. How the AI Revolution is Reinventing Misogyny. So researching this, how did you stay sane? I know at the end you say that you did get angry.
Starting point is 00:03:14 I did. I got very angry and there were times that I just stopped and stepped away from it. I think for me the thing is the feeling that you're doing something about it. It's the only way to... Yeah. Like when you're confronting these things that really are incredibly grim and awful, generally speaking, I'm doing it with things that often other people aren't yet aware of. So the way through it for me is to go, this is horrendous, but there's a really important job to do here and making sure that people know about it
Starting point is 00:03:40 so we can fight to change it and that kind of pulls you through. Yes. It must have been odd when adolescents came out a couple of months ago and people like, we need to talk about this stuff and parents have no idea. I'm ready. You expect two years. Here I am, guys.
Starting point is 00:03:55 Because essentially, I mean, I'm a parent. Carriad is a parent. The first step has to be, obviously knowing that things are happening or that apps exist, for instance. Yeah, and at the moment we have this bizarre moment in history that no one ever talks about weirdly, which is a generation of non-digital natives, parenting and educating and raising a generation of digital natives. Oh, I talk about my friends all the time because I feel like a Victorian. We are.
Starting point is 00:04:20 I am closer to the Victorians. I have more in common with like the carriage riding, you know, playing British Bulldog than I do with a child now. Like their experience is so alien to me And I'm constantly having to catch up It's I feel like our generation is really It's going to be the hardest Because we You know we can relate to someone who's 90
Starting point is 00:04:42 Much easier than we can who's 19 sometimes And that is going to cause some problems When parenting teenage As is happening now as adolescents have shown And so the So starting with them I guess are they called the deep fakes That when people do the nudes of people
Starting point is 00:04:57 Is that called deep face? Yeah, when they make a face image that's not real, but it looks real. Yeah. Yeah. So you're talking about that and the examples in schools and really the sort of the shrugging of authorities and parents because the perpetrators are teenage boys. So you might say sometimes they're 13, sometimes they're 14. And so the attitude of what do we do then is because they are just finding out after it's happened rather than knowing that these things can happen and talking to their children years in advance about it. Yeah, and the schools repeatedly are prioritising reputational damage limitations.
Starting point is 00:05:34 So what's happened? This has happened at several schools. It will be the next big sexual violence epidemic to hit schools. It already is, but it's not really known about yet. And what's happening is that when it happens, these schools, hiring these big London PR agencies to come in and kind of handle the narrative in the fallout, whereas the girls are left with no kind of support, often just back in the classroom with perpetrators,
Starting point is 00:05:56 and no kind of punishment or action or education is happening for the boys. Because you've got several examples of well-documented cases of one or more groups of children making fake, very realistic nudes of lots of people that they know, they just have to upload a photo into an app. Sometimes they don't even have to pay very much money or nothing at all, it's a sign-up thing. Yeah, I thought that was gross when you were like the first one's free.
Starting point is 00:06:21 I mean, it's so like drug dealing, isn't it? It's like sort of that old-fashioned idea of like that 1960s, like drug pusher, like gives you the first hit free and then you like won't come back from what? Do you want to know what the really wild part of this is? The apps don't work on pictures of men. It's just terrible. And this thing that you're saying about the deep fix, the thing that I found really horrendous is like all the kind of criticism about it has been the danger of it politically. So you talk about that really eloquently at the start of the book, which is amazing, by the way, it is an absolutely brilliant book, that everything that's been dealt with in terms of,
Starting point is 00:06:51 oh, how do we police this? How do we legislate this has been the dangers this could affect an election. Whereas actually, like, in this side alley over here of the internet, it's affecting fucking 14-year-old schoolgirls, far more than it is affecting elections and, you know, politicians' reputations. But I think it's so indicative of our society, isn't it? It's just like this isn't seen as a serious problem compared to, well, these big important men in suits might be affected. Also, in terms of women in politics, and this isn't a defence.
Starting point is 00:07:18 Could you believe that? But they've already put up with so much. And everything that the internet has ever come up with has been weaponised against them already. So while I'm not saying it's not okay to do it to a politician, no, no, at least when it does, they'll go, yeah, what did I expect? They tell me I'm ugly, they tell me they dissect everything. I've already got all of the hatred from the people who don't want me to exist. Whereas a school girl has none of that.
Starting point is 00:07:41 It's completely nothing to do with her. And she hasn't chosen to be in a public facing position. And I'm not saying that means that it's okay when people do those things to you. No, no, no. But like you said, it's the fact that you're then dealing with it. It's a 14-year-old boy. What understanding does he have of what he's done? when it's being marketed to him as like a video game.
Starting point is 00:07:58 And also, absolutely. But 14-year-old boys do do bad jokes. They do make poor decisions. We know lots of things about adolescence in terms of, like, impulse control, those kinds of things. So unless you talk to them about these things, and what it's like when they're sort of seven, eight and nine, you know, in terms of, like, pornography — sex education in school now has to talk to children in the years before they become exposed to things so that they are prepared. But all of us are really deeply uncomfortable with the idea of what you can say to a seven- and eight-year-old, when it's sad as an adult to know about those things. And like we're saying, it's difficult. We grew up in the world where, like, boys in our school would have Nuts magazine or Loaded, and it was kind of like you had to, as a girl, be like, ha ha ha, that's funny that there's all these tits everywhere, and we're all reading it as if this is fine. And what they're dealing with is just so much more extreme than what we had to deal with. So again, we're talking about this non-digital native thing, aren't we? Totally. Yeah. It's that divide again. And also
Starting point is 00:08:51 the age at which it's starting is so young. But like you said, people are terrified to talk about it. but you can talk about it without talking explicitly about porn. There is so much that we could be talking to children about from a much younger age, not just around things like consent and bodily autonomy and respect, but also actually about internet literacy, about understanding image manipulation and source skepticism and all of that stuff can be explained earlier. But you're right.
Starting point is 00:09:16 Like fundamentally, criminalising a 12-year-old isn't the answer here. The answer is why were they able to download an app that does this like that on the app store? why can they put it into Google and get 17 different instant options? Like they can't go into a shop and buy a giant bottle of brandy at 8. Yeah. Like you're going, no, not you. The person who sold it to them would be punished but not checking their age. Exactly.
Starting point is 00:09:37 Yeah. And so it's really simple when you go, all you have to do, now the technology has appeared out of nowhere, is use the same laws we have everywhere else. Yeah. And they're like, but how would we? Yes. It's so confusing. I know.
Starting point is 00:09:50 I think it's really like the whole book, you talk about this so eloquently, that every time there's a really obvious solution, we're fit that it's hit back defensively with like, oh, that's really difficult. And you're like, you're literally creating a digital universe that doesn't exist. But you can't put in like police in the digital universe. And also, they don't mean difficult ever.
Starting point is 00:10:06 They go expensive. It would be a small dent in our profits. You know, $20 billion annual profits. Yeah. It's bizarre, but we've never thought about tech in that way. Yeah. I was thinking, when I was writing the book, the co-op arena in Manchester was being built.
Starting point is 00:10:20 And they kept postponing people's shows because it wasn't ready because of safety regulations that hadn't quite met yet. And all of these huge shows, like take that and people like that got cancelled. That was my favourite bit of the book would take that conversation. That was my little beam of sunshine. Some boys are nice and there and take that.
Starting point is 00:10:39 Hold on to it. But that, you know, as a society, we totally accept that. Like, we're not going to let people go in there until it is completely safe. But people are already walking around in the matter of theirs. People already consuming these forms of tech that have zero regulation and are causing devastating real world harm to particularly young girls' lives.
Starting point is 00:10:58 I found what you said about the Metaverse, just really frightening. So the Metaverse is this new place that's controlled by Meta, the Facebook company, and that people are, at the moment, it's quite new and it's quite, there's not, obviously, lots of people aren't existing on there, but it's like a digital world, you put your VR set on, you walk around, you talk to people. And children are in it. And there's children in there and they're pretending to be adults. You are being, people are being groped as avatars, people are being like raped, abused.
Starting point is 00:11:26 And I was like, oh God, can we just be trusted anywhere? Like, what is wrong with us? The things that happen, the horrible things that happen in the world will also happen in the virtual world. Of course, of course. And then there are things that happen in the virtual world that people maybe wouldn't dare, or then we worry, is this encouraging behaviours that would then come back into the real world? So it is, it's such a grim thing. It reminded me of an article.
Starting point is 00:11:47 I can't remember the name of the journalist. It's a male journalist. It's in The Guardian. a few years ago and he was researching into VR pornography. And as part of the article, he thought it would be interesting to switch and to be inside the, well, I guess the position of the female in the pornography, because obviously it's not a person. Yeah. And he was really, I don't want to use the word traumatised, but he was very shocked and emotional about the experience
Starting point is 00:12:12 because he'd found it very scary, having a fake man, having sex with him as he would a woman in a pornography. And the point of the article, the purpose was him realizing as an act of understanding how interesting it was for him as a man. To go, oh, I see. This is what women are seeing. This is what they are experiencing sometimes. This is what scares. Imagine it to go into the metaverse or to go into VR pornography, you were, before you could access it as playing the role of man, you had to play the role of women.
Starting point is 00:12:42 Like, imagine that was just the law. And you had to have it simulated and be like, oh, that didn't feel very nice when I got slapped across the bay. You've got examples. There's so many people saying it doesn't matter. It wasn't a real assault. It happened fake. Brilliant. Put it on.
Starting point is 00:12:53 And you can tell me your heartbeat didn't change. But I got so angry about that argument saying, oh, it doesn't count. Because why do computer games exist? Computer games exist because they are fun and exciting. Your heart rate literally goes up as you're running around shooting. I'm a big gamer. She's always on the one with the game. I'm on the Super Switch.
Starting point is 00:13:10 Yeah. The Mario Kong. And I'm like, leave the house. I'm always there. But literally, the point of computer games is that it feels really. real? Like when you played them, like people don't... That's why they're addictive. Because they give you all of the same...
Starting point is 00:13:23 Your heart rate rises, your hormones change. It's real. Like, yes, you're not shooting someone, but it feels exciting to be running around. And pleasurable. So if computer games wouldn't exist if the feelings weren't real. So, of course, if there's an assault in the metaverse, it will feel real. Like, I couldn't understand how they were even trying that argument. I know, but so much of it is like, I mean, thought experiments. Like, especially when we get to sex dolls, which we haven't yet. like it's thought crimes, right?
Starting point is 00:13:51 Then that's what they're saying is the person just thought it and enacted it, but it didn't actually touch you, which means any effect on you is an effect on your... But that's denying any effect that has like when you play a shoot-em-up game. It's like, no, you're not actually shooting someone, but you definitely feel excited or nervous or scared. So even if someone isn't physically grabbing you, you will feel scared? Like, I can't understand how that the tech companies even have that argument.
Starting point is 00:14:18 How did you feel when you were writing to the tech company? Like when you were getting those answers back, did you just want to throw your computer or something? Yeah. I think it's frustrating for so many different reasons. Like partly because you can go down this road of having this whole argument about it does it count, is it the same? But then you stop and go, wait a minute.
Starting point is 00:14:35 Why are we even having, this is the argument people have thrown at women for all time, right? Don't make a fuss about a wolf whistle. At least it wasn't a rape. You should think yourself lucky. And you think, well, hang on a minute. First of all, this is a form of harm. Like, why are we even comparing it or saying, is it the same? The point is it shouldn't be happening.
Starting point is 00:14:50 The second thing is that these guys saying it's not real, don't worry about it. They're the same people who are developing full body haptic suits with millions of senses. So it will feel real soon, even if it doesn't feel real now. Also, in terms of our law, assault, the definition isn't making physical contact. It's the point of the threat you think someone's about to hit you. The threat or knowing someone wants to hurt you is what's happening in those virtual spaces. And that is scary and that is upset. Even not even wanting, like you said, the sort of experience you said that someone got groped while you were in the metaverse.
Starting point is 00:15:25 Yeah, in front of me. Literally in front of you? Literally right in front of me. And there was like this big kind of blackboard in the world that I was in at the time. And I went over and wrote on the blackboard, have you been assaulted in the metaverse? And I was just surrounded by women coming up going, yeah, yeah, of course, all the time who hasn't been. Like it's just, we've created this entire new world. And our baseline starting point is, well, obviously women.
Starting point is 00:15:50 They're just going to be assaulted here every, you know, every seven minutes a study found that people are exposed to abusive behaviour there. And that's the other thing that you're like, why are we arguing about the minutiae of like what exactly constitutes assault? Instead of going, like, couldn't we please in building this entire new world that, you know, Zuckerberg and his cronies are hoping we will eventually be not just living in, but working in, socialising in, you know, learning in. couldn't we maybe like aim for a slightly higher bar to start with? Isn't it so often? Like your whole book is like, could we, sorry guys, suicide to us.
Starting point is 00:16:25 Could it not be gross? Could it not be gross? And I'm like, why is that the fucking bar we start with? Exactly. It's like the metaverse is being built in like 1820 or something. It's like the starting from like, well, obviously women will be there for us. And you're like, we've come so far. But yeah, when we build in your world, we go back again.
Starting point is 00:16:43 Because maybe we've come too far in the real world. because otherwise you go, why do we need to live and work and learn somewhere else where we've already got a place? Why don't we just sort out climate change? Boys, boys! Oh, because we can vote now and sometimes give you our opinions. Me like well, lady. Me not like when ladies.
Starting point is 00:17:00 May not go space, make new world or make made up world. We all lived there. It's just nuts, isn't it? That was a caveman. I'm not very talented actors, is what I mean. Laura, you've got it. I thought it sounded like a cave. Good, thank God, okay.
Starting point is 00:17:12 Well, there'll be a deep dive into your Sarah's behaviour. Laura, and as there should be. Well, you did say caveman, those are that, so everyday sexism, obviously. Yeah, okay. Cave person. I'm sorry. The amount of books are, when I'm reading books,
Starting point is 00:17:23 I don't think cave women. I don't think cave women are, that's my everyday sexism. I think they're probably much more evolved than cavemen. Oh, they don't say, me hungry. They say, I'm all, I'm a bit peckish, you. And he says, yeah, I'm hungry. I am hungry, but I couldn't possibly have a leaf in front of anyone
Starting point is 00:17:37 in case you thought I was chubby. Oh, no, they've got the same issues. They invented them. Oh, I'm glad we found out whose fault is. Cave women. I wanted to ask you, because you just mentioned going into the metaphors yourself, you put yourself through lots of things in this book. Yes, my goodness.
Starting point is 00:18:01 And I wanted to ask, did you feel like you had to do it to yourself? So say, for instance, looking at the apps and what they can do to a picture. Yeah. And the cyber brothel? You went to a cyber brothel. So you put yourself through experiences. Did you think I can't write about it unless I've done it? I mean, unfortunately, before I started writing the book,
Starting point is 00:18:21 people had already sent me deep fake pornography of me, abusive deep fake pornography that they'd made as a form of abuse. So very briefly for the book, I just wanted to make it clear to readers, like if you're a 12-year-old boy and you've just got the internet, how easy is it to do this? And obviously there was no way to do that with, I didn't want to do that with anyone else's picture. So I just used a picture of me to quickly log on and see. And it was literally like less than 30 seconds. And then I deleted those and closed them. But the experience was one that I'd already had at the hands of other people. I'm really sorry that's happened.
Starting point is 00:18:51 Thank you. And this isn't a very useful take on it, but it just shows how angry you're making the people who are already very angry. Yeah, yeah. And it is weird. Your existence and you're sort of speaking out to their behaviours. Yeah, and it's just giving them new tools because they've been angry with me for a very long time. But it's interesting because it's this weird like amalgamation, a very, very old misogyny.
Starting point is 00:19:16 Yeah. Which is you're a woman, you're saying things. We don't like it. We want you to shut up. And it's also very old misogyny. How do we shut a woman up using sexual violence? Yeah. But what's happening is these new tools,
Starting point is 00:19:28 which we are pumping billions of dollars into the development of making completely available to anyone without any kind of safeguards, then giving them these incredibly visceral new ways to carry out that abuse. Because sending me a video of himself forcing his dick into my throat and coming on my face is incredibly new as a way. of doing a really, really old thing. It's like a weird mashup. I think that misogyny in a lot of ways is going,
Starting point is 00:19:56 how dare you when you're only this to me? One of the phrases that are vagina salespeople. Oh, yeah. Yeah, that you got from a chat room, didn't you? That a man said about women. Sex robots reminding women that are just vagina salespeople. That's their anger and the use of, I guess, a threat of assault or descriptions of assault.
Starting point is 00:20:17 and now with this technology it's going to stop behaving I see you as this how dare you ask me to respect you as a human being how dare you speak how dare you have thoughts how dare you exist I didn't be in my vision
Starting point is 00:20:31 I think sometimes like women on television that's what we get is like I don't want you to be there to go away it's so strange isn't it because it's like you said it's so deep and ingrained with some people and it's so old fashioned
Starting point is 00:20:45 but it's not like women are new Like, we're here. It's almost like, well, they've just turned up. And it's like, now, this new version of women with opinions. And if anyone who is a feminist knows, like, this is not new. Women have had opinions for a very long time. But like Sarah said, it's this outrage that you still continue to be here or that people are listening to you.
Starting point is 00:21:06 And the way I must punish you is by this horrible horrific. And then the tools are being developed by men to allow that. That's the frustrating thing. Men who have historically felt powerless and miserable, angry, unhappy, all of those things, I think historically would have had a lot more women around them that they could behave despicably to, like maids and servants.
Starting point is 00:21:31 And it's women becoming more and more autonomous, which means that they have to take it out on strangers. I mean, men would have had just, even just in like the home, like if they did have a partner if they did have children there were people around and they could dominate behind clothes yeah yeah
Starting point is 00:21:48 like men always have been despicable I was like that's my daily mail headline men have always been despicable always been despicable and also I thought the one of the things you wrote
Starting point is 00:21:59 sort of blew my mind was that the deep fakes the AI that's needed for this technology is burning more fossil fuels so like we're in this age of climate change we're in this age of like
Starting point is 00:22:10 trying to help this problem them talk about it. And then very silently, again, in this side alley, there's just like, they're like just building coal factories to make sure that women can, against their consent, have porn made for them. And you're like, well, I feel like, I want to go in that room and be like, no, you don't be doing that. The world's on fire. Why are you using this to do this? It's, it's, yeah. They've literally delayed the closure of coal fire power plants because of it. And if you do a search on chat GPT, it takes 10 times as much energy as a standard Google. search. And if we look at the extent of AI which is being used for this, and it's not hysterical.
Starting point is 00:22:48 If you look at deep fake images, 99% of them are pornographic and 96% of those are of women. So you can literally draw a line between them adding fuel to the planet burning and the abuse of women. Like that's what we're burning the planet for. That just made me that I had to put the book down. So I was like, you understand that there's this world of pornography. There's this world of men that want these kind of images. I understand that exists. And then I understand there's climate change.
Starting point is 00:23:13 And I'm like, don't you, what the fuck? Don't mash these two together. It's bad enough that both these things exist. But yeah, I had to put the book down and have a moment because I just thought, is this? And my husband gets saying, you're right. I was like, I think you shouldn't talk to me for a little bit. Like I just, I'm so angry that this is happening. And I find sometimes that this is so hard to take in that you then almost block everything out
Starting point is 00:23:35 because it just feels like, what do we do? How do we cope with this? How do you parent? This is why I find stand up really. useful because when you so it starts often from a place of anger and then the next question is how do I find a way of talking about it which either involves inversions flippancy but eventually at the end of it hopefully what you have is something which is like hearted enough to talk to a group of people about but you still are making your your point or vocalising it all and so I found out
Starting point is 00:24:05 does Mark Zuckerberg come to your gigs because that's what we need so I found out shortly after I had a baby that and I found out that I had pornography made of me from Channel 4 News who were making a program with Kay Burley all about her work and getting it made illegal and taking it to the High Court. And it happened to a couple of my friends as well that that's how we found out it existed as being asked to go on TV and talk about the existence of our pornography. They were very sensitive, I should say. They weren't like, guess what? Here's some screen graphs. Do you want to come on? Got some big news. I talk about it in my stand-up because the purpose of it, and I won't
Starting point is 00:24:37 say the name of the person I was talking to about it because I don't want anyone to go out and look. And I don't think that people listen to a book podcast would then go, oh, hang on, that exists. But I also don't want to name. The reason that some people have chosen not to do stand-up about it is because you don't want to encourage anyone to go look at it if you find the idea horrific. But I sort of had to, had no time to process it because I was so postpartum. So I couldn't have been at a point in my life where, number one, I was less sexual. But actually, I also didn't feel humiliated by it. But I knew that someone was trying to humiliate me.
Starting point is 00:25:07 And so the only way, but stand-up, I literally have a mouthpiece, which was, what most people don't have a platform to go, how do I process this so that I can talk about it and whatever they meant to do to me, they haven't done? And that's not me saying that's how anyone else would be able to process it. That's how you process it. It's how I process things that you get angry about. But it also then sometimes I realise I do make myself a fictional world to live in. So with reading your book, it was getting pierced my world where I was like,
Starting point is 00:25:36 and even you just saying that about the climate change issue. And that's with the internet in general as well. Yeah, yeah. The storage that we don't, I didn't know. I know. I didn't know until really, really recently. How much water it costs to cool it all down. Because again, you have this idea that it's all just like up here in the ether.
Starting point is 00:25:52 They call it a cloud. I'm not in charge of clouds. It's a bit cloudy. Yeah, that's not good. I think with teenage boys as well, that was the age where I had the experience of, oh, I am a gatekeeper for something I didn't want to be put in charge of. Yeah, yeah. I'm the one that, or me and my friends, that you have to get drunk or trick or manipulate
Starting point is 00:26:19 because we're withholding a thing they want. We didn't even know it had value. So that's the tricky thing about teenage boys doing it. I have a memory of being a teenager on a beach on a holiday and doing something with a boy. And then like a camera flashed. And I remember, and I stormed over and I opened the back of that camera and I ripped out the film. And I was like, fuck you. And it was their mates.
Starting point is 00:26:42 And I remember thinking, ugh. annoying and I was just reading your book and you know talking about this lack of consent of images and just being like I remember you telling me that about university that story so I mean it was a really horrible thing that happened to you and it was horrible yeah it was terrible yeah it was what now sounds again like Victorian equipment yeah exactly the fact the fact that it was I could go over and I could open the back of that camera I knew I'd ruined it and also a flash new you knew that yeah yeah yeah I know it just it's it's yeah Laura how do you cope And do you cope knowing all this stuff?
Starting point is 00:27:15 Do you ever think, I don't want to deal with this anymore? Not really. I think because so much of my work is about hearing from women and girls and hearing their stories, because of the Everyday Sexism Project and the work that I do in schools, I'm meeting girls particularly in schools who are affected by this in ways that are so devastating. And you just have such a sense of urgency of wanting to change it for them, just wanting to find ways to stop it. And thank God for people like you and that amazing woman in the Spain case, that Spanish school, like you said, these girls, the images were shared.
Starting point is 00:27:49 And one of the mums came forward and was like, I'm not having this. I'm not having her be embarrassed by this. Like the boys are the ones who should be embarrassed. But without these people like you and that amazing doctor standing up, it's it takes women standing up and being like, oh, by the way, this isn't okay. Yeah. And that is actually the problem with so much of this technology is that it uses women's trauma. and the abuse and trauma of marginalized groups as a building material. And that's what we've seen with social media, right? They improve it based on when people are abused and they don't improve it in a way that actually works.
Starting point is 00:28:25 So it's about trying to say, we're about to build a whole new world. AI is about to transform our world and our lives in ways that people can't even imagine. Could we at this point say, actually, this time around, let's do this first, let's build the safety and the equity and the access and the fairness in at the design stage, instead of retroactive, inadequate solutions that use our trauma as building materials. That shouldn't be okay. There was actually a great example of this. Microsoft created this chatbot and they called it Tay. And they put it on Twitter and they said, this is great because this is AI and it's going to learn from every interaction it has with every Twitter user.
Starting point is 00:29:03 It will become more intelligent, which shows that they had never been on Twitter. And then they released it onto Twitter and its first tweet was something like, I love human beings, or something like that. And it was so excited and brand new. Less than 24 hours later, it tweeted, I fucking hate feminists and they should all burn in hell. And like, what that shows is that AI learns from existing data sets. It doesn't just replicate them.
Starting point is 00:29:29 It amplifies them. Then the worst part of it was that they did this thing where they released a statement in response to all the abuse where they said something like, well, this is good data for us. This is great, you know, we'll use this to learn from. And then they held this kind of open hackathon, where they invited people to come and help make the chatbot better. But if you look at what they really did there,
Starting point is 00:29:48 they basically said, we'll crowdsource free labour to improve our product, and we will use the enormous abuse and pile-ons that were encouraged, particularly on women of color on Twitter, and the abuse they endured, as a really convenient building material for us. And that's like a microcosm of the whole problem, I think. But it's like, if you built a stadium and you were like, everyone, as many people as possible can get in, and then when it collapses, we'll know,
Starting point is 00:30:11 Oh, it can't take more than 3,000. Thank you. Let's move these bodies. Like, you just wouldn't be allowed to do that. But here in the internet, like you said, nothing is counted as real yet, even though we know it is real. Everything's profit. And to police it would reduce profit
Starting point is 00:30:33 or cost the companies themselves money. Because the solutions to a lot of these issues, I guess that's the thing, is who would pay. So, for instance, I thought a lot about sex dolls, because I think they should be called masturbation tools. I think the language has to change. Yeah, a doll is gross, isn't it? It's not sex.
Starting point is 00:30:52 It's like masturbation. It's a child's toy. It's exactly the same as if you, look, if you want to put some chicken breasts in a shoe and put your dick in it, it's not a lady. You're masturbating. You know what?
Starting point is 00:31:04 You've got $11,000 for that sex doll, or two chicken breasts in a shoe down Romford market. Roll up, roll up, right. But in the 90s, that was the sort of, that's where sex dolls were. In Romford. In Romford.
Starting point is 00:31:18 They were. That's what, and so now, yes, you've got some silicone and maybe it's got eyelashes. Are you talking like raw chicken breast, or those chicken breast things you put in a bra to make your boobs look bigger? Oh, like chicken fillets? No, I'd say probably just actual chicken. Anyway, anyway, it's a form of masturbation. And I thought what was so interesting in your book is, you know, how they're advertised to people, especially when it's pretending, it's fake sympathy, or, you know, it's company, or these are lonely men.
Starting point is 00:31:42 And it's like, but the trouble is, isn't it? Because there is a solution to loneliness, and that is socialising and kind of reaching out to people. But you'd have to pay people so much money to hang out with these guys. That's the thing. It'd have to be a really well-paid job. As you said in the book, none of these dolls are old. Like, if it's for lonely men, why aren't you providing, like, a 70-year-old woman that looks like a 70-year-old woman, so that it would feel like the right companion for him, that has, equally,
Starting point is 00:32:05 memories of the war? Why are you providing a 25-year-old with massive breasts? What does she have in common with this person? Nothing. Like, no offence to 70-year-old men. But the answer is that none of it's about that. No, exactly. And also that they are trying to trick you into thinking it is a real woman
Starting point is 00:32:21 and they advertise it explicitly as so much better than a real woman. That's what I think they shouldn't be able to do. That's where I think I would police the language. It's a little bit like, you know, I know it's not a solution, but any altered fake image has to have something on it saying, this is not real. Yeah. This is fake. When you're an actor and you get sent scripts that are like not out there, you have to sign an NDA.
Starting point is 00:32:41 And whenever you get a script, it's got a massive watermark of your name, so they know who leaked it, right? It's like, everybody sees these scripts and you have to be careful with them. This is like a Channel 4 thing. Yeah, yeah. Or like a new sitcom for Apple, like it's not a big deal.
Starting point is 00:32:57 Every AI deepfake should have DEEPFAKE written across it. Like, why is that not possible? Like Getty Images. Getty Images, yeah. Yeah. So then you can't put it on your Instagram. You just know. And it doesn't mean, that's not a solution, it wouldn't solve the problem. But the same with the sex dolls, it's not a brothel. You're going to a masturbation shop to masturbate with something someone else has masturbated with.
Starting point is 00:33:20 Silicone masturbation tool, yeah. Can we talk about that a little bit? I mean, it's grim. But you went to Berlin and you went to a cyber brothel. Can you explain what a cyber brothel is, for those who haven't experienced it? Yes. So sex robots and sex dolls have existed for a while now, and there have been various different establishments around the world, actually,
Starting point is 00:33:40 where you can go and they describe them as doll brothels sometimes. So where you go and you use one of these like sex dolls. And in Berlin, what they've done is they've created a place where you can go and interact with these dolls, but you can also use various different forms of technology to make the experience sort of come to life, as they would describe it. So one thing that you can do is you can have like a voice, so someone will be verbally responding through a speaker to what you're doing with the dolls. So you can kind of pretend that she's alive and responding to you. But what they've gone a step further and done is incorporated AI and virtual reality pornography. So you can have a headset on and in your headset it will look like you are in a real room with a real woman.
Starting point is 00:34:22 But the physical thing that you are interacting with is one of these sex dolls or robots. And so it's kind of, it's all planned to do exactly the opposite of what you said would be helpful, right? Which is to trick the man's brain into thinking this is real, this is a real woman. And in that case, what they as an industry are trying to do is to get closer and closer and closer to the point where it's like, we're providing you with a real woman, but you can do anything that you want to her, because there's no consent. She has no autonomy. And on some of the websites it says, like, what happens behind closed doors stays behind closed doors. Like, there's very clear language which is like protecting the men. Yeah. It's very clear language of, like, you can do what you like. That's why this is exciting.
Starting point is 00:35:05 And a lot of these sex dolls are dressed as schoolgirls. There are schoolgirl options. Like, it's absolutely active. Yeah. You can have one that is covered in blood when you arrive. I asked them. I obviously didn't go in under my own name, I told them I was a man. I went in with hat and glasses and my hair under the hat and stuff. And I asked for mine to have ripped clothing, just to see if they'd do it. And there were no questions asked. I turned up and it was ripped, like fishnet stockings and a kind of ripped shirt. It looked like something had clawed at it. And when I went over, it looked like a corpse on a bed when I walked in. And after I locked the door behind me, I went over and looked at it more closely, and one of its labia had been torn off, which I think gives you an idea of the way that these things are treated. And it is very far away from the lofty marketing speak of it being empowering. They call it the future of sex. That's how they market themselves. I don't know if you ever saw, Frankie Boyle did some amazing stand-up about sex dolls.
Starting point is 00:35:59 So Frankie had this amazing routine about how they would, obviously, when they got more popular, start having adverts in them. So halfway through sex, they would clamp down on your neck and go, buy a Rover. Like how YouTube used to be fun and free, but now we've got to sit through ads. All technology ends up being advertising. I hope, I hope they have to press skip, skip, to finish themselves off. Oh, God, I mean, like we've said, Laura, what you did is so amazing, because these are spaces that we know as women we're not welcome in. We're not made to feel welcome. We're not made to feel safe.
Starting point is 00:36:33 And that's sort of how it's got away with it. But it becomes this dark corner of the internet that, like, most of us go, oh, I just won't really look in there, that looks really disgusting. So the fact that you've gone down there and been like, and then come back and been like, guess what they're doing? It's like, oh no! But we did need to know what they were doing down there. But men are doing that with sex workers. Yes. And, you know, making pornography, and obviously not all men.
Starting point is 00:37:00 But a portion of men, mostly men. Well, it's like you said, there's one male doll in this one in Berlin, isn't there? There's one male doll. But it's mainly kind of marketed to men. Yeah. But you're right. Sex workers are a completely voiceless community in this conversation, and we so rarely hear them.
Starting point is 00:37:19 They've been saying it's not about sex. It's about enjoying the fact we don't want to do it. Yeah. I mean, in media, we see films like Pretty Woman or The Girlfriend Experience, and we think there are men who are looking for a connection, part of it's sexual, and it's easier for them to pay because they're busy businessmen, always travelling around. And then there's been women going, they enjoy paying us because they love
Starting point is 00:37:46 like, the fact we don't want to do it. They're not trying to seduce women. It's not too hard to go out and meet women in bars. It's because they like the fact. But it's like what you talk about in the book, which I think is so interesting, of, like, you know, a sex worker does have a voice, can say something, whether it will be listened to, obviously, is up to that transaction. But creating a world where this woman, this doll, has no voice, will do anything, can be treated as anything. And I think again, we're hitting this line where people go, oh, well, that's just happening in a room, it has no ramifications outside in the world. And as you're saying in the book...
Starting point is 00:38:20 Or being argued that it's cathartic. Yeah. Yeah. And sex workers will be the first to bear the brunt of this because the ones, the quotes that I include in the book are from sex workers saying, if this becomes normalized, then they're going to conflate this with sex work. They're going to come to us and expect the same kind of deal that they can do absolutely anything I like, that there's nothing we can say about it.
Starting point is 00:38:45 And some of the things... your job if you say no. Yeah. And then it will have repercussions for the real women that these men come into contact with. Of course, in just the same way that we've seen with the explosion of choking in online pornography that's having a massive impact on teenage girls, but sex workers will feel it first and we don't hear their voices in the conversation. And it just feels like there is this thing that is rushing ahead that is being marketed by men to other men as progress. And they're making sex robots with settings that you can switch on and it's a frigid setting so that you can break in and it will protest. Like, where are we going with this? How is this what we're pouring
Starting point is 00:39:26 our innovation and our research and our funding into? It's embarrassing, isn't it? It's embarrassing. It's like, here we are, and what are you spending all this money on? Like, oh, how can I make a schoolgirl available to me that doesn't have an opinion? It's just like, wow, that's what you want to do with your time and money. Well, it would be a good way of entrapping them. I mean, look, and again, hear me out. I'd put, like, inside the vagina, sort of like Venus fly traps, but made out of metal. Okay, mouse traps, big mouse traps with spikes. Okay. Every time a dick goes in, clamp. We got one. But then the face pulls back, the face pulls back like this, and a computer, a Kindle with only feminist literature, and they're clamped
Starting point is 00:40:06 and they have to clump. Oh, you're right, you're right. Hello, I'm Simone Dubois. I've got some reading for you to do, and then she reads it out. She reads it out loud, and then it goes right up to like today's feminism but they are clamped there and they have to listen for 24 hours. Yeah. Look, it's just my urge. That's my urge.
Starting point is 00:40:22 So it starts with Simone de Beauvoir. We go through, who is that American woman? Betty Friedan. Let me just go through the whole spectrum. Go through the whole spectrum. Some bell hooks in there. Yeah, yeah, yeah. And then in between, some songs.
Starting point is 00:40:37 There's some music stuff. Some Beyoncé and some Chappell Roan, just to cheer them up in between. Yeah, it's a break. You get a break. Yeah, brilliant. But then you'll come out a bit like this as well. You can buy your own diamonds, you can buy your own rings. I'm quite lonely and that would make me feel a lot better for my urges.
Starting point is 00:40:52 So I guess we have to build it. Like, you know, that's what I need. That's definitely what's going to cure the incels. So if there are genuine problems. We're joking because it's so depressing. But if there are genuine problems, like, one of them is that the world is getting better in lots of ways, and yet there are failures and holes. Where is it getting better?
Starting point is 00:41:16 I found that's the thing at the moment. It's like I feel this weird moment where like, you know, we had this wave of feminism and it was like, yeah, it's cool and we all agree, right? And then you can feel it slipping. But that's why it's called a wave because it goes forward and it goes back. And there is always push back. There are always holes in it. There are always ways around it. That's what I think actually is really sad about the technology.
Starting point is 00:41:38 And also, you know, all the Andrew Tate-ism of the internet to young men is because actually things were getting better, because the parents of those children had had different childhoods to their parents and there was an improvement in understanding. It does remind me of growing up in the 80s, because it's like we were coming off the back of the 70s feminism, and suddenly in the 80s, as a little girl, it was like, oh, you have to have blonde hair and big tits and wear nothing and be in a bikini. And that's, I remember growing up being like, oh. And be a pole dancer, but because it's really empowering. That's why you go into it. Or be a prostitute, because, like, you really want to write poems, and then Richard Gere comes along with a limousine. Yeah. And there's no slight on, that's not an anti-
Starting point is 00:42:16 sex-work sentiment we're saying, we're just saying that it was sold as that, that's why you would do it. Well, it was sold as a thing you did before a man came and rescued you. You could do that as a job if you don't want to be a secretary. But don't worry, Richard Gere is going to come and get you. But not her friend. You know about the original draft of Pretty Woman. You know about it as well.
Starting point is 00:42:34 So Pretty Woman was originally called, I think it was called, I don't know, how much did he give her? $5,000? $10,000? The original title was an amount of money, and she was a drug addict in the first draft. And so in the current draft of the film,
Starting point is 00:42:49 There's a bit where she's flossing. He breaks into the bathroom because he thinks she's doing drugs. And in the original draft she was doing drugs. And the ending of the original Pretty Woman film was that she took the $10,000 and went back and he dropped her off back on the streets. And then it got changed and they sort of disnified the ending to make it more of a love story.
Starting point is 00:43:05 Yeah, because everyone was like, well, this feels real and it's the 80s, no thanks. And also you really forget, if you rewatch it, that it does begin with a woman's body being taken out of a bin. And they say, oh, don't worry, she was just a crack whore. Yeah, it's very problematic. I don't think it's a romance. No, I would, I would say, yeah. Yeah.
Starting point is 00:43:23 But to go back to the 80s is that... Yeah, so I feel like we had, like, again, like you're right, it is waves, but it does feel, at the moment, like, you write in the book so beautifully, like it is... There are so many places that women's rights are just being rolled back, in a way that feels hard to get your head around, when 10 years ago we would have believed, no, that's not going to happen. Like, girl boss!
Starting point is 00:43:44 Girl boss, right? We're all okay. And I think this book is so important to, yeah, to make, not even just men, women like ourselves, who struggle to take this in. Also, I've got two children, right, and they're really, really young. And I've been asked a few times in interviews since having children, like, how are you going to, they don't say stop them being rapists, but that's what's insinuated. You know, as a feminist, how are you planning to bring up your children? I think my head is in the sand about it because they're so small, and I don't want to have to think about those things yet. But I was talking to my husband this morning about it, saying, I was coming to speak to you, going, we do have to know. Because even how Theodore, he's 3, how he watches YouTube, he's already tech savvy, he's already got access. Because, you know,
Starting point is 00:44:29 if I want to shut him up sometimes, I'll let him use my phone so I can have a wee. So already, I said, we already have to be aware of, like, parameters, conversations, even when they're really small. But I understand the temptation. Because, and also, I can imagine
Starting point is 00:44:49 if my child was 14 and he's the one who does something wrong, I would be the parent going, please don't expel him, his education is so important, he was just an idiot, he didn't know, we will deal with it now. And that's too late. But that's, like, I guess what I'm trying to say
Starting point is 00:45:06 is I can put myself in those parents' positions and go, this isn't your fault with a capital F. How do we make sure we are already giving it the headspace, having the tools, those kind of things? Do you feel hopeful? Or do you feel like there's a lot more work to be done before you can allow hope in? I feel very worried, because I think that we are in this David and Goliath moment where, like you said, there's always been waves and there's always been backlash, but the overall sort of
Starting point is 00:45:31 direction of travel has always been forwards. But we've done this weird thing now where we've put the backlash on steroids. We've given it this incredible heft of algorithmically facilitated mass radicalisation. I don't know how we get past this, because it really seems to me that regulation is the only solution. We're post-Adolescence. Everyone's going, what should parents do? What should schools do? What should teenage boys do? What should the fucking government do? And no one is saying, why are we not holding these tech companies accountable in the way that we could and we should? We had the Industrial Revolution. Like, so much regulation had to be put in after that. I'm like,
Starting point is 00:46:06 hey, don't let children work in factories. Hey, they shouldn't do that. You know, people should come out of work with both hands. That should be normal. And I feel like that's what we're having now, the same thing. Another revolution, where it's just, it's come too late, it shouldn't have happened this late, but I firmly believe in like 10 years' time children will not be allowed on the internet.
Starting point is 00:46:23 Like it will be like smoking. It will just be like, no, no, no, there's no way they should be allowed. I think we should all have a screen time limit. And I keep saying, when I'm elected, everyone gets two hours a day, and that's across any screen, computer, phone, pad and television, choose what you do. Oh, nanny state, that's what they'll come at you with.
Starting point is 00:46:40 Yeah, I want a nanny state, because actually you need it. Everyone's sad, haven't you noticed? The metaverse. Literally call it depression soup. Well, the thing that really got me was when you said 47% of online grooming offenses take place on Meta-owned products.
Starting point is 00:46:56 Yeah, that's an NSPCC statistic. And the thing is, over the period of the Industrial Revolution, we had 80 years to work it out and to get it right. This is going to happen overnight. Yeah. We have to act fast. And we have a long tradition in this country of legislating for complex issues.
Starting point is 00:47:11 Well, we do it with... We legislate for fraud. That's complicated and ever evolving. When they found out, you know, you can make children through IVF, or you can do this with genes, they do keep up with that quite quickly in Parliament, so that they are across what is possible and what is allowed. It's not impossible. It's complicated, and it does require cross-jurisdiction, compromise and collaboration.
Starting point is 00:47:33 But the reason we're not doing it isn't because of that. No, it's money. It's because he's standing behind Trump in the Oval Office. Yeah. And it's because of the fact that Britain is then trying to cozy up to Trump and, you know, putting the Online Safety Act on the table alongside a favourable trade deal, saying, we'll water this down if you'll give us that. They are prepared to throw women under the bus.
Starting point is 00:47:51 And we know that because at the AI summit in Paris, when they created a kind of agreement between countries saying, you know, we should all sign up to this standard that actually AI should be ethical and there should be some safeguards at design stage. The US and the UK government said, yeah, we're not signing that. Oh, Laura. Well, what can we say more than thank you? Like, thank you for doing the hard work. Thank you for writing this amazing book.
Starting point is 00:48:14 And for going into schools and being someone that, you know, young women and girls who are experiencing things and not being heard, I mean, they get to see that your books exist and their stories are important. And that thing is a wave. We can start from the ground up. Yeah. And like, I think it's good that we're not ending on, like, oh, it's fine. Like, it isn't fine. Like you said, it's not fine.
Starting point is 00:48:34 So there's no point trying to wrap this up and, like, oh, it'll be all right. Like, you should read this book. You should know this stuff. You should be having these conversations with your children or other children. And also asking other adults what they know. Did you know this? Did you know that? Because that's it.
Starting point is 00:48:47 We would all prefer to live in a world where it wasn't happening. But it is. Yeah. Yeah, exactly. And yeah, you have to deal with it 24-7. But for the rest of us, we could just spend a little bit more time investigating and thinking about it. You know when, like, I don't have this at school yet, but, like, there's a mum that does a lot, like baking cakes for the cake sale, and
Starting point is 00:49:09 sometimes she needs some help on the sale. And that's something like Laura's there. She's baked all the cake. She got the float out from the parents' room. She set the table up. Come on, guys. Need a few more. Not just the mums.
Starting point is 00:49:21 Laura, thank you so much. Thank you for having me. Thank you. Thanks for listening to the Weirdos Book Club. I'm on tour. Tickets for my new show, I Am A Strange Gloop, are on sale now from sarapascoe.co.uk. My new picture book is also available to buy now, called Where Did She Go?
Starting point is 00:49:41 You can find out all about the upcoming books we're going to be discussing this series on our Instagram, at Sara and Cariad's Weirdos Book Club. We'd love to hear your suggestions, or if you have any questions, burning book questions, please get in contact with us on Instagram and send them in. Thank you for reading with us. We like reading with you.
