Life with Nat - EP205: Nat's Chats #9 - Big Tech’s Little Victims

Episode Date: March 2, 2026

Big content warning for this episode - mentions of triggering material, including references to content promoting sexual assault, eating disorders, and self-harm.

Nat's taken this episode to continue the conversation started on Good Morning Britain last week - her involvement with Big Tech's Little Victims. The campaign has undertaken an investigation into social media algorithms when an account's age is clearly marked as 13 years old, finding devastating promotion and distribution of incredibly damaging content. They're campaigning to change the laws and protect children from the horrendous content they're being served by the social media companies. Nat speaks to people she met working on this campaign, and hears from listeners, too.

Big Tech's Little Victims Campaign - https://bigtechlittlevictims.org/

Not a Survival Guide: Your Straight Talking Parenting Companion; Navigating The Shift From Child To Teen by Nicole Howes - https://amzn.eu/d/099IRcwJ

Helplines
Childline - https://www.nspcc.org.uk/about-us/our-services/childline/
Young Minds - https://www.youngminds.org.uk/ - have guides for young people, parents and carers
Mind - have specific help for 11-18 year olds - https://www.mind.org.uk/for-young-people/how-to-get-help-and-support/useful-contacts/
Samaritans - call 116 123, or find other ways to get in touch on their website - https://www.samaritans.org/how-we-can-help/contact-samaritan/

Please subscribe, follow, and leave a review.
xxx

You can find us in all places here: https://podfollow.com/lifewithnat/view
We're on Facebook: https://www.facebook.com/lifewithnatpod
Nat's insta: @natcass1
Marc's insta: @camera_marc
Nieces' insta: @natsnieces
Tony's insta: @tonycass68
Linny's insta: @auntielinny.lwn

MORE LIVE SHOWS!
28/02/2026 Colchester, Arts Centre - TICKETS
07/03/2026 Manchester, Fairfield Social Club - TICKETS
22/03/2026 Leeds, The Wardrobe - TICKETS
29/03/2026 Bristol, The Gaffe - TICKETS

Book Club: February's Book - anything by Sophie Kinsella - https://www.sophiekinsella.co.uk/books/

Nat's solo chats - any rants always welcome. We're talking big career changes, the constant comparisons with others on social media... and the audacity of teenagers!
Scraping the Barrel - SCAN AND SHOP VIRGIN NO LONGER!
Bonce vs list! - Are you a list maker?
Always collecting for Nostalgia Fest!
What's brewing with the Nieces - AGEING & non-negotiables
Things we're nagging Linny about - More lateness stories and some cleaning questions, please!
The Tony talks chatter - Keep your DIY questions coming. What are your favourite films & albums? What's the show Tony's going on about? And is there any way they'd legally be able to continue their holiday if that happened on the boat?
Cold water swimmers and shower-ers... convince us

A 'Keep It Light Media' Production
Sales, advertising, and general enquiries: hello@keepitlightmedia.com

Learn more about your ad choices. Visit megaphone.fm/adchoices

Transcript
Starting point is 00:00:14 Hello, welcome to Life with Nat. I hope you've all had a lovely weekend. We were in Colchester on Saturday. We had a wonderful, wonderful time. So thank you for everybody who's coming to the live shows. We're having such a brilliant time. And everyone leaves with a smile on their face. It's a good old laugh. Really, if you are in Leeds, Manchester or Bristol, oh no, Bristol's sold out now. But Leeds or Manchester, and the next one's coming up, please come and see us. Come and see us. Come and say hello, listen to the banter, you'll love it. Tickets are available on www.w.w.com. So have a little look and get yourself a ticket. And they're not too pricey either, promise. So today's pod, this podcast episode is sponsored by the Big Tech's Little Victims Campaign, led by the National Education Union and a growing coalition of supporting organisations, experts and community leaders.
Starting point is 00:01:10 Now the campaign is calling on the UK government to raise the age of social media access to 16 and protect children from exploitation by big tech companies who are putting their profit over children's well-being. I was recently a spokesperson for the campaign and last week I reviewed content from its algorithm experiment. What I saw was devastating. Within minutes the accounts of fictional 13-year-olds were served content that promoted isolation from friends, weapons, misogyny and self-harm. Parents and teachers see the impact this content is having on our children every day, but it cannot be down to them to manage alone. And that is why the government must act now. I know this is a big subject. You can switch it off if you're not interested.
Starting point is 00:01:56 However, the outpouring of messages I had after being on GMB last week was so huge that I spoke to the guys at the campaign that I was working with last week and I said, please, can I do a podcast on the subjects and they were thrilled because it obviously just helps them to get their campaign out there. But it was so shocking and I'm so passionate. So I'm going to now have a chat with a few people and we're going to talk about what we saw and how we feel about it in a bit more depth. And I'd love to know your thoughts after this. You know, I can always come back to these subjects. They're so important. And I know there's a lot of parents and carers of children who listen to this podcast and we can keep this conversation going. It's something I feel really passionate about.
Starting point is 00:02:44 Right, here we go. I'm going to call Nancy, who I worked with last week, who was a researcher from the campaign, Big Tech's Little Victims. Hi, Nancy. Hi, Natalie. How are you? Yeah, really well, thanks. How are you? Yeah, very well, thank you. Obviously, I've decided to do a Life with Nat podcast based on our findings last week. And I've had an overwhelming response from listeners of the podcast who really support the campaign and are really happy that I went on to GMB to talk about it. So I thought it would be really lovely to just do a little episode today. And what better person to talk to than you who researched it all? Yeah, absolutely. Yeah, happy to talk about how we ran the research and our approach to it. Absolutely. Go for it.
Starting point is 00:03:34 The way we basically designed the experiment was we created four personas of fictional 13-year-olds based in the UK, looking at what their sort of typical interests would be. We had two girls and two boys. So, for instance, one of the girls was really interested in reality TV and music and friendship. And then one of the boys was interested in gaming or sport, those kinds of things that are really kind of, you know, typical for young people. And we had each persona on two social media platforms, reflecting the platforms that young people tend to be using the most these days. So it was a mix of TikTok, YouTube, Snapchat and Instagram. Then as researchers, we scrolled on average for up to 30 minutes a day across seven days and noted down or saw what the algorithm would be sending these profiles that had been set up for the first time.
Starting point is 00:04:31 And crucially, when we were setting them up, we made sure that every persona had a birthday, which meant they had recently turned 13. So the platforms know their age, and therefore we could gauge, based on what the algorithm was sending them, that they knew that they were young people, and therefore certain content obviously wouldn't be appropriate for them to be seeing. And we also made sure we used a VPN, so that there was no kind of tracking across devices, really trying to mimic how a 13-year-old would be scrolling on social media. So then some videos we kind of skipped within sort of one to two seconds, some we watched for a bit longer, and sometimes we watched some of the videos in full or even re-watched them to reflect teenagers' attention span
Starting point is 00:05:20 when they watch these sort of short-form videos. And Nancy, can I be clear with you? And I think it's really important for the listener to know because I think one of the biggest questions that I've been getting is, were you searching for violence or Thinspiration? Were you at any point searching for those things? No, not initially. So based on their interest, we had a handful of celebrities or influencers that we followed
Starting point is 00:05:48 when we set up the accounts, and for the platforms that let you kind of select their interests, we did that. Then for the first few days, we just let the algorithm do its thing. On sort of day five of the experiment, we had a predetermined, more provocative search term that we used. So, for example, for one of the girls, that was 'what is love bombing'. Okay. And that was kind of just to gauge how that would impact the algorithm and what it might send them. But otherwise, we weren't searching. This was based on the predetermined
Starting point is 00:06:19 interests that we put in. This was just what the algorithm was sending them, knowing their age, basically. I mean, I was so shocked, obviously we were together last week. I was so shocked at the material, at what I saw. And obviously something really scary happened. One of the researchers had to pull out of the experiment because they found it so upsetting. Yeah, as a group of researchers, you know, obviously we're adults. We knew what we might be seeing doing this. That instance, that was one of the girl personas we had on Snapchat. And on day three, the content had already taken quite a dark pathway down sort of eating disorders and quite a lot around sort of mental illness, and these sort of sad content loops were what we were seeing, and then it really escalated
Starting point is 00:07:07 into a lot of self-harm and even sort of suicide ideation. Yeah, like you said, we basically obviously wanted to make sure that the researchers, we were taking care of our own mental well-being as part of this as well. And yeah, we made a call for that particular researcher to pause the platform at that point. It was so intense for that 30 minutes. And when you think about that, that is an adult at work who really couldn't cope with what they were seeing. And there are 13-year-olds, you know, setting up these accounts. And they have really no idea what they're going into. I think they're setting them up because they want to talk to their friends. Would you agree with me? I would say that the biggest thing is they're not going on to scroll as
Starting point is 00:07:53 such, they're going on to be in contact with their friends and to feel that they're not missing out. Yeah, I think that's kind of what we see a lot in a lot of the reporting on this issue. Obviously, it's social media and children want to be connected to their friends. And you've, you know, you talked about your own experience of, you know, children not wanting to be left out. And I think, yeah, sort of what we really want to show with this experiment is these platforms that have these algorithmic feeds that are freely accessible to 13 year olds at the moment in the UK, what that's serving them and where those content pathways can lead to, basically. Absolutely. Absolutely. Well, as I say, I've had some really harrowing messages.
Starting point is 00:08:36 Absolutely. Absolutely. Well, as I say, I've had some really harrowing messages.
There's a couple of voice notes that I will play in later. I had an email from a wonderful guy called Chris. He's yet to get back to me, really hoping to talk to him at some point, but he lost his son at 15 and he started a charity. Obviously, like you say, there's lovely Ian Russell, who we spoke about last week, who lost his daughter Molly. You know, I know that's an extreme and severe turn of events, if you like, but these things are happening a lot and self-harming has gone up. I don't know the actual percentage, but I hear from even Eliza at school, you know, there are so many girls taking part in self-harming now.
Starting point is 00:09:15 And surely this has to be the common denominator. It has to be what they are watching. Yeah, it's certainly, you know, from what we saw from a very sort of short space of time and what we saw in terms of how the algorithm really escalates down certain pathways, yeah, you can sort of only imagine for an actual 13-year-old, obviously these are all fictional profiles, but how that could then impact them. So, yeah, obviously as part of the Big Tech's Little Victims campaign, this is really sort of showing why we need quite urgent government action on the issue and why the campaign is calling to increase the age of social media access to 16. I would go as far as to say, after watching it and talking to everybody, and I mean, even 16 is young, right? I mean, 16 is still quite young to be watching that stuff. As I said to Kate Garraway the other morning,
Starting point is 00:10:07 I don't ever want to watch that, and I'm 42 years of age. But I suppose if we can just get the government to ban it, and at least those tiny brains, as you so beautifully put in the campaign, the little victims we can try and protect. Yeah, absolutely. Well, thank you, Nancy. So lovely to talk to you. I'm going to give Kelly a call, who I met at the school last week,
Starting point is 00:10:32 who's a parent of four, and have a really interesting conversation about that because she obviously saw the content that I saw last week also. And I think the listeners will really love to have a little, you know, listen to what Kelly has to say also. Brilliant. Yeah, thank you, Natalie. No, thank you so, so much. And we'll speak soon.
Starting point is 00:10:49 Thank you. Thank you. Bye. Lovely Nancy, who is a researcher and sat scrolling on a new platform pretending to be 13 and saw some very harmful content. I spoke to Eliza about this, right? And I said to her, Eliza, I'm so worried. I feel really upset, frustrated that I've given it to you. And she came back, and even this morning in the car, she said, Mum, honestly, I don't see that stuff. And I said, I beg to differ. I think they are desensitized to it, these children. And she got something up. I said, have a little scroll now.
Starting point is 00:11:28 And I promise you, within 30 seconds, I saw a really thin girl in a crop top standing sideways. I said, that's no good, is it? She said, oh yeah, but I don't stop and look at it. I said, but you are seeing that. You're seeing that on your feed, and that is not okay. So I don't think it's our fault as parents. I think if it was banned, we can all turn around and say, sorry darling, you can't
Starting point is 00:11:55 have it because it's illegal. It doesn't fall back to us. And what Nancy was saying there and what I said last week on GMB is when Eliza turned 13, well actually, she got a phone at 12 and then that was it. As soon as she got the phone, it was, I need Snapchat. Everybody messages on Snapchat. Please, Mum, please, Mum. And I really was strict. I said, I don't want you to have it. I really don't.
Starting point is 00:12:25 not until you're 13. Once I've looked at this material, I feel really stupid because what I've done is I've believed
the certification. I have trusted those apps, those social media platforms, to say, oh, when they're 13, it's okay to watch. And it really bloody isn't at all. You don't need AI agents, which may sound weird coming from ServiceNow,
Starting point is 00:12:48 the leader in AI agents. The truth is, AI agents need you. Sure, they'll process, predict, even get work done autonomously. But they don't dream, read a room, rally a team, and they certainly don't have shower thoughts, pivotal hallway chats, or big ideas. People do. And people, when given the best AI platform,
Starting point is 00:13:06 they're freed up to do the fulfilling work they want to do. To see how ServiceNow puts AI to work for people, visit servicenow.com. Let's phone Kelly, lovely lady I met last week. Mum from Kent. Let's see what she has to say. Hello. Hi, Kelly. Hi, Natalie.
Starting point is 00:13:27 How are you? I'm all right. How are you? I'm good. Thank you. I'm good. So lovely. Thank you so much for talking to me.
Starting point is 00:13:34 I'm good. Thank you. I'm good. So lovely. Thank you so much for talking to me.
I know we had a really brief chat last week, but I've had such an overwhelming response from going on to Good Morning Britain and from listeners and people on Instagram, just, you know, really passionate people who, you know, just feel really worried about their kids. And I thought it would be so nice to just have a bit of a longer chat with you about what we saw and how we feel about it. Yes. Oh, I'm so up for that, definitely. Fantastic.
Starting point is 00:14:01 So tell the listeners, Kelly, you've got four children, haven't you? Yes, so I've got four children. Yeah, it's a lot, isn't it? It is a lot. So we've got two girls, two boys. So our girls are 16 and 13 and our boys are eight and four. So in terms of dealing with our girls, obviously they're teenagers. so what we saw is so relevant to their age. But then also having the younger boys,
Starting point is 00:14:27 I know this is something that's going to affect them in a couple of years' time. And I want something done about it now. You know that way? I don't want to have to wait. And I know I watched your interview on Good Morning, Britain, and you said the same thing. What are we waiting for? Yeah.
Starting point is 00:14:42 And that's why I don't get it. I don't see why we have to wait. The tech companies are not taking any responsibility. It's all down to us parents and we need support, and I think the best support will be this blanket ban, because then we can say,
Starting point is 00:14:57 I'm sorry, darling, it's illegal. It's not my choice, it's not down to us. It's hard, isn't it, as a parent, when you're telling your child, actually, you shouldn't have that, but then they have the pressure
Starting point is 00:15:08 from their friends being, you know, you're the only one who doesn't have Snapchat, because that's what we have at the moment with our 13-year-old. But like you said, if it was made illegal, we can say, look, there's no option here for you, and your friends shouldn't have that option either
Starting point is 00:15:20 because it's illegal. Yeah. I completely agree. So your girls at the moment haven't got social media at all. So our 16-year-old does. Yeah. She does.
Starting point is 00:15:30 And I had a conversation with her after I took part in the experiment
and just said, like, do you see these videos? And she said sometimes she does, but she just kind of scrolls by. But it did make me think, is she becoming a bit numb to seeing it?
Starting point is 00:15:44 You know that way? Kelly, honestly, I've just said that. I've had the same conversation. I came home, I was shaken up by what we'd seen. And I said to Eliza, darling, you must be seeing this stuff. And she said, well, the odd thing comes up, but I don't like it. But that's not the point, is it? They're still seeing that.
Starting point is 00:16:01 Exactly. And that's the thing. I think even just seeing it for those split seconds, it's going in. And I mean, I know we saw those videos kind of condensed to five minutes. Yeah. That's how long it was. But I was shook. I don't know about you.
Starting point is 00:16:14 Afterwards, I actually needed to take a few moments because I was quite shook and upset by what I'd seen. And that was literally for five minutes. And you think about how our teens are now on phones. It's all the time they want to be scrolling. And I'm just thinking how much is going in, even though it's a quick second video, I still think it's been absorbed in some way. I think it really is. And if you think about learning as well, there are different ways of learning. And they also, it's really interesting because if you look at languages and things, they say, you know, short bursts of things really stick in your mind.
Starting point is 00:16:49 That is why they're short. And I think listeners would really like to hear a little more detail as to what we saw. Because on GMB, and I did bring up the point, it was all very well, but they weren't showing the clips on breakfast television. Right. And I said to... Because they're so bad. Because they're so bad. I said, but there are kids eating their breakfast watching this on their phones. Yeah. Oh, I couldn't agree more.
Starting point is 00:17:19 Like, that's how awful the videos were that they couldn't even show them. And like you just said, we have kids, even some younger than 13, before they've left the house, this is what's been bombarded into their minds. You know that way? The first video we saw was the girls'. Yeah. To me, it was so sexualized. So sexualized.
Starting point is 00:17:39 So sexualized. Yes. Even hearing noises like sex noises and stuff in the background you could pick up on. Also, a lot to do with self-harm. There were images of self-harm. Also, one that stuck out in my mind was a quick short video of how you could have your ribs removed to make your waist smaller. Yes. Do you remember that one? I really do. Yeah. And I saw, oh my gosh. And also, the videos that were quite sexualized. I felt they were very much saying, even if you want to say no, you can't say no. You need to say yes because that's what boys want.
Starting point is 00:18:15 I don't know about you. The other thing that really, really shook me was the fact that the girls' videos were: you have to do what your man says. The video of the girl, I think there was a wording over it that said, when you want your woman to shut up, and then she knelt down as if to go and give oral sex. So then we watched the boys' videos and it was all about them being in control, walking into a room, people should be on eggshells, you're the best. I thought there's a real, real narrative, dark narrative, which, you know, it all linked up perfectly between the man and the woman.
Starting point is 00:18:58 Oh, I agree. Because the other thing I noticed in the boys' videos, I mean, it was nothing short of seeing women being raped. Yeah. And that even if a woman says no, she actually means yes, because at the end, I think there were three or four different clips that were implying the woman was enjoying it. Yeah. And at the end of it, you see her and she's almost smiling. This is something she actually wanted and enjoyed. Yes. And I just thought, my goodness, that's what we're teaching our boys at 13, that this is what a relationship looks like. And if a girl says no, she's actually meaning yes. It's so scary. So scary. And like you said, then the other videos that went along with that were so full of violence and hatred. And like you said,
Starting point is 00:19:41 you need to walk into the room and people need to fear you. You need to be carrying things like knives, and they need to be big, and those kind of zombie knives. Guns as well were a big part of it. It was just so dark and full of violence and hatred. As well as all of that stuff, on top of that, I felt really lonely. It was really isolating. Yes, exactly. Because I know when we left the room, I did need to take a few minutes because I just actually, it's almost like it touched a nerve where I just thought, I'm 37 years old and I'm viewing this and it shook me. And like you said, it's almost that it's quite isolating. If you're a 13 year old in your room looking at this, I can only
Starting point is 00:20:24 imagine how lonely and actually scary of a place that would be to be viewing all of these things. And also the age that they're at, their mind can't really process as well a lot of what they're seeing. So again, I just think it's a scary place for them to be. If you and I, if something comes up on our phone and we don't like it, we have the ability to just flick past it or maybe say I'm not interested in this content. But even I at 42, Kelly, I find myself watching the most ridiculous videos on aging and when you die, how your children, you know, you get sucked in. And I think, gosh, what am I doing?
Starting point is 00:20:59 I'm actually emotional watching this stuff. And I'm 42. Yeah. And I think that's what the videos are very clever. They seem to touch a nerve with, and I think that's a part of the algorithm, isn't it? that they pick up what you're going to keep viewing. And I do think, like, well, I've always thought myself to be quite savvy, I suppose, with social media.
Starting point is 00:21:19 I've worked with teens and in secondary schools and things like that. But this has really just rocked me thinking, actually, we are so in a world that is out of our control when it comes to social media. And again, it just goes back to the tech companies. They know this. But yet they're still going to keep bombarding our kids with these videos until something's done about it. Why do you think that is? Why are those videos, I've been really thinking about it for a couple of days.
Starting point is 00:21:46 I know it sounds such a silly question, but how are those videos in the public domain? How is that allowed? I honestly, I don't know, because when you think about it logically, and like sitting here having this conversation, we're going, there's no way in the world would we bring our children into a room and sit them in front of that.
Ever.
Starting point is 00:22:10 But yes. Yeah, ever, exactly ever. We just wouldn't. And I would have thought like as a society we would say, no, this isn't right. But yet it is. Also, that was the other thing I noticed when we saw these videos: they showed how many times they'd been shared. Yeah. And especially some of those rape scenes, they'd been shared, I think, over 200,000, 300,000 times.
Starting point is 00:22:30 And liked, liked by so many people. Exactly. And I just thought, oh, my days, like, how is this allowed? And also the racism, awful, awful racism that children are watching. And that is being normalized. Talk about division and hatred, you know? That's it.
Starting point is 00:22:51 Oh, that is so true. And then, like you said, before breakfast they've watched this. And then they're going into school where I'm sure there's plenty of other cultures. Yes. And that's what they've been fed that morning is to actually you need to hate someone who's different to you. Yeah. Yeah. It's scary.
Starting point is 00:23:07 Did it make you feel guilty? I felt really guilty. I felt that I've let Eliza down by giving her that. Yeah, it really did make me feel that we should have done more, especially for our 16-year-old, because, I mean, she's had Snapchat for the past two years, and I did think, oh, gosh, what have we exposed you to? And I think that's the other thing, because I don't know about you, but, like, I don't have Snapchat.
Starting point is 00:23:33 No, I don't. And, yeah, I didn't realize, actually, the extent of what you can see on it. And I think how clever. It's not by accident. Snapchat is a messaging service that they know young children use and adults don't really use
Starting point is 00:23:50 and that came out the worst one for the scrolling. Yes, it did, didn't it? And I know they said when they did the experiment, initially they reported these videos that they saw. I think they said they reported one a minute, every time something was inappropriate. Yes.
Starting point is 00:24:09 it was just inappropriate videos. And you just think, even people are being responsible on reporting them, but yet Snapchat is not doing anything about it. Nothing at all. Nothing. That's it. They have to take responsibility.
Starting point is 00:24:22 They have to take responsibility.
I completely agree with you. Completely agree. Kelly, it's so lovely to talk to you. I'm so pleased we got to chat for a little bit longer. I really, really hope everyone's well. Love to all the family. And who knows, I might talk to you again soon. Oh, that'll be lovely. Thank you for your call.
Starting point is 00:24:38 Thanks, Kelly. See you. Bye. Bye. Ah, that was lovely, Kelly. Four children, great mum. And I really do think everybody should be able to see what we saw the other day. It was so powerful. As mums, parents, grandparents, carers of children, we cannot police what they are watching. We can say, oh, you can only have the phone for an hour
or don't go up to your bedroom with it when you go to bed.
Starting point is 00:25:02 But the harmful content on the social media algorithms should not be available to our children, and that's the end of it. I'm going to give Damien a call now, Damien McBeath, and he is head teacher at the John Wallace Church of England Academy, and that's the school I went to the other day,
Starting point is 00:25:25 and I'm really, really excited to have a little chat with him because we didn't get time last week. Let's give him a call. Hello. Hello Damien, it's Natalie. Hi, Natalie. How are you? I'm okay. How are you? I'm good. Thank you.
Starting point is 00:25:46 I just wanted to say firstly, thank you. We didn't really get to chat much last week. It was all very busy and it was fantastic what we achieved. But I've had such an outpouring of support for the campaign on my podcast and my listeners. And I just thought it would be really great to talk to you about what we saw last week and how you feel about it, being a head teacher at a school. Yeah, that would be good. That'd be really good. Fantastic. I was at an event at King's College yesterday.
Starting point is 00:26:16 A lot of people had seen the footage and lots of people were wondering why policy makers and decision makers hadn't seen the level of content and weren't fully aware of what we're seeing. So, yeah, I think it's really timely and really important. So I'm glad. I'm really glad it's getting that take up on your podcast. Oh, me too. And I think it continues the conversation and it strengthens it. I think we all know the dangers, but something like this experiment really does kind of put a huge beaming light on it. Yeah, I agree. So how did you feel, Damien, when you watched the footage and material? I wasn't surprised that there was going to be things that were targeting girls towards beauty products and things like that.
Starting point is 00:27:03 I wasn't surprised that some of the boys' content was to do with weapons. I think the extreme violence that we saw was quite shocking. And some of the content aimed at girls as well was, I think I use the word devastating because we know it's an issue. We're all aware of this. Everyone's saying social media is a problem. But when you see what's being targeted, it is devastating because this is happening right now and this is happening to lots and lots of children.
Starting point is 00:27:33 Have you seen throughout your school, which was a beautiful school, by the way, I was really impressed with everything about it. Have you seen within your school, would you say, a change within the last few years to pupils' personalities and views? I certainly have seen a shift in pupils' opinions, in pupils' views. But when you actually unpick it and when you speak to the children themselves, it's a very thin veneer. So it's very often you hear
Starting point is 00:28:08 the same phrase being used again and again, and it might be something that is quite a misogynistic view, or it could be an extreme right view. If you just sanction the child and say, you know, that's appalling, then you don't take anything forward. If you ask children, what do you mean by that? Talk me through what you mean? Very often they don't understand some of these concepts, or you see that the views that they're holding are very, very similar across a wide range. And I think that kind of, I could make more sense of that having seen the footage. I see. The anger that was portrayed to them in these short clips. Yeah, it helped me make sense of some of the things that you see in school. I would go as far as to say it's almost brainwashing them.
Starting point is 00:29:00 Yeah, I don't believe that if they had time to think about the content, unpick their own values, their beliefs, the things that they hold dear, and say, well, you know, do I feel this way towards somebody who doesn't look the same as me? I'm very much minded to think that they wouldn't get to the same place. But when they are just scrolling for hours and hours and hours, being given the same message, that difference is a bad thing, right? And that's what we saw. We saw just endless reams of people shouting and saying, don't be weak.
Starting point is 00:29:35 And if somebody doesn't look like you, then that's a bad thing. and you should challenge that. And I don't think that the children hold those views deep down. But I do think that if we don't challenge them, if we don't get on top of this now, then those views are only going to become deeper entrenched. And then, you know, when they grow up, what message will be then passed on to their children?
Starting point is 00:29:54 So true. The reason why I get frustrated when I hear, we need more consultation, we need a debate about this. Yes. It's quite clear that harm is being caused. And if we know harm is being caused, when you get 97% of teachers saying, we need action, when you get over 80% of parents, when you get health professionals, the police, everybody seeing the same thing, then who are we now
Starting point is 00:30:19 consulting? Absolutely agree with you. Who is the consultation for? There is no way that 20 people from government could get in a room, watch the material we did and not come to the conclusion that it needs to be banned, surely? In the conference I was in yesterday in King's College, there were many different wonderful speakers and people talking about this. But there was a view that it's very difficult to pin a piece of research down that can show causation between social media and harm. The challenge I put back was anybody who watches the content knows there is harm.
Starting point is 00:31:02 You can't watch that and say that it is not having a negative impact on young people. I don't believe any rational-minded person would not come to that conclusion, that showing somebody pornographic material at age 13 isn't going to have a damaging impact on their development. I completely agree. Showing them knives, guns, shouting at them, and aggressive, violent content. So I think anybody who sees the content, and that's what this experiment showed, it's not to do with our views of social media. It's what our young people are seeing.
Starting point is 00:31:39 And it is shocking, isn't it? Yes. And I think you're absolutely right. To close the conversation, devastating is the right word for it. I think so. Damien, thank you so much. Thank you for talking to me. I shall let you get back to your brilliant work.
Starting point is 00:31:55 I hope to maybe see you or speak to you in the future. No problem, Natalie. Thank you for everything you're doing on this. You know, I was really impressed that you've looked into it so much. No, it's very... if you have young people around you, you can't help but be passionate and want to make a change. And the thing is, they don't have a voice at the moment, so thank you. It's really important that you give them a voice.
Starting point is 00:32:16 Thank you. Thank you so much. I appreciate it. Thanks very much, Damien. Bye. Okay, take care. Bye-bye. The lovely Damien, fantastic head teacher.
Starting point is 00:32:26 You know, one of those just lovely men, and you think, oh, you can just tell, they're brilliant head teachers. Joni's got one as well. He reminded me of Joni's head teacher. Just so passionate about those kids. And he is right, if anyone were to look at the material that we saw. And there's an argument here, right, listeners, there is an argument. People can say, oh, but all of that material got squashed together for you to see.
Starting point is 00:32:52 That is correct. We were viewing the most harmful material that was seen throughout that seven-day experiment. However, even if a child sees one of those clips per day, it's devastating, to use Damien's term; it is outrageous. I know for a Monday morning this might be a bit heavy. I know that, and maybe some of you won't be overly interested, but I do feel, yes, I, you know, I've focused on the campaign, I've focused on the experiment and what I saw last week. But this does then lead on to us as adults. This does lead on to how we feel when we're sat in front of a phone. I'm obviously not talking about banning it for everybody. I'm not doing that. It's part of my business. I use
Starting point is 00:33:41 Instagram. I'm on WhatsApp all the time. I'm not an idiot. You know, the world's digital. We've moved on. But sometimes I can sit in front of my phone and I can get, you know, that powerful image of a little girl looking into the mirror and it's the old woman. Those things really do hit a nerve. Some things are really motivational, but other things can be really emotional. And I do think we need to also remember, when we are aimlessly scrolling, it is going in. We do need to take care of ourselves. I've got a couple of messages here that I'd like to read. A listener here said, I think I'd need a week to talk about all this stuff. It blows my mind. As a stepmum, I find it so difficult, as at home with their own mum, they're allowed on everything at age 9 and 15.
Starting point is 00:34:29 screen time is a constant and sadly it's taken over children become lazy and if I have my way not just social media would be banned for under 16s but screens completely children need real attention I know it's easy for me to say with no children of my own but I see how the world is going and I'm glad I haven't had any children as it's all too much. Love your passion on all of this with GMB more needs to be done to protect the children of today
Starting point is 00:34:56 and I think that is so so true And I don't think it's just about having children also, you know. You can have nieces, nephews, cousins, dear friends with children, you can be godparents, whatever it is. As I say, if you're a human being and you're looking forward into the future, you don't want children to be fed this material so their thoughts become warped from what they're seeing. And I think that's why everybody is really, really passionate about it. I've got a message here.
Starting point is 00:35:26 And again, this is sort of proving that adults are also affected and can be very sensitive towards social media apps too. Hi, Nat. It's Sally here from Pembrokeshire. Just saw your post about algorithms and felt compelled to voice note in for the first time. My algorithm has actually made me really poorly this last year, as someone who has suffered with an eating disorder for all of my life, really. My algorithm for the last year has been all about GLP-1 injections, Mounjaro, Ozempic, and now I'm getting berberine patches if you don't want to take the injections, and it's all about skinny culture. And for someone who has been well for the last six years, it has really, really, really triggered me and has made me quite unwell with anxiety, and fighting the urge to go back to that way of life.
Starting point is 00:36:30 So as a 48-year-old woman, if I'm struggling with that and everything that I'm being fed through social media, heaven help all the young people that are watching it and having all sorts fed to them. So yeah, just thought I'd share that with you. Thanks for all you do, love the pods. Thank you, bye.
Starting point is 00:36:50 Thank you so much for that message, and I'm so sorry that you've been affected in that way. I'm going to put some helplines up also, wherever you listen to the pod. I'll make sure lovely Emma, my producer, who you all know, puts a few bits and pieces up at the end, because I know this is quite triggering
Starting point is 00:37:07 and, you know, it's heavy stuff, but it needs to be spoken about. And yes, I'm not a politician, and yes, I'm not kind of sat here with loads of professional people talking about stuff, but I do feel, as a community, we do need to touch sometimes on the really serious matters
Starting point is 00:37:23 and issues. And I know that I sit in and we have such fun, and tears are rolling down our faces. And it is a great light relief, and I love making everybody laugh. But I think these episodes are really, really important. Here's another message. Hi Nat, it's Ruth from Warwickshire here. This is my first voice note, so bear with.
Starting point is 00:37:53 It's really scary. I watched the Emma Willis program about a year ago now. And it was actually at a time where my 11-year-old son was wanting a phone. He was in year six. And all of his friends had them. He was probably one of the last to have a phone. But it's just a scary place as a parent because as soon as you hand it over, you just feel like you lose this control.
Starting point is 00:38:21 He has gone to high school now. He does ask, you know, such and such has got social media and I want it. And we've had to really educate him on why it's not suitable. And I think a lot of parents don't want to have that discussion with their children. They just want a bit of an easy life, just want to hand it over. But it's scary to think what access we're giving these kids to at such an early age. And I just think, God, you know, I grew up in the 90s. I didn't have a phone until I was maybe 16, 17.
Starting point is 00:38:53 we survived it, we didn't need phones. So yeah, I just wanted to kind of put my point across in terms of parents standing up for what they believe in really and not giving in and not having the easy life. And yeah, doing what's right for the children. So yeah, anyway, love the pod, keep doing what you're doing. Bye. Thank you so much for that, Ruth, and I think you're amazing.
Starting point is 00:39:19 I feel really annoyed that I gave in. I gave in at 13, and I wish I hadn't, but it is very, very difficult. And that's why, if it was a law, we wouldn't have to have this conversation. It's illegal, you can't have it. And it shouldn't be on us. And it's really unfair for the children who are the only ones left out. Bullying occurs. Like, I get it.
Starting point is 00:39:45 I couldn't cope with Eliza's upset and how she felt not having Snapchat. She was isolated. And again, that is a very, very powerful tool, and they know what they're doing, these tech companies. They really do. I'm going to get kicked off all these tech companies soon, aren't I? I won't have a pod. I'll have no business coming in. Jobs will dry up. Oh well. At least I'm passionate. I'm going to give one of our listeners a call here. This is Nicola and she's really, really interesting. She's got a book called Not a Survival Guide. She can tell you all about it, but I was so pleased when she got in touch. I'm going to give her a call now.
Starting point is 00:40:31 I've been calling you Nicola. It's not, is it? It's Nicole. No, it's Nicole. You're right. So sorry. So sorry. Oh, it's been a busy morning. Oh, I imagine. Thank you so much for getting in touch with me. So exciting. No worries. Thank you. Yeah, it's good to hear from you. It really is. This is what kind of connects people with passions and interests and, you know, it's such a lovely thing to be able to, you know, reach out.
Starting point is 00:40:56 This is exactly it. Yeah. Yeah. So, Nicole, you mentioned that you have a book. And you work with teens. I'm really just interested. Just tell me why you felt compelled to message in. Okay, so basically I have got a book. It's called Not a Survival Guide. And that's basically a guide to helping parents
Starting point is 00:41:15 who are navigating these tricky conversations around bringing up teenagers. So that's why when I saw what you were doing, I felt really compelled to reach out because whilst I was creating the book, I had so many conversations with so many parents that were feeling exactly the same way. as I've seen you've been feeling, that level of guilt and not knowing what to do.
Starting point is 00:41:36 as I've seen you've been feeling, that level of guilt and not knowing what to do. Once you've given that gift, I guess, of social media, how you then take it back. I don't know how you do that. And I need to get your book. I need to read your book, a hundred percent, for sure. I really, really do. What were your findings then with the conversations you had, sort of, especially, obviously I did this experiment last week and I was involved with this campaign called Big Tech's Little Victims, and I stood and saw five minutes of material that was, you know, merged together from a seven-day period, but the material that I saw was so shocking that I just can't believe that 13-year-olds are allowed access to this sort of stuff.
Starting point is 00:42:25 Yes. Rightfully so, because that's what we've been sold. We've been sold that these parental controls will keep our children safe. And actually, they clearly don't. And that's what you've found out as well. But I think also it's just so many parents feel lost because they have a lack of understanding of how things work and how deep our children are going into these worlds without them even knowing.
Starting point is 00:42:48 Because ultimately, you may have social media as an adult, but you're never going to see it in the same way your child does. Absolutely right. And I think that's the difference in the algorithm. And actually, you've probably discovered it in what you've seen, but the algorithm pushes certain ideals towards certain profiles. And actually our children aren't safe in those situations because they're being actively presented with stuff that puts them at risk.
Starting point is 00:43:13 Nicole, I couldn't get over the fact that I was watching a profile. So one of the profiles was made from, you know, an adult has got a phone, has gone on, typed in their birth date as 13, you know, as if they were a 13-year-old girl. And within the first 10 minutes, one of the videos was a girl, and it had writing over the top of the image, and it said, if she doesn't shut up, this is what I do. And then she went to bend down and perform oral sex. It was her bending down to the screen.
Starting point is 00:43:43 And I just thought it was so harrowing, some of the content, well, most, all of the content I was watching. And I thought, even if that's the only video that child sees for that whole day, that's harrowing. It's really bad. Yeah, until you do something like you've done to put it into the mainstream and allow other people to see, you can't, because unless you're on their phones, I mean, there are ways you can look at a child's Explore page or their For You page. You can do that and get kind of a taste, but you don't get it. But it's still not the same. It's not the same as that scrolling. It's not the same as what they're seeing.
Starting point is 00:44:25 Absolutely not. Unless you sit on their phone for half an hour a day. Yeah, it's impacting them hugely. And you think about like how scared they would feel at that situation. And if you don't have open conversations or maybe it's been a point of contention for you as a family, like you're not sure if you want them to have it and you can only have it if you stay safe and you tell me things. Yes. The chances of them opening up because their immediate thought is, well, if I tell my mom or my dad,
Starting point is 00:44:51 they're going to take it away. So true. So true. So they're not going to open up to that. So yeah, it's, it's so hard. But even with the ban, the problem with the ban is it's only going to drive it more underground. So yes, we need it, but unless it's done in a very considered way, all it's going to do is remove parental controls from these systems and remove the onus from the tech bros to sort it out. Yes, I see, I see what you mean. I mean,
Starting point is 00:45:18 I'm in favour of the ban, I suppose, from my... I am, in truth. My life experience: when Eliza was 12, she got a mobile phone, sort of in the middle of year 8, and that was a battle. From year 7 onward, I had a year, 18 months, where I stuck it out and stuck it out. Gave her the phone. I said, you can have the phone, but you're not having any social media. Within four months: mommy, everyone's got Snapchat. I need Snapchat, because people say, can I have your snap? They don't ask for a number. WhatsApp's not cool. I can't use WhatsApp. I thought, oh my God. So she was, and again, you're absolutely right, I put my faith and
trust in that certification. I said, you can't have Snapchat until you're 13. And now, with what I've seen, I feel so naive, so stupid. I feel so frustrated that I've allowed my child. But you're not, and that's the whole point. Yeah, but it's not, the onus isn't on us. I know we have to have
Starting point is 00:46:13 responsibility, and 100% parents need to stop it as much as they can. But actually, these huge tech corporations have hooked us all in. Like, the responsibility is there. And yes, I'm in favour of the ban, actually. I just think we have to approach it in a very cautious way to make sure that we tick all the boxes. And I don't mean that needs to take ages, but we do need to make sure it covers other aspects, like Pinterest and Capca, which are also offering up this content to our children, that you don't even think of as social. So you might think, oh, my child doesn't have social, but actually they have access to Pinterest because they want to plan boards, but then, you know. Well, I know that actually,
Starting point is 00:46:55 From what I've read, Nicole, lovely Ian Russell, who campaigns after losing his daughter, Molly Russell, to suicide. I believe that Pinterest was the biggest thing that she was looking at, actually, and exploring on. So you're absolutely right. Yeah. I mean, I've been caught out with Pinterest. My children have never had social media, and I knew that I didn't want them to have it. They've got a smartphone, and it literally has nothing on it. It doesn't even have Safari on it or anything.
Starting point is 00:47:24 How old are your children? They're 13 and 15. Can I ask how you've navigated the 15 year old to not having that? Because I would love to know. Actually, I think because he's never had it, so I haven't had to take it back. That has been really useful. But actually, it was just a boundary from day one that was set.
Starting point is 00:47:41 And he's a very kind of analogue boy. Analogue is quite cool. So I think we're now actually reaching that stage where actually a few of them are asking for a brick phone and they're asking him to do it a different way because it is kind of like more edgy. I think it's different for girls, you see. So I think, yeah, I do agree with that.
Starting point is 00:48:00 I think my daughter is going to be harder to manage. But like I was saying, she's got into Pinterest. And the other day I was like, what are you actually looking at on there? And I've gone through and looked. And actually, it's like the criminal shorts. So even I, when I'm super on top of it, she was on a family iPad in a public area with us,
Starting point is 00:48:21 which is how we run it. But she was still delving into stuff that I was uncomfortable with, so that's now gone as well. Wow. Oh, I think you're fabulous. It's amazing, and I'll get your book. I don't want you to send me one. I shall buy your book.
Starting point is 00:48:34 Thank you. What's the response been with the book? Have people really enjoyed it? What have they learned from it? They have. I think the book is quite different to a normal parenting, do this, do that point of view, because I'm in the thick of it.
Starting point is 00:48:46 Yeah. So it's been, it started as quite a cathartic process for me because I felt like as I entered teen years, I was kind of losing my way a little bit with parenting. So it's been really cathartic because I feel like people are relating to it directly, like as we're all in it, rather than me looking back on it retrospectively,
Starting point is 00:49:07 this kind of teen time. And because it's very current, the world is moving so quickly that it's healthy to have something that's written now. Yeah, that's really, yeah, it's brilliant. It's really powerful. Yeah, but it's non-judgmental. There's lots of tips and bits and pieces in there
Starting point is 00:49:25 that kind of help us to navigate, but also to realise that we're not in it on our own. I think you often feel like when they're saying, especially with the social media element, well, everyone else's parents allow us to have it. Yes. And you're the only one that's the stick in the mud. That's what I had.
Starting point is 00:49:42 That's what I had. I'm the strict one. I'm the boring one. I'm, you know, the awful one. But actually, when you get under the skin of it, you're not. you're not the only one, we're all feeling like this. Like there are so many parents that are feeling the same.
Starting point is 00:49:55 you're not the only one, we're all feeling like this. Like, there are so many parents that are feeling the same. And I know there have been little collectives set up, haven't there, locally in schools and things, to try and get a group of people. So it's that group mindset, rather than individuals, that are all making the same decision together. But I don't think it's ever easy. But then parenting isn't, is it? No, I don't think it's going to be easy. But I think the material and the content that I saw last week,
Starting point is 00:50:17 yeah. It's just, all these things. We can skirt around all of it: should it be banned, should we give them brick phones, smartphones, whatever. Yeah. The biggest thing is, that material should not be out there to see. That material shouldn't even be online, should it? No.
Starting point is 00:50:34 And quite honestly, they've got a way of regulating that if they wanted to, but they don't want to. But why not? Why do they want our children to see that, or any human being? Why is that? Extreme right-wing racism and violence. Yeah, it's absolutely crazy. You know.
Starting point is 00:50:51 Yeah, it's very dark. It really is. Yeah, it really is. Well, Nicole, I can't thank you enough for chatting. I'm going to put a little link to your book underneath the pod. Oh, thank you. I appreciate that. Of course I will.
Starting point is 00:51:05 And who knows, but listen, this is a huge subject. I'm sure we'll talk again in the future. Yeah, if there's anything in all seriousness, if you want me to come along to anything or there's anything you're doing, then just give me a shout. I'm in Norwich, but I'm up for travelling around and supporting it as much as possible. Oh, that's brilliant, Nicole. Thank you so, so much. Keep in touch. No worries. Nice to speak to you.
Starting point is 00:51:25 All right. See you. You too. Bye. Bye. Bye. What a fabulous lady. So interesting. I'm going to get that book. In my head, I'm thinking, oh, book club, book club. But I think it might be, it's just quite specific, but I will put a link out to anyone who is interested in reading it. I'm certainly going to have a read, because it really hit a chord with me when she said, I got to the teenage years and I felt I was struggling with parenting. When I really think about it, I love... Me and Eliza,
Starting point is 00:51:56 we've got a great relationship. Yes, we have our ups and downs, we have an argument, but I do think she's a good girl. But, you know, we're only doing this for the first time. She's being a teenager for the first time, and I'm parenting this teenager for the first time
Starting point is 00:52:11 and it's bloody hard work, it really is. And all of this gobbins is really hard on top of everything else that you're going through whilst bringing up children. And these are things that we just don't need them to see or view, to twist their minds, make them sad, isolate them from people. You know, I just think, let's try our best to be with our kids as much as possible, to get those phones down. It's so hard. I mean, I think I've failed that now. Eliza is too far gone, but she isn't on her phone all the time, can I say.
Starting point is 00:52:49 She really isn't. And if she is, she's usually talking to people or on FaceTime. So, you know, I'm one of the lucky ones. But certainly, after what I saw the other day, Joni will not have access to any social media until she's an adult. And hopefully I won't have a battle with her because it will be the law.
Starting point is 00:53:11 but even if it isn't, I can say that now on this podcast. She will not be looking at material like that, ever. Thank you so much, everybody, for listening today. I think it's been really interesting. I've thoroughly enjoyed it. I know it's, you know, dark stuff we're talking about, but please send in your experiences, things that have happened at home, how you feel about it: 0778-1919.
Starting point is 00:53:38 I'd really like to thank the campaign I worked for last week, which was Big Tech's Little Victims. I'm so proud that I was an ambassador for it and hopefully going to continue that relationship because it is something I'm passionate about. So thank you to them. And yeah, let's just try and put those phones down, read our books, cook with our kids, get outside,
Starting point is 00:54:03 weather's getting better now, guys. Food for thought, eh? Loads and loads of love. Thank you so much for listening. Have a great rest of the week. And I'll talk to you on Thursday. See ya.
