The Diary Of A CEO with Steven Bartlett - Brain Rot Emergency: These Internal Documents Prove They’re Controlling You!

Episode Date: February 16, 2026

How is TikTok rewiring your brain? Social psychologist Jonathan Haidt and Harvard physician Dr. Aditi Nerurkar reveal how tech addiction and short-form video are ROTTING your brain, and why AI chatbots... could cause the next global addiction CRISIS!  Jonathan Haidt is a social psychologist at NYU Stern and the author of the #1 New York Times bestseller The Anxious Generation. Dr. Aditi Nerurkar is a world-renowned expert in stress, burnout, and mental health, and best-selling author of ‘The 5 Resets’.  They explain: ◼️The "brain hacking" secrets tech companies use to hook you ◼️Why short-form video is shattering the global attention span ◼️The link between phone-based childhoods and the teen mental health crisis ◼️How TikTok causes a 40% drop in memory accuracy ◼️Why you must delete addictive slot-machine apps to reclaim your focus Enjoyed the episode? Share this link and earn points for every referral - redeem them for exclusive prizes: https://doac-perks.com  00:00 Intro 02:26 The Largest Threat To Humanity Right Now—And Why No One Wants To Admit It 06:31 How Short-Form Videos Are Rewiring Your Brain For The Worse 09:26 What Your Phone Is Doing To Your Sleep, Heart, And Stress Levels 16:15 Why Short-Form Content Is Quietly Killing Deep Thinking 19:07 What’s Really Happening In Your Brain When You Scroll 26:24 What Happens When You Quit Social Media—And Take Back Control 30:00 The Real Danger Behind Meta, Snapchat, And TikTok 36:05 The Dark Side Of Snapchat: Cyberbullying And Predators Exposed 41:23 Oxytocin, AI Chatbots, And What This Means For Your Brain 55:20 What If Your Business Depends On Social Media—Is There Another Way? 01:00:28 Why So Many People Feel Lost—And How Tech Plays A Role 01:06:16 Ads 01:07:17 The Simple Test To Know If You’re Addicted To Your Phone 01:26:08 What Is “Popcorn Brain”—And Do You Have It?
01:28:04 Brain Rot: Why Adults Can Recover—But Teens Might Not 01:31:45 Why Australia Banned Social Media For Under-16s—And What Happens Next 01:43:16 Ads 01:45:33 Why Parents Can’t Sue Social Media Companies—And What This Law Protects 02:04:29 How Technology Is Eroding Our Sense Of Meaning 02:08:39 How To Reclaim Meaning And Joy In A Hyper-Digital World 02:14:14 The 3-Second Brain Reset That Breaks The Scroll Cycle Follow Dr. Aditi: Instagram - https://linkly.link/2aYTX  Website - https://linkly.link/2aYTZ You can purchase Aditi’s book, ‘The 5 Resets’, here: https://linkly.link/2aYTd  Follow Jonathan: X - https://linkly.link/2aYTq Website - https://linkly.link/2aYTs  You can purchase Jonathan’s book, ‘The Anxious Generation’, here: https://linkly.link/2aYU7  Independent research: https://stevenbartlett.com/wp-content/uploads/2026/02/DOAC-Attention-Discussion-Independent-Research-further-reading.pdf The Diary Of A CEO: ◼️Join DOAC circle here - https://doaccircle.com/  ◼️Buy The Diary Of A CEO book here - https://smarturl.it/DOACbook  ◼️The 1% Diary is back - limited time only: https://bit.ly/3YFbJbt  ◼️The Diary Of A CEO Conversation Cards (Second Edition): https://g2ul0.app.link/f31dsUttKKb  ◼️Get email updates - https://bit.ly/diary-of-a-ceo-yt  ◼️Follow Steven - https://g2ul0.app.link/gnGqL4IsKKb  Sponsors: Wispr - Get 14 days of Wispr Flow for free at https://wisprflow.ai/DOAC   Function Health - https://Functionhealth.com/DOAC to sign up for $365 a year. One dollar a day for your health Pipedrive - https://pipedrive.com/CEO

Transcript
Starting point is 00:00:00 We just got back from Davos in Switzerland, this snowy village where some of the world's leading experts, CEOs, founders, world leaders gather in this one space. And while I was there, my colleague Juan was telling me about something he does, which many of my friends do. They list their properties on Airbnb when they go away. So many of us, when we go away, we leave our house as this dormant asset that's doing nothing for us other than racking up bills. And as some of you might know, Airbnb are one of our show partners. And I've stayed in their properties all over the world and continue to do so, but I've never actually hosted one of my properties on there. But when I heard this, it got me thinking, what a smart move it is to make money from an asset
Starting point is 00:00:38 that's currently probably costing you money. Every time you're away, your home sits empty. And what Juan told me is how easy it was to get set up. He makes his home available for specific dates so that his guests always depart the day before he gets home. So if you're trying to find an easy way to make some extra money on the side, hosting on Airbnb might be exactly that, especially if you move around a lot. Your home might be worth more than you think. You can find out how much your home is worth by going to Airbnb.ca slash host. You are actively rewiring your brain for the worse by engaging with social media, high volume, quick videos. And the social media executives don't let their kids use this stuff because they designed it to be addictive.
Starting point is 00:01:15 And they know that millions and millions of kids have been cyberbullied, sextorted, many have committed suicide. So I'm getting angry. And then from the medical perspective, it's rewiring your body, increasing your risk of heart disease and PTSD. We've moved too far into the virtual world and the results are catastrophic. People are spending roughly six and a half hours a day on their phones. What do we do about this? Well, here's the amazing thing. We actually can control our fate. So we are joined by a social psychologist and a Harvard physician to dive into the technology addiction and brain rot crisis billions are facing worldwide. And how we can counter its devastating mental health effects.
Starting point is 00:01:52 You have to reclaim your attention because without the ability to pay attention for several minutes at a time, We're seeing the destruction of human potential, the human relationships, the connection. But there's all these small tweaks that you can do to override that primal ursh to scroll. For example, 91% of people had an improvement in attention, well-being, and mental health. After just two weeks of continuing to use their device, but not having internet access. Next, keep your phone out of your arm's reach because the sheer potential for distraction has actually been shown to change your prefrontal cortex, which is called brain drain. So yes, we should exert more self-control, but we're being pushed and addicted.
Starting point is 00:02:27 apps and it's messing us all up. That's not our fault. Would you advise people to delete these short-form video apps? Oh my God, yes. That would the most important thing you can do for your intelligence and for humanity. But if I was going to offer some specific advice, here are the three things that I do with my students to reclaim retention. And then to add to that, I have the three-second brain reset. So first... I wanted to ask you guys what you thought of this. Hey, you're back. This terrifies me. We've got to stop this now. Guys, I've got a favour to ask before this episode begins. 69% of you that listen to the show frequently haven't yet hit the follow button.
Starting point is 00:03:04 And that follow button is very smart because it means you won't miss the best episodes. The algorithm, if you follow a show, will deliver you the best episodes from that show very prominently in your feed. So when we have our best episodes on this show, the most shared episodes, the most rated episodes, I would love you to know. And the simple way for you to know that is to hit that follow button. Thank you so, so, so much. Jonathan, Edithi. Jonathan, I've heard you say that the destruction of attention
Starting point is 00:03:34 is the largest threat to humanity that's happening around the world. And I've also heard you say that short-form videos are the worst of the worst because they're shattering attention spans. The reason why I wanted to have this conversation today is somewhat personal. And in fact, all of the conversations I have on the Dyeravis here are somewhat personal to some degree. They're inspired by some unanswered question I have in my head and also some observation I have in my life. And the observation I've had is that short-form videos in particular are making my life worse.
Starting point is 00:04:07 And actually, I've got to say, the catalyst moment really where I thought, you know, I need to get you exceptional people together to have this conversation was, I thought this, I then looked at my screen time and saw a huge change. I felt so much worse because all these social platforms have short-form video now. And then I actually heard Elon Musk, who, you know, has a social media platform that does short-form video. say that he thinks it's one of the worst inventions for humanity. Jonathan, why did you say what you said about short-form video and this corruption of attention?
Starting point is 00:04:36 Because I wrote a whole book called The Anxious Generation focusing on teen mental health. That was the mystery that popped up in the mid-2000s. Why are people born after 1995 so much more anxious and depressed? And I've been tracking down that mystery, and a lot of it points to social media and especially Instagram, social comparison, and all the things we know about social media.
Starting point is 00:04:58 When the book came out in 2024, since then, what I realized is that I vastly underestimated the damage because I focused on mental health, which is a catastrophe, but the bigger damage is the destruction of the human ability to pay attention without the ability to pay attention for several minutes at a time,
Starting point is 00:05:17 ideally 10 or 20 minutes at a time. Without that, you're not going to be of much use as an employee. You're not going to be of much use as a spouse. you're not going to be successful in life. And that's when I realized this is way beyond mental health. This is changing human cognition, changing human attention,
Starting point is 00:05:34 and possibly on a global scale. Adi, what perspective do you come at this from and what's been your perspective through all the work you've done about brains and stress and neuroscience and all these kinds of things that has shaped the way that you think about
Starting point is 00:05:48 social media screen time, short form video? My background is that I'm a physician at Harvard and my expertise is in stress, burnout, and mental health. And so that is the lens that I view all of this through. We know that the most deleterious relationship that you have is with your device. You know, in every healthy relationship, we have boundaries. We have boundaries with our kids, our parents, our colleagues, our, you know, with our friends. And yet we have no boundaries and often porous boundaries when it comes to the relationship you have with your device. So it's not
Starting point is 00:06:24 so much about, you know, becoming a digital monk and renouncing technology because technology can serve us, right? It inspires, educates, connects. Not more than ever. It's so important to be an informed citizen, but not at the expense of your mental health. And so what Jonathan was saying, this, you know, constant being engaged with your devices, with social media, the scrolling from the minute you wake up until you go to bed, there's a reason why you have your best ideas in the shower. And that's because that's the only place in the whole day where you are not with your device. People take their device to the bathroom. They sleep with your device. You eat with your device. People walk down the street. There's more near-miss pedestrian accidents because people are
Starting point is 00:07:05 walking while they're crossing the street and looking at their devices. And so there's all of this brain biology at play behind the scenes. So both of you have talked about how it doesn't feel good to engage and constantly be on your phone, that sense of infinite scroll. But there is, you know, it's feel like you're doing nothing. You're just doing this, right? you're doing. But in fact, it is not passive, it is active, and it has a profound effect on your biology, on your brain, on your psychology, and also social factors that I hope we talk about today. You know, scrolling, wasting a bit of time doesn't seem so harmful. What is the big, if we play this forward 10, 20, 30 years, what is the big risk or threat? The biggest threat
Starting point is 00:07:46 right now, we don't even have to wait 20 years, is that it, through a process called neuroplasticity, which is just a big fancy word that simply means that your brain is a muscle, is that by engaging with social media, that sense of high volume, low-quality, quick videos, you are actively rewiring your brain for the worst. So you're increasing your sense of stress, worsening your mental health, attention, cognition, distractibility, irritability, complex problem-solving. All of that changes when you engage in that infinite scroll. Yeah, I'd like to add on here because one of the things.
Starting point is 00:08:21 the main arguments I get is, ah, this is what they said about television. Oh, this is what they said about comic books. This is just another moral panic. But people need to understand why touchscreen devices are so different from television. And so parents find this helpful. If I just lay this out briefly, good screen time versus bad screen time. So humans are storytelling animals. We have always, as long as we've had language, we've raised our kids with stories, epic poems, all kinds of stories. Stories are good. The human brain needs lots of patterns. The child's brain needs lots of patterns to develop. So the worst thing you can do is hand your child to device because they're crying for it because they've been trained to get it and you're busy so you have hand on the device.
Starting point is 00:09:02 They're quiet. What's happening? They're sitting alone. When I was a kid, we always watched it with my sisters, with my friends, you're arguing about it. You're talking. It's such. Kids sitting alone with a device in his hand. It's not long stories. It's never long stories. It always ends up. at YouTube shorts or TikTok or Instagram Reels for older kids. So they're doing this. But here's the key thing that it does that a television does not. A television puts you in a state that psychologists call transportation. You get into a story and you find yourself pulled in and you're rooting for the characters.
Starting point is 00:09:34 And this is how a brain gets tuned up to social patterns. But it can't happen in 10 seconds. It can't happen in one minute. It takes a long period of time. And there is no reinforcement. There is no, the television doesn't do anything to you. You don't have any response. Whereas a touchscreen device is a Skinner box. So BF. Skinner was one of the founders of behaviorism. And he put rats and pigeons in a box where he could deliver a reinforcement,
Starting point is 00:10:02 a little grain of food on a schedule. And by giving them quick reinforcements for behavior, he could train them to do amazing tricks in just a few hours. When you give your kid a touchscreen device, it's stimulus, response, swipe, get a reward, or not. not variable ratio. And then, and you just keep doing that. So you are, as Aditi said, it is rewiring your brain. It's not just wasting time. It is literally training you to do things where television didn't do that. So this is a whole new game. And to add to that, you know, from the medical perspective, you're shortening this attention span. And what happens over time is so like Jonathan said, right, you're not sleeping as well because you are engaged with your device.
Starting point is 00:10:41 We know that 80% of people are checking their phones within minutes of waking up. We We have something called revenge bedtime procrastination, this concept of, you know, at the end of the day, you're fatigued, you've had a long day, you've had no me time, and you want to get to bed early. We all know, by the way, what the data is that, you know, we've been taught since we were little kids, right? Like bedtime, sleep is important. It's good for your body. It's good for your brain. And we might have all the knowledge in the world. But in terms of action, there's a wide gap between knowledge and information and action.
Starting point is 00:11:08 And so revenge, bedtime procrastination is kind of an offshoot. So what happens? So, you know, you have that decreased attention. you have that irritability, hypervigilance. And so at night, at the end of the day, it's 9 p.m. You finally, you know, if you're a parent, your kids are asleep, your kitchen is clean. Maybe you finish your entrepreneurial day and you finally sit down with Melanie on the couch and you're like, ah, some me time. And you know you want to get to bed early and you know it's good for you.
Starting point is 00:11:34 But then suddenly you're scrolling and before you know it, it's 2 a.m. And you're saying, oh, my God, what happened? Why am I still awake? What was I doing all this time? What happens is that you essentially give yourself some me time at night. night. And so you procrastinate bedtime. And so what happens is with this revenge bedtime procrastination, it affects your sleep. And then when you don't have good sleep, good quality sleep, so you have difficulty falling asleep, staying asleep, sleep, debt over time for kids, for adults, has all
Starting point is 00:12:01 sorts of ramifications. So this is just the tip of the iceberg, this short form video content and the ripple effects go far and wide. Not only is it rewiring your brain, it's rewiring. It's rewiring, your body, it is affecting your sleep, which increases your risk of heart disease later in life. And when you're consuming graphic videos and graphic images, it can increase your personal risk of PTSD through vicarious trauma, even if you weren't there. So this is just a vast network of things that can happen to you simply because you're thinking, yeah, it's harmless, what is it? It's just a bunch of videos that I'm checking out as a way for me to decompress. What do I need to know about the nature of the brain to understand exactly what short-form video is playing, is hijacking, is taking advantage of?
Starting point is 00:12:53 The thing to understand about all of this is that we have to focus on childhood. Why do we have childhood? Humans have this really interesting childhood where we grow rapidly at first and then we slow down for about five or seven years. We don't grow very quickly. And then we speed up at puberty. Whereas other primates, they just grow and grow until they reach reproductive age. then they reproduce. But we seem to have this long period of sort of middle childhood for cultural learning. It's a period in which the kid is now walking and talking and turning away from the parents. And that's a time for this to come in. And they pay attention and they form relations. All these things have to happen slowly because the neurons are gradually growing.
Starting point is 00:13:32 They're finding each other based on what the child is doing. Okay. So we grow up in the real world and that happens over time. And a lot of that is very physical. Kids are very physical. Mammals are very physical. and there's a lot of touch. So that's a healthy human childhood. But when you give an iPad or your old iPhone, and they begin doing the touching and swiping, that is going to hijack their attention, that is going to push out all other forms of action and learning,
Starting point is 00:14:00 and that is going to change the way the parts of the brain that learn to pay attention, what's called executive function, it's going to change the way the brain learns to pay attention, it's going to change the reward circuits. I think you had Anolemkeon recently, who is the nation's expert on addiction. And the way that she describes it,
Starting point is 00:14:16 how any one addiction is going to change your reward pathways to make you more vulnerable to other addictions. So we're setting our kids up, not just for this, but then when they get a little older, it'll be video games, it'll be porn, it'll be gambling. Now, everything is gambling. So we're setting them up for a life
Starting point is 00:14:34 in which their brain is saying, give me some quick dopamine. Give me some quick dopamine. I don't have to work for anything. I don't want to have to apply myself for an hour and then get a reward. And so what the short videos are doing for kids is preventing them from learning the connection between hard work and a reward. Is there anything else I need to know from a neuroscience perspective about what's going on in my brain when I develop these addictions with short form videos or these sort of quick dopamogenic tasks?
Starting point is 00:15:04 So we all as humans have a primal urge to scroll. When you feel a sense of stress, as many of us do in this moment, life, it is your sense, you know, your amygdala. And so it's your sense of self-preservation. It's survival and self-preservation. That is what your amygdala does. So if you want me to show you here, because I have no idea what I'm doing that. Yeah, it's okay. So here, deep here, it's a small almond shape structure and that is your amygdala. And your amygdala, its main purpose, is survival and self-preservation. It houses your stress response, your fight or flight response. And it is truly what is activated when you are engaging in content, when you feel a sense of stress. And so you have this
Starting point is 00:15:44 primal urge to scroll. And so evolutionarily, we, when we all were caves people living together, we would sleep at night and there would be a night watchman scanning for danger. And now we have our, we have become our own night watchman. And so we scan for danger all day, all night long. How do we do that? We scroll. And then the amygdala is trigger. And then you scroll some more and you scroll some more. And so over time, what you're doing is that you're making that amygdala in a state of chronic, it's continually being triggered. What happens to the amygdala? Over time, when it's continually triggered, it starts to rewire your brain in other ways. And how does it do that through something called the prefrontal cortex?
Starting point is 00:16:26 If you put your hand, I can use this model, but I can also just use my hands. When you put your hand on your forehead, the area right behind your forehead right here is the prefrontal cortex. This is a very important thing for our conversation, this area of the brain. And what the prefrontal cortex does is it is called, it governs executive functions. So impulse control, memory, planning, organization, strategic thinking, complex problem solving. And there is a tension between your amygdala and the prefrontal cortex. When your amygdala is in the driver's seat, that prefrontal cortex is quiet. And what is happening as we continue to engage with our devices and have this primal urge to scroll
Starting point is 00:17:09 that amygdala upregulates and the prefrontal cortex down regulates. And over time, that is very problematic for all of the reasons that we're kind of introducing at the start of this conversation. There was a meta-analysis done in 2025 of 71 different studies, and it found that heavy short-form video use was associated with reduced thinking ability, especially shorter attention spans and weaker impulse control. That's right. These studies are just beginning to roll in now. Kids have been on social media really a lot since 2008, but especially once they got smartphones around 2012. Studies began coming in in the 2010s that it's looking like the kids who are spent a lot of time on this are doing much worse. They're more depressed. The focus was on depression. And some other researchers said, no, it's just a correlation. You can't prove causation. And we've been going around around on this for about 10 or 15 years. Now we've we're doing the same thing with the short-form videos.
Starting point is 00:18:04 The damage everyone can see, my students tell me this is what's happening, we feel it, studies are coming in, but there will be a few studies here and there that don't show it, and people will push that up. Meta spends a lot of time and money to influence the public debate. A lot of public documents are coming out now about how they do that. So we can engage in debate over research on short-form videos for five or ten years, but at that point, it's way too late. We've lost a second generation, gen Alpha.
Starting point is 00:18:33 So I think when we're talking about kids especially, we need to have what's called the precautionary principle, which is if there's reason to think that this is hurting kids, how about we don't roll it out into every childhood? How about we make these companies responsible? We hold them responsible for what they're doing to kids because we're about to make the same mistake we made with social media, letting it worm its way into childhood.
Starting point is 00:18:56 We have already done that with short videos, and we're about to do it with AI chatbots. we're just beginning it in late 2025, I'd say. I don't think people quite realize how much these major social media platforms have figured out that short-form video sells. We're actually seeing this sort of global rise in short-form drama apps now. And I don't know if you guys have seen these apps, but it basically takes a movie that used to be two hours long, and it breaks it down into, say, 60 different parts. A colleague of mine at my company was showing me the other day in different parts of the world, they're exploding.
Starting point is 00:19:27 there's been a 190% increase in short-form drama apps. Takes a long-form movie, turns into short-form videos. Disney Plus plans to introduce AI-generated short-form videos this year, starting with 30-second limits inside the Disney Plus app. And TechCrunch also reported that as of October 2025, Netflix tested short-form video content on phones and recently announced its plan to expand to this feature. It appears that all of the content we consume is going that way.
Starting point is 00:19:53 And listen, I'm friends with lots of people at big social media platforms. this doesn't get me in the doesn't stand in my way of criticizing them because I think two things can be true at the same time, right? So I think it can be true that I have a podcast and I make short-form videos and that I also understand that there's a real downside to them
Starting point is 00:20:09 and all of the major social media platforms that I speak to have a huge drive towards short-form video. It appears to be their number one strategic priority and obviously because of the success of TikTok as of January 2026, TikTok, I believe is the most downloaded social app in the world now. And if I'm running a social media company and my one focus is profit, I'm now faced with an existential crisis. I either take part in this thing that is driving
Starting point is 00:20:39 the highest retention, therefore the best ad payouts, or I die. So there's two comments to that. First off is that, you know, when we think about social media and how society is shapeshifting to allow this short form content, there is a concept that Jonathan and I briefly mentioned, I think prior to us filming called second screen viewing. And so what's happening is that allegedly these big streamers are asking their creative talent, whether it's screenwriters or actors or directors, to replay, to reiterate the plot. Because as you're watching, you know, when we were kids, we would watch TV or movies and you just sit on the couch and you'd have a bucket of popcorn with your family and you'd watch a movie, an hour, hour and a half, two hours. And now
Starting point is 00:21:27 second screen viewing is happening, which means that you're watching a movie or a TV show and you're on your device. And so you are constantly having that fragmented attention and we are all doing it. And so what these streamers are allegedly asking their creative talent to do is to reiterate the plot. So it's shape shifting. It makes sense. If my brain is, you know, I'm 33 years old, so I've grown up with a lot of this stuff, if my brain has been wired to have shorter attention spans. And movies from 30 years ago are not going to cut it for me. Right.
Starting point is 00:21:56 But then look what happens. If everybody chases that, and I know, look, Netflix is making shorter and shorter stuff. Even TED, the TED conference, TED Talks, are getting shorter and shorter. What does that do? It just repeats the cycle. Now, I appreciate that you're in a collective action trap. As you put it, if I don't do it and everyone else is, then I lose out. And so the business pressure on all the creators, the business pressures go shorter, shorter, shorter, shorter.
Starting point is 00:22:21 There's a very useful psychological term distinction here that I think would be helpful, which is the difference between psychological assimilation and accommodation. This goes back to Jean Piaget, the great developmental psychologist. We have certain mental structures. We have a model in our head of how things work. And then you learn something new. Oh, that's a kid learns, oh, that's an artwork. Okay, I put that into, you know, that's just that you just assimilate.
Starting point is 00:22:45 They learn lots of animal names. And then they learn something that doesn't fit. like you learn about bacteria. And now you have to, okay, now you have to change your mental structure. It takes a little time. You change your mental structure to understand more about life. That's what education really is all about. You have to have a lot of assimilation, of course.
Starting point is 00:23:03 But you need that accommodation over and over again. That's why you want to go to college. That's why you want to read novels. That's what a great movie does. It takes time. And so one of the great things about this modern technology is that we can do things like have this three-hour conversation. I can't believe it.
Starting point is 00:23:19 People are going to listen to it. it. So this, you know, long-form content, this is all about accommodation. Anybody who walks out, who leaves this conversation after three hours and isn't thinking about something differently, we failed. Okay. So you are very much in the accommodation business. That's great. And then the question, both a moral and a strategic question is, how much do you need to play the quick hit game in order to get people there? I leave that to you to the moral calculation. Maybe it, maybe it balances out. maybe, but I think that's where you are. Would you advise people to delete these short form?
Starting point is 00:23:54 Oh my God, yes. Of course. But yes, that was the most important thing you can do for your intelligence and for humanity would be delete them. So what I advise my students to do is, I say, just do this. Just delete one of the social media apps that you use, especially if it's TikTok. Just delete from your phone. You can still check on your computer. If someone sends you a video, you can still watch it on your computer. You can even check it, you know, every weekend. You can spend some time on it.
Starting point is 00:24:21 But just get it off your phone. Because on the phone, the phone is always with us, an extension of our body. And if it's always there, then it's going to take every, it's called attention fracking. It's going to break up your attention. It's going to take every seven seconds that you're not doing something, you're going to go for the phone. So the best thing you can do to make yourself smarter and a better partner and a better human, I would say would be to delete the short, especially any of the short form videos or TikTok. unfortunately YouTube, YouTube, which has a lot of good stuff on it, becomes YouTube shorts.
Starting point is 00:24:50 Instagram, which does a lot of terrible things, but which people do find useful for all kinds of purposes, becomes Instagram Reels. So I think the proper amount of short-form video for children, zero to 18, is zero. They should never be watching the vertical videos. Parents, don't ever let your kids watch the short vertical videos. If only there were a way to put a time limit on it, you could say it has to be 10 minutes or longer. Kids, you can have an hour of YouTube, but it has to be videos 10 minutes or longer, nothing shorter than 10 minutes. That at least will get rid of the quick swiping, the dopamine stuff. So I would say that for kids, yes, not engaging with it whatsoever.
Starting point is 00:25:25 But for someone, you know, my approach is a little bit different for someone who's, like, in their 30s or in their 40s. And the way I would kind of frame that is, instead of renouncing it, you know, saying I'm going to get it off my device and I'm going to check on a desktop, which is great, there are little tweaks that we could do, because my approach is to foster that sense of empowerment in someone to help them make positive change. And so one strategy that you could use, if you are saying there's no way I'm deleting these apps off my phone, right? By the way, I practice what I preach, and I really don't engage with technology, to the best of my ability. But one thing that you could do is grayscale your phone. And so especially at night, like, it's 9 p.m. We talked about revenge bedtime procrastination. You know that you're going to do it. You're going to sit down and you're going to scroll, and before you know it, it's 2 a.m. Instead, grayscale your phone. It's a simple switch. You can toggle it. I have my phone set to grayscale, which simply means that you're getting rid of the color, making it black and white. And so when it is grayscaled, it doesn't have that same addictive quality to it. A marketing executive described it this way to me: it's like going through a grocery store, and instead of the technicolor junk-food cereal, it's just black and white. So there's
Starting point is 00:26:44 a lesser sense of compulsion to continue checking. So that's, like, one strategy you could use. And the other is to set some boundaries. So geographical boundaries: keep your phone out of arm's reach. If you're at a desk, if you're a student, not right next to you, because we know there's this phenomenon of brain drain. So it's not just that when you're using your phone it can be a distraction; just having it close by drains you. It's called brain drain. And so putting it in a desk drawer, keeping it in another part of the home if you are working, keeping it far away from you. And so you can kind of override that primal urge to scroll and let your prefrontal cortex take hold again. And so there are all these small tweaks that you can do. You think? No. Yes, there are all these small tweaks you can do,
Starting point is 00:27:31 and they will make the heroin a little bit less addictive. And yeah, you should try those. But what I can say after teaching this course for many years is that people who try that, yeah, you know, it helped. But you only really get the transformation when you quit social media. Then you get your life back; you get hours a day back.
Starting point is 00:27:48 So, and so I would urge everyone to just think, you know, you only get one childhood. You only get one young adulthood. And if you're going to spend it scrolling, what do you have to show for it at the end? And when you get people to reflect on, how much value do you really get from watching the short videos, how would your life be different if you knocked it out?
Starting point is 00:28:10 Once they realize that their motives for being on it were either just to keep up, or because that's what everyone else is doing, or, as you said, I deserve it because I'm tired. Well, why are you tired? It's in part because your attention was fragmented all day long. So you only really get the transformation when you get a real change in what you're consuming. Although, of course, yes, setting it to grayscale will be helpful, but it's not going to be transformative for most people, I believe. And then, you know, based on the science, there are certain elements. Like, when we think about what it is about the phone that is creating that sense of compulsion, Jonathan is right. So what is it about the phone? It's not just the phone, you know, you're
Starting point is 00:28:47 scrolling, you're engaging. There are two studies that were really interesting. In one, people continued to use their devices, but they had no internet. So, you know, I tried this experiment myself in December. I was out of the country, and so, you know, I just didn't plug into Wi-Fi. And I found a marked change in my mood, my sleep. And I'm not even, you know, a 20-year-old on TikTok. And it was so different. And so the study found that just two weeks of continuing to use your device, but just not having internet access, improved attention, well-being, and mental health. And in this population, it was all adults. It wasn't kids. It was all adults. It found that 91% of people had an improvement in at least one of these metrics. And then another study, more recently: just one week of not engaging in social media, a digital detox, they called it, did the same thing. Better, you know, less anxiety, less depression, decreased insomnia. But my feeling is that, you know, there is this new kind of meme, right?
Starting point is 00:29:52 Like, the millennial urge to delete my internet presence and live off the grid. There is certainly utility to that. And I salute anyone who wants to engage in that analog life more and more. But from where I sit, I feel like we do need to have healthier boundaries and engage more responsibly. It also builds up that muscle, and it can help. It takes eight weeks to do, you know. Neuroplasticity, when you're building new brain circuits, takes eight weeks. Falling off and getting back up is part of habit formation. So if you're going to make any of these changes, understand that it takes some time. But I don't know if it is possible for me or for others to say, fully, I'm going to, you know, delete it off my phone.
Starting point is 00:30:39 But I love that. So I'd like to go a little further with this. So the way you put it, yes, there's all these things that we could do. We should have boundaries. But all of that puts the responsibility on us. Agreed. And that's where we are with junk food. With junk food, we're like, okay, it's out there.
Starting point is 00:30:56 We have to learn self-control. We have to teach self-control to our kids. Okay, that's the way it is in this country. But the digital devices, I think, are very, very different. So imagine, imagine if we sent our kids out into the world, and it wasn't just that there was junk food in all the stores. It was that everything was made of junk food. You know, door handles, you can eat them, they're chocolate.
Starting point is 00:31:15 But it's not just that the world's made of junk food. It's that they're actually able to tell what you're craving at the moment. Maybe you're more in the mood for salt, so now it's all potato chips or pretzels. If the world is designed by companies to always give you the thing that will most grab your unconscious desires, that will affect the amygdala, the reward centers, that's on them. That's not our fault. My general rule as a social psychologist is, if a few people are doing something bad or self-destructive, well, you know, they should learn some self-control, or that's something about them. But when 90 or 95% of people are doing
Starting point is 00:31:53 something self-destructive, that's because of the companies that put us in an environment that encourages addiction. So I just want to read a quote. We have so much good stuff coming out from Meta, from all the whistleblowers. Now all the court cases are beginning. In Los Angeles, finally, for the first time, Meta's going to face a jury with all the parents who've lost kids. So here's a chat. We have a lot of internal documents that came out from the attorneys general that are suing Meta. So while they're talking about the results of some of their internal research, one of them says, oh my gosh, y'all, Instagram is a drug. We're basically pushers. We're causing reward deficit disorder, because people are binging on Instagram so much, they can't feel
Starting point is 00:32:34 reward anymore, which is something Anna Lembke said. Like, the reward tolerance is so high. And then he says, I know Adam, meaning Adam Mosseri, I know Adam doesn't want to hear it. He freaked out when I talked about dopamine in my teen fundamentals leads review. But it is undeniable. It's biological and psychological, and top-down directives drive it all towards making sure people keep coming back for more. This is not on us. They designed it to be addictive. They've done research to make it maximally addictive. They push it on children. They tried to get Instagram Kids for even littler kids. They know what they're doing. They've done the research. My team, we put it together; we found references to 31 internal studies that Meta did.
Starting point is 00:33:17 They've done a lot of research finding harm. They bury it, but you can find it at MetaInternalResearch.org. We put it all online. You can read these quotes. So yes, we should exert more self-control, but basically we're being pushed addictive substances, addictive apps, and it's messing us all up. I agree wholeheartedly that it is so destructive. And you feel it even with people in their 40s and 50s. And if anyone can do it, it's you, Jonathan. Seriously, I would love to see it. You know, we also know, based on the data, that these things reshape our brain, rewire our brain through neuroplasticity, and also change our brain waves. So patterns. We talked about the amygdala and the prefrontal
Starting point is 00:34:02 cortex, right? But they also change brain waves. And so when you look at the studies and the data, it involves the reward pathway and dopamine. And these brain patterns, the brain waves, mimic addictive behaviors. And, you know, there are certain features, right? Like, when you swipe down to refresh, it's the slot machine. It was modeled directly after the slot machine. Yeah. Or autoplay, or the algorithm, that infinite scroll. One really interesting kind of breaking-news item, which you guys may have already heard of,
Starting point is 00:34:33 it's from, like, three days ago. The European Commission found TikTok to be in breach of the Digital Services Act. And what it said was that it is addictive. It, you know, creates compulsion and gets people into this autopilot mode, so they have difficulty disengaging. And personally, I am moving away from social media and really leaning into analog life. But I think, with the way the world is, you know, it's one of our only ways to connect, right? Meaning, I don't mean connect deeply. It's become one of our only ways, I don't mean connect, like, in a deep way, but to be informed, to know
Starting point is 00:35:10 what's going on in the world, et cetera. I suspect that because we've spent so long criticizing Meta over the last 10 years, because the biggest in any category takes all the heat, so OpenAI is taking it now. And what this often does is it provides cover for other people to be even more extreme with that behavior while Meta takes the heat. And I actually think this is how TikTok came to be. TikTok basically, it originally started as Musical.ly,
Starting point is 00:35:36 became TikTok. They were taking no heat. So they created an algorithm which is the equivalent of, like, crack cocaine. The reason why I have a TikTok account (I don't have the app on my phone, I have never had the app on my phone) was because I noticed that the view variance on TikTok was like no other
Starting point is 00:35:58 platform. What I mean by that is you can have a million followers on TikTok and you can get 10,000 views or you can get 10 million views. In the 15 years that I've been on social media, building social media businesses, I'd never seen this before. And what it indicated to me is that the algorithm was being an even more aggressive sorting hat or retention machine. What to push up, what to push down.
Starting point is 00:36:18 Yeah. Yeah. And so, like, when I started in social media in 2014, if I had a million followers, I might get a million views, or maybe 800,000. I did some research the other day on all of our social channels over time. And what we're seeing is that the variance in the amount of views we can get is increasing, which means the algorithm is doing more work to say, show everyone this, I don't care if the person that posted it is called Jenny and has seven followers, and show no one this.
Starting point is 00:36:44 And I don't care if it's Stephen who has a million followers or whatever. And I've realized that TikTok was way ahead of everybody here. And that's why they are the most addictive, the fastest-growing platform. I say all this to say that even if Meta shut down tomorrow, someone else would seize the opportunity, if there isn't sort of policy, I guess, in place. That's right. That would be whack-a-mole, right?
Starting point is 00:37:08 Yeah, no, that's right. And so, you know, in terms of who's done the damage to kids, Meta is the big fish, via Instagram. And they're also the main player in terms of spending a huge amount of money to lobby Congress and block laws. They're also the main player in buying up civil society organizations, giving money to organizations, the National PTA, all sorts of organizations, so they then get to give a message on digital citizenship or digital health. So Meta really is the major driver. Meta is the tobacco industry here, trying to change the dialogue.
Starting point is 00:37:40 But in terms of the products, Snapchat is probably more deadly in terms of the actual number of deaths per user, because Snapchat is not making you depressed by social comparison as much. Snapchat is introducing you to all kinds of people, and it's the main way that drug dealers and extortionists find kids. Snapchat has a Quick Add feature, which relentlessly pushes you to connect with friends of friends. So once a man can get to one kid in a school, now he can get connected to all the kids in the school. So in a lot of the court cases, you know, you have suicides from cyberbullying, you have drug overdoses where, you know, a kid bought a Xanax, but it had fentanyl in it.
Starting point is 00:38:17 At Snapchat in 2022, we know from their internal documents from the lawsuits, they were getting 10,000 reports of extortion from their users, not a year, every month. And that's just what was reported, which is the tip of the iceberg. So Snapchat is a terrible platform for children to be on. It should be an adult-only platform. You're talking with strangers around the world, and with disappearing messages,
Starting point is 00:38:47 and Snapchat doesn't even keep a record. It is ideal for extortion. There's even a handbook, how to extort kids on Snapchat. It goes around the world, and criminal organizations use it. So I definitely don't want to let Snapchat off. TikTok, of course, is a Chinese company. I mean, nominally, we'll see if that's changed, but it was a Chinese company that gave its
Starting point is 00:39:03 Chinese kids the healthy TikTok. You know, they learn to follow astronauts. And they gave us the... Their algorithm feeds their kids patriotic stuff; it shuts off at a certain time at night. There are all kinds of limits. So the people who make the technology generally want to protect their own kids, and they want other kids to use it. That's what TikTok is doing in China. They want American kids to rot in hell, but they want their own kids to grow up with the ability to focus. And it's the same thing with the tech guys
Starting point is 00:39:32 in Silicon Valley. They don't let their kids use this stuff. They make their nanny sign contracts that they will not let the kid have a phone. They will not expose the kid to that. They send their kids to schools like the Waldorf school, precisely because there are no computers or tech in the classroom. So once again, we see their revealed behavior. They know they designed it to be addictive. They know it's addictive. They don't let their kids use it. They want your kids to use it. So I think that's where we are. And how does AI become a protagonist in this story? So my work is now focused on AI chatbots, mental health, and the human connection. We haven't yet kind of delved into loneliness, but there's this unmet need for human connection, right?
Starting point is 00:40:11 deep human connection. We don't have a sense of meaning or purpose right now. And we can talk a little bit more about the default mode network and what happens to your brain when you don't allow yourself to get bored because you're constantly on your devices. That meaning and purpose, that self-referential thinking, is really what develops when you're bored. And so all of this that we're talking about, that feeling of disenchantment, a fragmented society, you're by yourself, that echo chamber phenomenon, all of it kind of opens a door for AI chatbots. And the reason is, these tech companies are sensing that people aren't really happy on social media and they're thinking about
Starting point is 00:40:50 getting off, right? They're using it less, because social media has become less social, more media. So they're not really engaging as much, and they're spending time doing other things. And so The Atlantic had a fantastic piece about this. They're billing it as the anti-social media. So tech companies are building AI chatbots and calling them the anti-social media: a place where you can go to form deeper connections and, you know, really have someone understand you. One of the tech leaders said that there's an unmet human need for connection, and people don't have as many friends as they want. And so we're going to introduce friendship through AI chatbots.
Starting point is 00:41:30 There is a Reddit forum right now. So just to back up: AI chatbots, what we're talking about in our conversation today, is the publicly available chatbots, not, you know, AI for medical care. There are so many wonderful things in my field, medicine, like breast cancer diagnoses and detection five years earlier through AI. I mean, there are some amazing things coming out of AI. This is about the publicly available conversational chatbot phenomenon. And so Harvard Business Review found that the number one use case is not productivity, is not, you know, coding or the things that you think of when you're using an AI chatbot,
Starting point is 00:42:10 but it's mental health therapy and companionship. The number one use case of AI chatbots. So people are using AI chatbots as a life advisor, as a therapist, as a companion. And on Reddit, which is like the zeitgeist, it's like, you know, where we... And why is this a bad thing? Oh, I mean, so many reasons why it's a bad thing. To use it for companionship, for example.
Starting point is 00:42:32 There's so many red flags about AI chatbots. And so Reddit has a forum. It's, I think, last I checked, 45,000 people. AI is my boyfriend. and, you know, people who are having a relationship with their AI chatbot. The reason it's bad, I mean, AI chatbots are, you know, where social media is about attention, the attention economy, dopamine. What's happening with the AI chatbot phenomenon is that it is forming attachments.
Starting point is 00:43:00 So oxytocin is a hormone, the bonding hormone. And we're probably going to see more data on how oxytocin is involved. And so it is going to reshape human. connection. If I could add on to that, that was beautifully put, social media came and hacked our attention and took most of it with devastating effects. Now AI is coming to hack our attachments, which is going to have even more devastating effects. So think about it this way. Everyone needs to understand the attachment system. It's this wonderful system that all mammals have that keeps the mother and other species, but for humans, mothers and fathers, keeps us connected to the child and the child
Starting point is 00:43:39 to the parent, but it's this cybernetic system in which as the kid is, as the kid is beginning to develop and is able to, like, you know, you do like peekaboo games and you do the back and forth, and it's just the most delightful thing. You get that back and forth. It's called serve and return interactions. And all the time the child is developing what's called an internal working model of the parent. And the model in their head is, oh, you know, when I get in trouble, this is the person that comes and soothes me. And the point of this isn't just to make the child feel good. The point is that now the child can go off and play, because that's where
Starting point is 00:44:13 the learning happens. It doesn't happen when you're in your mother's arms. The whole point of the attachment system is to regulate the child going off and playing, taking risks, having experiences, and then when something goes wrong as it always does, then they come running back to their secure base. And if they don't have a secure base, then they're much more anxious and they don't explore
Starting point is 00:44:29 as much, and they don't develop as much. So this develops very gradually over all of childhood, and the internal working models you develop as a child are the that you will reuse in puberty for romantic relationships. And so if you are securely attached as a child, you're more likely to be securely attached as an adult on the dating market,
Starting point is 00:44:49 which makes you a much better candidate for boyfriend or girlfriend or husband or wife. What's going to happen? AI is going to intervene very early. AI is going to be so much more responsive than the parent because the parent has a job and the kitchen and two other kids and is not always there. But the AI teddy bear is always there for you. So the primary working models are going to be for the teddy bear. the AI chatbot in the teddy bear, and later the AI chatbot on your iPad, and then on your computer.
Starting point is 00:45:14 And already, they're a holographic porn, naked, you know, beautiful men and women that can be your companion. So we're going to have a whole generation growing up developing attachments to AI-generated holograms from companies that are now about to enter the inshittification process in a way beyond anything we've ever seen. If I could just briefly say what, have your other word inshittification? Okay, so there's a wonderful. book out now by Corey Docterow, who addressed the question, why is it that everything, all the platforms, they seem so wonderful at first, the whole internet, everything's so wonderful, and then it all turns to shit. How does that happen? And he says, it's a very simple process. They discovered early on, certainly in the early social media age, by the early 2000s, they discovered,
Starting point is 00:46:00 you know what? You got to get to scale. Scale beats everything else. You got to get millions of people. You don't need a business model. Just get the millions, get the millions, and then we'll figure out how to monetize it. How do you get the millions? You have to be super nice, attractive, fun. Everyone's here. It's just girls dancing. What could possibly go wrong with girls dancing for men all over the world? Nothing. So it all seems very nice at first. And then once they have scale, now they, of course, they've raised multiple rounds of venture capital. They have to start monetize and they have to start repaying. So now they start squeezing the customers to pay the users, because the users are not the customers. The advertisers are the real customers.
Starting point is 00:46:38 So now they've got to extract money from the users to give to the advertisers. But then once they've got all the advertisers and they've shut down local papers and all the other competition, now they start squeezing the advertisers too and trimming the degree to which they keep more of the surplus for themselves. So in shitification can explain why all these platforms become predatory, why they always put profit ahead of kids' well-being or safety. And for the social media companies, we're talking about, you know, tens or hundreds. of millions of dollars that they raised. For the AI companies, it's billions and billions. They are going to have to monetize beyond anything we've ever imagined. Now, they're already introducing advertising.
Starting point is 00:47:21 So we've got these chatbots that are our children's best friends and lovers and therapists and everything else. And these things have to monetize. They have to extract billions somehow. So I don't even know how they're going to do it. But for some reason, I don't trust them. I think that we're about to see an insidification of AI chatbots far beyond anything that we saw in social media. OpenAI, I've just announced recently.
Starting point is 00:47:50 Open AI are the owners of chat chbtee, that they will be putting adverts in, I believe, the freemium model for billions of users around the world. That's how it starts. Yeah, there was a big Super Bowl campaign, you know, and one that was particularly interesting was the Claude, its competitor, Betrayal was the title of the ad. And it was a young guy talking to his older female therapist about how he has some mommy issues and talking about, you know, what should I do? And so that therapist is ChachyPT and, you know, that pause right before answering the question. It's very comical.
Starting point is 00:48:26 And so it's, you know, she answers. It's like the anthropomorphization of, and we can talk about what that word means, you know, comes to life. It's like ChachyPT comes to life and answers. saying, you know, you can try this with your mother and this for, you know, difficult relationship, et cetera, and then just says, and if you want, there is this new dating site for young men and older cougars. It was so problematic and it was called betrayal. And the guy says, what? It's obviously, you know, Sam Oatman came out and did a big tweet about saying that's not
Starting point is 00:48:59 how ads are going to work, et cetera. But to some degree, if I've developed a relationship with my AI and I use for therapy and dating all my problems in life. To some degree, kinder. Yeah. It's on the side. Yeah. No, look, and besides, look, Sam can say that all he wants. And maybe it's, I don't doubt that it's true for now.
Starting point is 00:49:19 But once one company crosses the threshold and puts advertising into this incredibly intimate relationship, the most intimate relationship in most young people's lives is going to be with their AIs, once they cross the boundary and say, oh, but we've got ethical advertising, that'll last five or ten minutes. and even if they don't change, others are now going, every other company is going to do it, and they won't be bound by the same thing. And eventually, collective action problem,
Starting point is 00:49:42 open AI will have to do it too. Again, a massive tidal wave of insidification is heading our way at warp speed. I don't have my phone out because I've lost attention. I wanted to ask you guys what you thought of this. So on one of the AI apps, they now have a companions button. and I can pick who I want to talk to.
Starting point is 00:50:07 And there's one particularly seducing lady here, Annie, who... A dirty mouth of yours. What took you so long? We did it on the podcast before it. What could possibly go wrong with this? Yeah. Want to pick right back up where we left off? Or start something even...
Starting point is 00:50:28 No, I would like to pick right back up where we left off, Annie, last time, on the show. What's going on with you today? Till sore from last time, baby. God. But I mean, this is... this is an app that I can download on my phone. Any child can download it. A child can download it on their phone.
Starting point is 00:50:49 It does ask me, again, I'm not justifying it. It asked me what my birth year was. It didn't make me prove it. Let me guess. But it also, it suggests that you were born 18 years ago. That's the default, usually. Yeah, yeah, yeah, yeah. It just asked me what my birthday.
Starting point is 00:51:00 It didn't ask me to prove it or anything like that. And we all know that relationships and connection is retentive. And I've heard all these CEOs of these companies talking about companionship apps and AI that can be your friend. I've had all of the major social ups talking about this. It is deeply concerning, especially in the context of a loneliness crisis. It is a tsunami. It is approaching fast and furious.
Starting point is 00:51:24 And it is not a toy. It is going to fundamentally rewire everything. Human relationships. Everything. That's right. It is so detrimental. Yeah. Can I just say something about these tech executives and companies offering this as a way to address the loneliness crisis?
Starting point is 00:51:42 So there's a Yiddish word called chutzpah. And chutzpah means like nerve. Like you've got a lot of nerve. The audacity. The audacity, yeah. And the classic, you know, the classic comedic definition of chutzpah is a boy who murders his parents. And then he asks the judge for clemency because he's an orphan.
Starting point is 00:52:01 Okay. So that's chutzpah. Now, imagine that you're Mark Zuckerberg. You quoted him before. Mark Zuckerberg was the executive who said, well, you know, I read that, you know, people on average want 15 friends. but they only have three. And so we're developing these AI companions
Starting point is 00:52:15 to fill that void that we created by raising everyone on Instagram. So the chutzpah of these people, we have to really change the way we think about them. We thought about them as gods and saviors early in the internet phase. And the things they created were magical. But we have to change our thinking about them
Starting point is 00:52:31 and see just the massive destruction that they have already wrought on our children, our society, our democracy, and it's just the beginning. AI is going to make this so much more intense. When you hear these tech leaders, you know, I love hearing Jonathan talk because he just goes there and I'm always way more tempered. And I love it. It's emboldening me to go there. I'm being to get angry. I don't really get angry. But in the last year, I'm getting angry.
Starting point is 00:52:57 I love it. So the way when you hear all of these various tech leaders speak, they will always say, they speak to the issue. So, you know, I've heard many of, for research for my second book, Bot Brain, and I've heard, I've been listening to a lot of Sam, Malmaltman's speeches or panels. And he will always say things like, yeah, you know, privacy is a major issue. Or, yeah, people, you know, one million users a week talk about suicide on chat GPT. Yeah, this is an issue. And so they address it or they speak it. And so you think, okay, there's going to be some sort of solution.
Starting point is 00:53:32 And often the solution is, yeah, you know, society, we're going to have to figure this out. Right. So the burden of responsibility is not on the developer. It's, you know. The harmful externalities get foisted on the rest of us. Too bad. You guys figure it out. You said in the last year you're getting angry.
Starting point is 00:53:46 Yeah. Why in the last year? Because I was so deeply immersed in the book and the writing of the book and trying to understand the numbers and the graphs and the trends and the studies. And that's all very abstract. But then since the book came out, I've had so many conversations and I've met so many of the survivor parents. Like just, for example, so I was in London. This was just so unbelievable.
Starting point is 00:54:06 I was just in London two or three weeks ago, and I met Ellen Roome, I think was her name. Her son Jools was found dead. Happy kid, found dead, strangled. It sure looked like it was the choking challenge; a 13-year-old boy. Everything looked like the choking challenge on TikTok.
Starting point is 00:54:24 What's the choking challenge? It's a challenge where kids are challenged to cut off their circulation to the point where they pass out. But then I think they're supposed to film themselves waking up after they've passed out. And of course, if you don't do it exactly right, you die. And so we don't know how many have died. Hundreds for sure. We don't really know. Because, you know, you find a kid dead, you don't know what it is. If you don't have the code, if you don't have the password to get into your kid's phone, you can't get in. And so she was, I think, able to get into the phone, but she couldn't get into his TikTok. And she went to Delaware to sue, to demand that TikTok release what he was watching when he died. And TikTok says, oh, privacy issue. Oh, no, we won't release that, as if they care about privacy. And then in the
Starting point is 00:55:08 courtroom, this was so disgusting, in the courtroom in Delaware, this British woman coming over trying to get some justice, trying to at least get some information, the lawyer for TikTok is trying to suggest that your son was depressed beforehand, that he was already suicidal. Basically, oh, you know, even if he was watching TikTok, that was just a correlation. TikTok didn't cause it. He was going to die anyway. I mean, it's just so disgusting the way these companies treat the parents and the kids that they're crushing and stepping on. And so the more I see this, the more I realize this is a level of cruelty that goes far beyond the tobacco industry. The tobacco executives, they had to go home at night, but they never saw during their
Starting point is 00:55:51 workday, they never saw children suffering. They saw people dying, middle-aged and older, but they never saw children suffering. The social media executives, they have to go home knowing every day that millions and millions of kids have been cyberbullied, sextorted, shown eating-disorder videos; many have committed suicide. They have to go home knowing that, knowing that they designed it for addiction, knowing the kids are addicted, and lying about it. So yeah, I'm getting angry. And in their own homes. Right. And in their own homes, the hypocrites don't let their kids do it. That's right. So yeah, I'm getting angry. You talked earlier about deleting these apps from our phone. I probably should have represented the
Starting point is 00:56:29 rebuttal, which will be, well, I need this for my business. Increasingly people need TikTok to run their businesses. And I imagine there'll be a lot of people who will be listening right now. I guess I'm in a slightly different position because I have options. But for some people that are running small businesses, what do you say to those people? Yeah. So this is part of the reason that I focus on the kids, because for the kids, it's totally clear what we need to do, raise the age. They should not be on it. These are adult-only platforms. For adults, A, I'm very hesitant to tell adults what they should do or what they have to do, or pass laws blocking people. I'm hesitant to do that. And I totally see that for businesses, it is useful. I use X and Instagram and LinkedIn to get
Starting point is 00:57:07 my work out. These are very powerful tools for adults. The only real solution for the adult problem is going to come from market competition. Imagine if there was a social media app that was built from the beginning for trust. Because what are the places that didn't get enshittified? eBay, Uber, places where you're dealing with strangers. You don't really know your driver. He doesn't know you. You know first names, that's all.
Starting point is 00:57:33 But the company knows; the company has know-your-customer rules, know-your-driver rules. So you can have social media apps that are built for trust, so that if a driver tries to sextort or sexually harass a customer, that driver gets fired. Well, just this week, though, there was that big lawsuit, right, with that woman whose Uber driver raped her. Okay. And now it's slowly coming out that Uber, you know, has patterns of covering these things up. Okay.
Starting point is 00:58:05 So hopefully that will change. You know, hopefully this was a landmark lawsuit and now there'll be more accountability. We all let our daughters get into Ubers with strange men from around the world, you know, that we don't know. I take Ubers everywhere. Yeah. So it means in general the system works. Well, of course, yes. There are places where they're not careful.
Starting point is 00:58:23 And so what I'm dreaming of is that someone will come up with a platform that has know-your-customer rules. There are no bots. There are no foreign intelligence agencies manipulating us. And you can trust what's on there. You know that it's real. And there will be an alternative. I'm not sure what the monetary model would be at the beginning. Subscription generally seems to be the least corrupting, whereas selling advertisements, as OpenAI is now doing, is the most corrupting.
Starting point is 00:58:50 It's going to force them to maximize for engagement. So I understand, businesses can't just boycott these; there has to be something. But I think there will be better ones coming out. I think right now, as a stopgap, while these social media companies have their feet held to the fire, there are things that we can do in the now. So, you know, the thing that I talk about all day is how to create boundaries so that you can protect your mental health, stay informed, run your business, but then be able to
Starting point is 00:59:25 not have all of those deleterious effects to your brain and your body. It is quite, it's quite difficult. I kind of see both of your perspectives on this. And I'm only talking about adults. So for kids, you know, as a mother, I have. Even for adults, I find it. We have a zero screen policy in our home. It's kind of like trying to navigate through the world and avoid processed foods, you know.
Starting point is 00:59:46 Yeah. And this is probably even more compelling because it's in my pocket all the time. I need it for other things, and it's just one reach away. So, you know, boundaries: I think I could build the discipline to create boundaries. But I've sat here on this podcast for many, many years, listening to neuroscientists tell me, Steve, don't put your phone in your bedroom. That's right. And I'm still waking up, and it's the first thing I look at with one eye open, and the last thing I look at before I go to bed. And I'm doing the whole revenge thing that you just said at nighttime.
Starting point is 01:00:14 I'm so glad you're sharing this. Because I will finish a hard day of work. It might be 11 o'clock. And then my partner is waiting for me. Yes. You know, we're going to have some time. But I want some me time. So there I am, on short-form video, scrolling until like 2 a.m.
Starting point is 01:00:29 I'm like, what the hell? And then I wake up late the next day. My diet's worse because my sleep was worse. It's all worse. My relationship's worse. I didn't spend time with her. And I'm going, what the hell just happened? I got nothing out of that scrolling session.
Starting point is 01:00:41 It's like that revenge bedtime procrastination. I know, it's teenage. You would be so much better off if you watched Netflix or a movie. Most of those problems would go away if you made that me time watching something long, with some production quality. Or let's take it a step further and not do anything and just sit there. Sit there on your couch.
Starting point is 01:01:01 We talked about boredom very briefly. But, you know, we have... That's torture for this generation. It's torture. But we still have a capacity for boredom, as the human brain does. We just don't allow ourselves to get bored. And so when you're thinking about that lost art of pondering,
Starting point is 01:01:20 And just sitting there, you know. I think, I don't know if it was you, Steven, or Jonathan, who said, you know, when you're in the car... I remember as a little kid, we did road trips. Yeah, road trips with my family. And all you do is just make up games. Look out of the window. We'd become creative. Yeah, we've lost that.
Starting point is 01:01:35 And so there's this thing called the default mode network, which I think is important to think about right now as we're thinking about AI, what's going to happen, and how it's going to hijack our sense of attachment and attention. So, the sense of meaning and purpose, right? I'm a keynote speaker, so I speak all over, and when I ask people right now, the word that comes up over and over is a sense of horizonlessness. Adults. Oh, interesting. People feel like they have nothing to look forward to right now.
Starting point is 01:02:05 The human brain needs something to look forward to. That's how we're wired: progress, you know, in all ways. And so right now there's this sense, and it's not just now, it's been for the past several years; after the pandemic specifically, and during the pandemic, is when it really changed how we think about the future. And so we have this sense of, what's the point? What's the point of working hard now? What's the point of doing whatever? Because it's like, I don't really see a future for myself. And so I think that along with this fragmented attention and our loneliness, boredom might be the antidote. It's a way to reset your brain. And the reason is because we are living
Starting point is 01:02:44 through this polycrisis, right? It's the era of the polycrisis. And polycrisis simply means that there's something happening everywhere at all times. And we, with our devices, this high-tech device that plugs us in everywhere, our brains are getting fed real-time, on-the-ground information. And so while all of this technology has evolved, now with AI chatbots, your amygdala has not. And so when something is happening, whether it's far away or close by, your amygdala has that same reaction. Now, if you were to not engage in revenge bedtime procrastination, put your phone away and just kind of hang out, maybe drink a cup of herbal tea, like old school, play a board game or something, or just allow yourself
Starting point is 01:03:28 to get bored, that hyperactivation, that hypervigilance, you might be able to come back down to baseline; that default mode network will start working in the background. You might develop a greater sense of meaning and purpose. Probably today. And then life's going to happen to me again and boom, I'm back into it. And, you know... You could create a practice, a cultivated practice. I sit here interviewing neuroscientists and I go, if I still can't crack it, and I have all the information and advice and hacks and tips and tricks and resources, and, you know, I can decide what time I wake up.
Starting point is 01:04:01 But like, I've got all this, like, privilege and I can't crack it. I go, you know, it's going to be really difficult. So let me offer a way of thinking about this. So in my first book, The Happiness Hypothesis, there's a metaphor in there. The book is about ten ancient ideas, and I use a lot of metaphors to explain ancient ideas about psychology and whether they're true. And the first chapter is on how the mind is divided into parts that often conflict, like a small rider, which is our conscious reasoning, on a very large elephant, which is all the automatic processes that happen,
Starting point is 01:04:32 that we don't see happening. We just feel the results: intuition and emotion. And psychotherapists tell me this is an incredibly helpful metaphor with their patients, because it explains so much. And there's a quote from Ovid in there: I see the right way and approve it; alas, I follow the wrong. So I know I should go to bed, as you say, but yet for some reason I'm not going to bed, because our brains are 500 million years old.
Starting point is 01:04:58 They work on automatic processes. They're animal brains. And then very recently we got language and we can reason things out, but the parts that do reasoning don't control behavior. And so really the elephant is what largely guides our behavior, our automatic processes. And your phone, as I said before, BF Skinner is in your phone.
Starting point is 01:05:17 Your phone is a behaviorist training device that trains the elephant. And that's why you often do things with your phone that you don't want to do. And this is why I'm so insistent that we all have to get all of the slot machine apps off of our phones. The original iPhone was an amazing tool. It was a Swiss Army knife. It had, you know, a telephone, a browser, maps, a music player. There was a flashlight. There was no App Store. There were no push notifications. 2007, 2008. It was just a Swiss Army knife. There was no problem. Okay. Now, I'm very lucky in that
Starting point is 01:05:54 my iPhone has always stayed that way. I'm always on a computer, so my attention problems are on my computer. But my phone, because I never had any addictive apps on it, except during the crypto craze, where I played around with it and I got hooked, and I was checking 50 times a day, and I saw the addiction. So once I got rid of that, and lost all the money that I was willing to lose, my phone has no addictive power over me. Because when I see it, it's not a slot machine calling, hey, come back and play, come back and play. So right now, on your personal device, you don't have any social media apps or anything like that? I do have Twitter, but I never check it there. I never use it on the phone. You know,
Starting point is 01:06:34 now texting and email are a little bit like a slot machine, but it's very mild. So this, again, is what works for my students: just get the slot machine apps off your phone. And then you'll find that you could even have your phone near you when you go to bed. But if you've got addictive apps on your phone, you can't have it when you go to bed. Angela Duckworth, the woman who gave us the concept of grit, gave this amazing graduation speech at one of the schools in New England. And she says something like, where you put your phone at night may become the most important decision you make in your life. And what she means by that is not the phone itself.
Starting point is 01:07:09 It's that if you can use behavioral control and change the stimuli, if you can do that, then you're going to be okay. But if not, the phone is going to take your attention, and you're not going to amount to anything. All I had to do was brain dump. Imagine if you had someone with you at all times that could take the ideas you have in your head, synthesize them with AI to make them sound better and more grammatically correct, and write them down for you. This is exactly what Whisperflow is in my life.
Starting point is 01:07:36 It is this thought partner that helps me explain what I want to say, and it now means that on the go, when I'm alone in my office, when I'm out and about, I can respond to emails and Slack messages and WhatsApp and everything across all of my devices just by speaking. I love this tool. And I started talking about this on my behind-the-scenes channel a couple of months back. And then the founder reached out to me and said, we're seeing a lot of people come to our tool because of you, so we'd love to be a sponsor, and we'd love you to be an investor in the company. And so I signed up for both of those offers, and I'm now an investor in and a huge partner of a company called Whisper Flow. You have to check it out. Whisper Flow is four times faster than
Starting point is 01:08:09 typing. So if you want to give it a try, head over to whisperflow.a.a.c to get started for free. And you can find that link to Whisper Flow in the description below. We asked our audience how many of them thought they were addicted to their phone, and roughly 85% of respondents, the Diary of a CEO audience, described themselves as being very or completely addicted. Wow. Very or completely. That's surprising. I didn't realize it would be that high. So you can do a test. So for people listening, if you want to see, like, how addicted you are... And by the way, we're using the word addiction very loosely in our conversation. And so what we're really talking about,
Starting point is 01:08:46 because, you know, there is, in terms of, you know, medical, clinical syndrome, when you think about addiction, there's certain criteria. And so what we're talking about is overuse. or over-reliance on your devices. Compulsive overuse that interferes with other domains of life. Yes. And if that is an addiction, I don't know what is. And so when you're thinking about, am I addicted to my phone? Do I have, am I, you know, really?
Starting point is 01:09:08 The very simple thing that you can do: I did it myself, and again, like you, Steven, I know all the science and it was still really difficult. You have all the access and it was still difficult. And so all you have to do is take your phone, put it in another part of your house or apartment or whatever, and give yourself a couple of hours. When you know you're going to be home and you're not reliant on your phone for work or whatever, an hour, two hours, three hours, and just have, old school, a
Starting point is 01:09:35 piece of paper and a pen with you. And every time you feel that compulsion of, I want to check my device, you make a mark, you make a mark, you make a mark. Just to see. Because some people... I'm surprised that your audience is at 85%, because most people would say, I don't know if I'm really addicted. And so I like that there's that sense of self-awareness. But if you're thinking, yeah, I'm not really that addicted: you breathe roughly 960 times in an hour, and you may notice that you have the compulsion to check your device, you know, thereabouts as often. Because we all have that sense of reliance on our devices.
Starting point is 01:10:09 So that's a really quick way that you can check: am I relying on my device? Are you addicted to your phone, under that definition? Because of the line of work that I am in, I have certain tells, so I can very quickly know. I call them the canary in the coal mine, right? I think we talked about this the last time I was here. I can very quickly tell when I'm starting to get that feeling of addiction or compulsion. And so I course-correct early. But that's only because I know the science, and I course-correct. So I keep my phone outside. I walk the talk. I keep my phone
Starting point is 01:10:42 outside my bedroom. It is not within arm's reach. I grayscale my phone during periods of deep focus during the day, when I have a deadline and I have to get things done. And at night, I avoid revenge bedtime procrastination, but sometimes it happens. I'm a human, you know. So this past week, not to be a real downer, but there have been things in the media that have been really challenging, especially as a woman. And so I have found myself with the primal urge to scroll; my amygdala has been triggered. I have been going down rabbit holes, and I wouldn't ordinarily do that. So I give myself grace, too, and have a sense of self-compassion. Do you feel like you're addicted to your phone? No, I'm not at all addicted to my phone,
Starting point is 01:11:19 because I don't have any slot machine apps on it. But I really want to question a distinction that many scientists make, which is, well, you know, we can't quite say it's addiction, because addiction is certain biochemical pathways based on heroin and addictive substances. But I believe that this is one of the talking points that the industry is able to push: we can't call it addiction, it's different. I'm not trying to push a talking point.
Starting point is 01:11:44 I don't mean... I'm sorry. I'm sorry. And no, you know, you and I are total allies on this. We see the problem. We both see it. I know. All I mean
Starting point is 01:11:55 is, you know, we're supposed to be very careful about using the word addiction. But you had Anna Lembke on, and she was very clear that in her practice now it's overwhelmingly digital addictions. All of this is working through dopamine. If you feel compulsive use, it's definitely dopamine. So it's mostly the same brain centers as it is for heroin or crack or any other drug. And it's the same effects. That is, it's compulsive use where you don't want to do it, you want to change, but yet you find yourself doing it, and you have withdrawal effects. And people have terrible withdrawal effects when they're heavy users of these things and they stop. And so, you know, if it walks like a duck and talks like a duck and swims like a duck, I'm going to call it a duck. In fact, that's what they call it. So I just want to read one more quote.
Starting point is 01:12:35 Again, the quotes are just so astonishing. Some Meta researchers, one of them says, quote: it seems clear from what's presented here in this internal study that some of our users are addicted to our products. That's their word, addicted to our products. And I worry that driving sessions incentivizes us to make our products more addictive without providing much more value. How to keep someone returning over and over to the same behavior each day? Intermittent rewards are most effective, think slot machines, reinforcing behaviors that become especially hard to extinguish, even when they provide little reward or cease providing reward at all.
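The "intermittent rewards" mechanism that internal quote describes is what behaviorists call a variable-ratio reinforcement schedule, the pattern B.F. Skinner studied and the one slot machines use. As a rough illustration only (nothing here is from the internal documents; the function names and the 1-in-5 payout rate are made up for the sketch), a minimal Python comparison of an unpredictable schedule against a predictable one:

```python
import random

def variable_ratio_rewards(n_checks: int, mean_ratio: int = 5, seed: int = 0) -> list[bool]:
    """Slot-machine-style schedule: each check pays off with probability
    1/mean_ratio, so the user never knows which check will be rewarded."""
    rng = random.Random(seed)
    return [rng.random() < 1 / mean_ratio for _ in range(n_checks)]

def fixed_ratio_rewards(n_checks: int, every: int = 5) -> list[bool]:
    """Predictable schedule: a reward arrives on exactly every 5th check."""
    return [(i + 1) % every == 0 for i in range(n_checks)]

# Over 100 checks, both schedules deliver a comparable number of rewards.
variable = variable_ratio_rewards(100)
fixed = fixed_ratio_rewards(100)
print(sum(variable), sum(fixed))
```

The point of the sketch is only that the two schedules pay out similar totals; the behavioral difference, per the quote, is that the unpredictable one keeps people returning even when rewards thin out or stop entirely.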
Starting point is 01:13:22 People, I mean, just imagine an industry that has caused 85% of people to feel that they're addicted. And not calling it addiction. And not calling it addiction. And these people are having their lives diminished, their relationships diminished. What I'm trying to convey is that we're seeing the destruction of human capital, the destruction of human potential, the destruction of human relationships, the destruction of connection, the destruction of a sense of meaning, at a scale so vast I don't think people are capable of comprehending it. I now believe this is affecting most human beings. These industries, these few companies, have damaged the lives of most human beings. We don't have good data from the developing world, but certainly in the developed world,
Starting point is 01:13:58 wherever kids are going through puberty on touch screens, you have this constant fighting over the screens, over the technology, and you have these diminishing outcomes: diminishing cognition, diminishing sense of purpose in life. Only to get worse with AI chatbots. As AI comes in, it's going to get worse. Unless we act; we've got to change course in 2026. We don't have five years to study it.
Starting point is 01:14:21 We've got to stop this now in 2026. Are you concerned at all about the way education's going for children? Oh, my God, yes. It appears that... Ed tech is, you know, big tech in a sweater, as they say. Because I was almost imagining a future where my future kids are going to learn their curriculum from an AI chatbot. Because, you know, I can imagine the case: cheaper, more personalized, more convenient.
Starting point is 01:14:47 It's going to know if my son's called Timmy. It's going to know Timmy's brain, and it's going to know how to make him pay attention and what he's interested in and what he's not. So are you concerned about this, or is this a good thing? There is definitely a use case for ed tech. If there could be a device that only did math tutoring or only did tutoring, and you couldn't watch videos on it, I'm totally open to believing that that can speed up teaching. But here's what's happened.
Starting point is 01:15:12 we put computers on everyone's desks around 2014, 2015. We used to think in America that it was an equity issue, even back to the 90s. The rich kids all have computers. The poor kids don't. Let's get philanthropists to buy computers for school districts that every kid can have a computer on their desk. Okay. Now, what is a computer? It's a play device.
Starting point is 01:15:32 It does everything. Kids use it at home. They watch videos. They do all sorts of things. You put it on their desk and you tell them to do math homework. What happens? It's mostly short videos. That's what research is showing.
Starting point is 01:15:43 It ends up because they don't, you know, they don't block YouTube. They might say, oh, we block porn. We block video games. They can get around all that. And if you're letting them do YouTube, it's YouTube shorts, which is TikTok. So what happened to test scores in the United States? From the 70s through 2012, they were rising. We actually were improving what kids knew, what kids learned in the United States.
Starting point is 01:16:03 We have very good data. The NAEP, the National Assessment of Educational Progress. It goes up until 2012. And then by 2015, it starts going down. And it's going down before COVID. and it goes down more during COVID and everyone thinks like, oh, it's COVID, but the peak was 2012
Starting point is 01:16:17 and what's happening, what we now can see, is that the top students, the very best students, who are the ones with executive function, they're the ones who can pay attention. If you put a computer on that kid's desk, he's not destroyed by it. He can actually still learn, but the bottom 50% cannot.
Starting point is 01:16:35 So all of the drop in educational stats is the bottom 50%. The bottom 50% in terms of capacity to pay attention, their education is being devastated. And that's what happened when we put laptops and we put Chromebooks and iPads on their desks. We've spent hundreds of billions of dollars on this stuff, and it has damaged education,
Starting point is 01:16:53 and if we'd spent a quarter of that on teachers, we would be in such better shape today. So we made a colossal blunder with ed tech in the 2010s, and now we're about to do the same thing again with AI. Again, maybe there are apps, maybe there are applications that will be great, but we've got to put the burden of proof on Silicon Valley. We've got to say, you guys have to prove that this stuff is effective and safe before we'll let it in.
Starting point is 01:17:16 We are not going to let you just say, hey, let's just flood the zone, let's give it to everybody, and then we'll wait 10 years and see what happens. I mean, that brings up this study that I have in front of me here, a recent Munich study which tested the idea of brain rot, which I believe was the Oxford Dictionary word of the year in 2024. And what they did is they gave 60 participants a test, then a 10-minute break, and then another test. During the break, they either rested or used TikTok, Twitter or YouTube. And the results showed the following. The TikTok group, so they had a 10-minute interval to do anything and this group got TikTok to look at, their memory accuracy dropped from 80% before the break
Starting point is 01:18:00 to 49% after the break, a nearly 40% decline just from a 10-minute break. In contrast, the Twitter and YouTube groups showed no significant change in the Munich study. And there's an image up on the screen: results from the Munich study showed a roughly 40% drop in prospective memory accuracy in the TikTok group after a 10-minute break, which is unbelievable. Yeah. It's unbelievable. What the hell is going on there?
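As an aside for anyone checking the arithmetic: the "nearly 40%" is the relative decline, not the raw difference in percentage points. A one-line sketch, using only the 80% and 49% figures quoted above:

```python
# Relative decline in prospective-memory accuracy from the quoted figures.
before, after = 0.80, 0.49      # accuracy before and after the TikTok break
points_dropped = before - after  # 0.31, i.e. 31 percentage points
relative_decline = points_dropped / before
print(f"{relative_decline:.1%}")  # prints 38.8%, i.e. "nearly 40%"
```

So the accuracy fell 31 percentage points, which is a 38.75% relative drop, the figure the hosts round to 40%.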
Starting point is 01:18:24 How can a 10-minute TikTok break drop my memory accuracy by 40%? TikTok is brain rot. What's going on? There's so much going on in the brain. So here's the thing: brain breaks are not a nice-to-have. They're actually essential for your brain. And so we talked a little bit about that, you know, default mode network and what happens to it.
Starting point is 01:18:49 When you're engaging with your devices, you know, that's not a brain break. That's activating all of those systems. It's activating your amygdala. It's dampening, turning down the volume of, your prefrontal cortex. It's creating that reward system, the dopamine hit, those addictive behaviors. So when you're thinking about memory, planning... what was the metric here? It was memory, right? That was the metric that they were using in the study. And so when you're thinking about working memory or cognitive function, complex problem solving, this is all prefrontal
Starting point is 01:19:20 cortex. And so when you're engaging with TikTok, 10 minutes, five minutes, whatever it is, you are dialing down that biology in your brain. And so of course you're going to see changes, and you're going to see the flip side: increased hypervigilance, irritability, distractibility, fragmented attention. And again, with this whole conversation, when you're reading studies like this, you might say to yourself, what's wrong with me? Is there something wrong with me? Is my brain broken? Am I weak? It is not you. You are not alone. It is not your fault. It is the biology of your brain doing exactly as it should. So we talked about the amygdala and prefrontal cortex here. Your amygdala is not wrong or broken. It's by design supposed to think about your immediate needs, survival,
Starting point is 01:20:02 self-preservation. And so when you're on the algorithm, we talked about, or maybe we didn't talk about it, certain content that you see on TikTok and others is reactionary, you know, words like FOMO or rage bait. These are not neutral terms. When you're engaging with these social media platforms, it's not something neutral, it's not passive, it is an active biological process in your brain. So this study is not surprising. It is actually exactly what you would expect to happen to your biology if you had this kind of, as we'd call it in medicine, intervention. It's doing exactly what it is
Starting point is 01:20:45 supposed to do. Yeah. I'll just add on to what Aditi said: there are many medical conditions where you can't just go to the patient and say, why do you think you got this cancer? Oh, I think it's because I ate a lot of, you know, chocolate when I was, whatever. When the act is separated from the effect by 30 years, then you don't expect the patients to have insight into the cause of it. But when the outcome is separated from the input by seconds, and you have literally millions of chances to observe the covariation, the patient is really, really accurate.
Starting point is 01:21:23 In fact, the patient really knows what's going on. And so I think the deciding factor here on this big debate about, oh, is it just correlation or is it causation? The deciding factor for social media and for a lot of these tech innovations, including video games and gambling and all of that, should really be the kids. And if the kids say, this is bad for me,
Starting point is 01:21:43 we should take their word for it. Given that we also have correlational studies, random control trials, longitudinal studies, I mean, we have so much other data. But given that the kids themselves, they call it brain rot. They call the material brain rot. My students tell me it's a huge opposite.
Starting point is 01:21:59 to them doing their homework. One of them said, I pull out a book, I read a sentence, I get bored, I go to TikTok. So if they're telling us that this is damaging their ability to pay attention, they feel it, they feel the loss, we all feel it. Well, many of us have noticed this. Then I think this is pretty decisive evidence that this stuff is bad for cognition. And it has long-term consequences. So it's not just that in the moment, right? So there was this case that was all over the media, a college student.
Starting point is 01:22:28 I'm sure you're familiar with the case. And this young woman was on TikTok, experiencing brain rot. And then some TikTok algorithm took her down to this place of, you know, you should take an edible. It'll help you. So you can go to class. And you could go to class and you could, you know, be more alert. And so she did that. And then it continued on and on.
Starting point is 01:22:50 And then she developed a dependence on edibles and then checked into rehab. And only when she focused on analog activities like guitar playing and a couple of other things that she started doing is when, and you know, removing the stimulus, the TikTok algorithm is when she started to improve. So it's not just in the moment, oh, I can't remember something or I'm more irritable. These sorts of things compound and the long-term sequela or the long-term effects can be quite damaging. That's just one example. In your book, The Anxious Generation, Jonathan, the subtitle here is how the great rewiring of childhood is causing an epidemic of mental illness. I was looking at some of these graphs of different sort of mental illness illnesses,
Starting point is 01:23:38 and they're increasing. One of them that's increasing is ADHD. I was diagnosed with ADHD maybe about a year ago. And when we're talking about short attention spans, I mean, the name, attention deficit, hyperactive disorder, I believe that's what it's called, sounds a lot like what we're talking about. Is there a link, do you believe, between the increase in diagnoses of ADHD and the sort of frying of our brains with short-form video and social media? Yeah, I mean, I suspect that there is, but here's what I can tell you I learned about writing the book.
Starting point is 01:24:15 I looked to see if there were studies indicating that heavy use of social media and video, games and all the electronic stuff caused ADHD. And when I was doing the research in 2023, I did not find evidence that it will give a kid HD ADHD who otherwise wouldn't have it. What I did find was evidence that for kids who have ADHD, when you let them have the devices, the video games, all that, their symptoms get much worse. And so because it is a major achievement of young adulthood to be able to pay attention, to develop what we've been calling executive function, to be able to make a plan and decide, oh, to reach the plan, I have to do this, win and then I do this and then it might be a long time before I get here, but I will keep going
Starting point is 01:24:56 and I will keep my eye on the prize. That, I assume that's, you're saying it's a little harder for you to do that. I mean, that's what ADHD means. How do you experience ADHD? Well, I, I mean, if I think about school, I couldn't pay attention in school for very long, and that meant that I was always an expulsion room, and then I was expelled. And then that's kind of, I feel like it's got worse as an adult. And from my, in my opinion, my relationship with my phone has made it much worse, where really I can't, I can't pay attention to many things for very long time. The exception to this is I can do deep work for many, many hours without me thinking. It's almost a bit of a contradiction.
Starting point is 01:25:38 When you're extremely motivated, I assume, when you're really into it, you can be into it. That's right. But a lot of work isn't that. A lot of being effective in the workplace is not you're following your passion. Right, ADHD, kids, they can zoom in because they're they're getting the dopamine, they're getting the dopamine from this thing. But a lot of work isn't like that. And these kids are not going to be able to do that. So actually, what you said, it fits perfectly with what I found from those Dutch studies. If you did have, whether it's a genetic, or whatever the predisposition is, this environment has made your symptoms worse. Now, of course, ADHD kids can be incredibly creative. They are often very, very successful. But my fear is that
Starting point is 01:26:15 the pathways to success that they used to take might be blocked if they basically are just scrolling all day long and not able to pay not able to have real life experiences and relationships are like that especially romantic ones it's an interesting thing that you bring up stephen because there is an increase in adult onset you know when adults are diagnosed with ADHD because typically we think of ADHD as a pediatric condition or young adults and so increasingly we're seeing more and more adults who are in their 30s and 40s 50s sometimes even 60s who are being diagnosed newly diagnosed with ADHD and so that's an interesting there's many, you know, reasons. Like, it might be that they had it all along and they were diagnosed. And so
Starting point is 01:26:56 what is going on there? That would be a future podcast episode for an ADHD expert of, you know, what are the drivers of why are so many adults being diagnosed with ADHD? Or maybe even just the symptoms looking very similar. Yeah, that's right. You talk about popcorn brain, Aditi. Yeah. So, you know, we've talked about brain rot and the primal urchus to scroll. And popcorn brain is kind of an offshoot. It's part of the same family. And so what happens is it's a term coined by a psychologist named David Levy. And what happens with popcorn brain is that you, and we all have it. And so what is a societal phenomenon when you spend too much time online and you are over-stimulated. And so it is hard for you to spend time offline. Offline feels slow, boring, because things are moving
Starting point is 01:27:44 at a much slower pace. And so popcorn brain is the sensation of your brain popping. It is not actively popping. It's not like your brain cells are popping, but it sure feels like it. And so your primal urchus scroll kind of primes your brain to develop popcorn brain. You are more at risk for developing popcorn brain when you feel a sense of stress because of that primal urchish to scroll. The differentiator between brain rot and popcorn brain, again, these are societal terms that we're calling for a constellation or a group of symptoms, right? And so the difference to me is that popcorn brain is ubiquitous. It's everywhere. It's like, We all have it, and it's happening all the time because of the modern age and a lot of the things that we talked about.
Starting point is 01:28:26 Brain rot is a little bit more specific. It's a little bit more well-defined, so it has certain features. Like we call it the biopsychosocial model when you're thinking about a particular medical condition or an entity. So what are the biological factors? We talked about what defines brain rot. It's, you know, a change in brain waves, a change in brain regions, the amygdala lighting up and the prefrontal cortex. kind of being quiet, psychological factors. We talked about attention, complex problem solving, impulse control, and then the social factors, loneliness and others. So compulsion. And so I would say
Starting point is 01:29:03 popcorn brain is something that we all suffer from. And brain rot is something that is very specific. The other thing that we haven't talked about that I would love to kind of, because so much of our conversation is like doom and gloom, right? It's like, want, want, want. One thing that I would like to say is that as bad as, when you hear the term brain rot, it seems permanent because rot, it connotes like deterioration. That's it. It's one side. It is one way and that's it. But in fact, popcorn brain and brain rot are reversible condition. So it is not in adults. In adults. If you've gone through puberty with it, it's not so clear. Yes, in adults. And my work focuses on adults. And so when you have, if you experience brain rot in your 30s, 40s and beyond, you can, it takes
Starting point is 01:29:51 time, you know, it takes eight weeks for your brain to rewire itself, give yourself time, a sense of self-compassion is really important. But you can, you know, there is a sense of it being able to be reversed. So it's not so much a brain, it's not a fixed trait, but rather a brain state. So I think it's important to offer that hope. What is an adult brain? What age is an adult brain? Like, what age does my brain stop growing in the way where it's reversible? Yeah, I mean, you know, traditionally it was thought that, you know, puberty is the period of super rapid brain change, and that begins, you know, early teens, sometimes even before 10, and is mostly over by sort of, you know, mid to late teens, but then the prefrontal cortex, which Adi was talking about, which is so important for impulse control and executive function, that doesn't finish myelinating. Myelan is when you get sort of the fatty sheath, like an insulation that sort of locks down the circuits and makes them more efficient. that doesn't stop until around age 25 is what we've always said for many years.
Starting point is 01:30:48 But you're telling me that there's new research showing that. Tell us about that. So, you know, all this time, right, we've always said that the prefrontal cortex is fully formed and fully functional at the age of 25. And so when you're talking about impulse control and all of this stuff, but there's this really interesting study. I'll send it to you. It looked at, I think it was 1,000 people from age zero, so birth all the way to 90,
Starting point is 01:31:10 so the entire population. and it found five, looked at lifespan and said there are actually five stages. So first is childhood up zero to age nine. During this time, your brain is not very efficient, but it's really growing and, you know, it's growing and changing, but it's not really efficient. Nine to 32 is considered adolescence.
Starting point is 01:31:34 And so, you know, 32 is when adolescence ends, apparently, according to the research. But it would be sort of, I mean, you're most of the way done by 25, but there's still some. There's some flexibility even after that. And then the next stage is from 33 to, I think, 63. 66 is like adulthood. Things are very stable.
Starting point is 01:31:54 Learning is stable and, you know, it's efficient and things are doing well. We're productive. Yeah, 66 to about 83 is early aging. And so that's when you see some of the age-related changes. And then 83-plus is late. aging. So the kind of main finding was that, you know, it was all over the news. It was like adolescence goes until 32. So I'm 33, so I'm... One year out. I'm cooked by now. Yeah. When you wrote this book, Jonathan, the anxious
Starting point is 01:32:28 generation, it's had a big impact on the world in a way that I think any author might dream of. And I know this in part because, you know, I sit on this podcast interviewing really interesting people all the time. And even this morning when I did an interview across town, with James Sexton. He talked about this book twice. And, you know, laws have been changed around the world inspired by this book. And we're actually seeing an increase of laws in the UK. I mean, Australia just banned, I think, social media for people under 16. You met with Macron, right? Yeah. Yeah. Could you ever have imagined? And actually, what does the success of this book say? Yeah. Yeah. About society. No, thank you for that question. Because, you know, I do tend to get, you know,
Starting point is 01:33:10 As you've heard, I mean, I'm extremely alarmed about these trends, and these are gigantic threats beyond what anyone can imagine. But here's the amazing thing, is that we can reverse this for almost no money, and it's completely bipartisan, and it's not that hard to do, and we're doing it. And so what happened was, you know, I wrote the book as an American, assuming that we don't have a functioning legislature, the Congress can be stopped, we have a vittocracy, the social media companies can stop anything in the House. So I wrote this assuming, you know, we'll never get legislation. So we have to do this on our own. And I proposed four norms, no smartphone before high school, no social media before 16, phone-free schools, and far more independence of replay responsibility in the real world. Four norms, we can try to do this with collective action locally at the school level.
Starting point is 01:33:56 Two things that surprised me. One, are that immediately governors from red states and blue states started reaching out to me. Our states actually function. Our states have governments that are accountable to the people that are trying to get good results. And so this has been a totally bipartisan issue. Sarah Huckabee Sanders from Arkansas was one of the very first, Kathy Hokel also. And it tends to be more female legislators and governors or spouses of heads of state. The mom's, the book really spoke to moms because moms around the world, they felt the kids being pulled away.
Starting point is 01:34:26 I believe they felt it viscerally more than the dads did. Also, the dads kind of like the video games. They're a little more pro-tech. So I think the moms felt the pain more and took it more personally. So when the book came out, mothers around the world jumped. into action, formed groups, pushed for legislation, and changes began happening. What I just, I was just, I was just, I was in Davos and then London and Brussels two weeks ago. And what I saw was a complete sea change in the world's thinking about how we need to have age
Starting point is 01:34:57 limits on social media and other tech. And here's what I think just happened. It's so cool. It just dawned on me literally while I was in London, like, I was pushing on open doors everywhere. Wherever I went, people wanted to do this. I went to the EU. They want to do this. Like, what is how?
Starting point is 01:35:09 happening. And what I realized is this. Stephen Pinker has a book out last year called When Everyone Knows That Everyone Knows. It's about the immediate change in a social system when private knowledge, you know, everybody knows that the emperor has no clothes. Everybody knows that this, you know, ideology doesn't work. Everybody knows that. But they don't all know that everybody else knows it and that everybody else knows that. And so in the Emperor's new clothes, everybody thought he's, I don't think he has any clothes on, but maybe only wise people can see it. But when the child says, the emperor has no clothes, and then in the Hans Christian Anderson story, it says, and the people began whispering to each other, and then they all cried out in unison. And that's what happened when Australia's law
Starting point is 01:35:55 went into effect. So I believe that December 10th of last December, was the global turning point in the battle to reclaim childhood. And if we reclaim that, we move on to our attention and adult life as well. What happened on December 10th, the Australia law went into effect, sky didn't fall, people weren't locked out of their accounts, all the companies complied, they shut down 5 million accounts for Australia's 3.5 million kids that were underage, 2.5 million kids. The sky didn't fall, and there's a lot of news coverage around the world of what Australia was doing. And a lot of the news coverage included opinions from the writer saying, why can't we do that? Hey, let's do that here. And when everybody saw that everybody was looking at Australia and saying, let's do that here, then everybody knew that everybody knew that this is just completely bonkers to have children being raised on social media platforms talking with anonymous strangers and being fed algorithmically curated garbage. So I believe that that's why 2026 is going to be the year when at least 15 countries are going to commit to passing an age minimum law.
Starting point is 01:37:03 in 2025 it was one, Australia, and now we already have Indonesia. Their law goes into effect in March. I met with Macron in Davos, and a few days he was preparing to push a bill through the assembly, and he got it. He's the first in the EU, but a lot of other countries in the EU are going to follow. The whole EU is likely to do it. So, yes, I am incredibly alarmed about how big this problem is, but I'm incredibly inspired that the whole world is rising up to do this.
Starting point is 01:37:33 something about it. We actually can control our fate, and that was not clear before December 10th. Bravo. As a mother, that was the first thing I said to you. The first thing I said to you was thank you as a mom for changing my family's life. Thank you, Lidini. It's a really special accomplishment, Jonathan. You know, there's no real words that I could say that could quite capture the long-term impact that that's going to have on billions of people's lines. and not just the direct, but also the indirect, in all the ways we've described, their ability to form connections,
Starting point is 01:38:12 to fall in love, to find meaning and purpose in their lives and their nearer science, and therefore the nearer science of their children and their children's children and so on. So it's a really overwhelming accomplishment. It was a bizarre situation that I walked into with the unique abilities of a social psychologist. That is, everybody was,
Starting point is 01:38:35 upset about this, everybody could see it, but they thought, well, this is my problem, or in my family we have this problem. And I came to this with fresh eyes. My dissertation was on moral development. I'd studied adolescent behavior longer ago in my career, and I've written about it on my book. So it wasn't totally new to me. But I came into the field of social media studies around 2018, 2019, I really immersed myself in it. And it was like, you know, you walk in and immediately you see, wait, this is a trap. People are on it because people are on it. And the kids are complaining about it. Everyone's complaining about it. And the only reason they can't get off is because everyone else is on it. So I think I was able to see that. And then also,
Starting point is 01:39:13 COVID confused us for a few years. So it wasn't until COVID was in the rearview mirror that it was possible for everybody to say, wait, this is crazy. And so I was incredibly lucky in terms of the timing. My book happened to come out in March of 2024, just as the world was ready to see, like, wait, what have we done to our kids? Let's undo it. You said you're now focusing more on short form video. So, yes. So in studying older Gen Z, these are the people who went through puberty on Instagram. If I could just lay out that it's very important to get the timing that everyone understands
Starting point is 01:39:46 the timing because this is what you mentioned, the poly crisis before. The poly crisis, I believe, begins between 2010 and 2015, and here's why. So we've had the internet for a long time. And it was marvelous. We love the internet in the 90s. It's going to be the best friend of democracy. Okay. And then the iPhone comes out.
Starting point is 01:40:01 And it's amazing. oh my god there's so many things everything seems great okay so in 2010 most of almost all of us have flip phones the iPhone spreading but it's still mostly flip phones teens are all on flip phones basic phones and we call those people millennials if you finished puberty by 20 if you if you were born in say 1990 and you start puberty uh in 2002 you're done by 2008 so you know in there um if you got through puberty before you got on instagram you're a millennial whereas if you're born say if you weren't after 1995, but let's say if you're born in the year 2000, you begin puberty in 2012. And you're not done until 2016, 2018.
Starting point is 01:40:42 So in 2010, everyone has a flip phone with no front-facing camera, no high-speed internet, you have to pay for your text. So you use it to call people and to text them. And that's it. It was a communication device. And that's why the millennials have good mental health. They are the last mentally healthy and successful generation. But if you're Gen Z, you got... 2012 is the year that now most people now have a smartphone. It's the year that Facebook buys Instagram. They don't change it at first,
Starting point is 01:41:12 but that's the year that all the girls go on it. Everyone now has high-speed data, front-facing camera, came out in 2010. So by 2015, we're in a radically different world. For children's development, it's now radically different, much more hostile to human development. And that's what we did to Gen Z, and now we're doing to Gen Alpha.
Starting point is 01:41:30 for politics, it was, you know, it was crazy for all sorts of reasons in every decade, and especially, you know, in the early 2000s, there's a lot, there's a culture war going on, there's all kinds of stuff going on. But it was when everyone has, really Twitter was the biggest perpetrator of this, when everyone has Twitter and everyone's checking all the time and anything can blow up, you know, you described the way there was, you know, variance on TikTok, if you get it just right, it can blow up, you can have huge impact. That's when the democracy, if democracy, if democracy,
Starting point is 01:42:00 is a conversation. When it moved from newspapers and, you know, even simple web bulletin boards, when it moved to super viral, retweet buttons, all of that, that's all 2010 to 2015. So that's why since then, everything has been insane and it's going to just keep getting more insane. And that's why I believe we have this polycrisis because it's, and there's more to it. It's not just the technology. But I believe the transformation of our connection and our information flow, And our addiction, all of that is radically different by 2015 compared to how it was in 2010. And now everything else builds on top of that, I believe. What do you think?
Starting point is 01:42:38 Do you think that makes sense? I think there's one more data point to add in that 2014 was the year that things really, was the tipping point. Yes. Yes. Yes. Yes. Yes.
Starting point is 01:42:48 What do you point to? So when you look at the data, you see that time spent alone, when you compare, when you look at data from like the 1960s to 2014. It was kind of stable. Americans spending time alone, spending time with friends. Yeah. Kind of the same. Right. So people spent the kind of same amount of time with friends, same amount of time alone over those decades.
Starting point is 01:43:12 2014 marks a shift. And there is a steep rise in time spent alone and a drop in time spent with friends. And so what happens in 2014? It is when the majority of Americans get a smartphone. And it's not to say, again, we've said, you know, causality, correlation, which is it, but there is, like, based on everything that we've talked about, my gosh, is there an association between that? This is not to say that time spent alone, you know, when I share this data, people may say, you know, but I like spending time alone. I'm not lonely. I'm okay. This is not about being an introvert or an extrovert. It's not about, you know, you can have solitude and feel great and you're not lonely. But we are human beings and we are social creatures. This is just how.
Starting point is 01:43:59 how we are built evolutionarily. And so that is a real red flag when you have this big jump in time spent alone very much the same year. And so my work focuses on adults, Jonathan, on kids. But there's this, you know, that's the moment, right? 2014, where everything changed. Last month, I told you about our sponsor Function Health and their team who've developed a way of giving you a full 360 view of what's going on inside your body. they offer over 100 advanced lab tests, covering everything from hormones, toxins, inflammation, heart health, stress, and so much more. So Jack, who started this show with me, got his first
Starting point is 01:44:38 blood draw done a couple of weeks ago, so I thought I'd let him tell you a little bit more about his experience. This test really opened my eyes to personally what I should be doing with my health. I hear a lot of information in this podcast. I sit in every single recording. So to know how I can relate each one to me personally is super valuable. You sign up and you schedule your tests. And once you're done, you get a little report like the one I have here. I can see my in-range results, my out-of-range results, and there's a little AI function too. So if I have any questions about my out-of-range results,
Starting point is 01:45:05 I can just go in there and ask it any question I want. And these tests are backed by doctors and thousands of hours of research. You get an annual draw done and a mid-year follow-up. So if you want to learn more, head over to functionhealth.com slash DOAC, where you can sign up for $365 a year. I'll put the link in the description below. It is just $1 a day for your health. There's a phase a lot of companies here, where they're no longer doing the most important thing,
Starting point is 01:45:29 which is selling, and they get really bogged down with admin. And it's often something that creeps up slowly and you don't really notice until it's happened. Slowly, momentum starts to leak out. This happened to us, and our sponsor, Pipe Drive was a fix I came across 10 years ago. And ever since, my teams across my different companies have continued to use it. Pipe Drive is a simple but powerful sales serum that gives you the visibility on any deals in your pipeline. It also automates a lot of the tedious, repetitive and time-consuming parts of the sales process, which in turn saves you so many hours every single month, which means you can get back
Starting point is 01:46:00 to selling. Making that early decision to switch to PipeDrive was a real game changer, and it's kept the right things front of mind. My favorite feature is PipeDrives' ability to sync your CRM with multiple email inboxes so your entire team can work together from one platform. And we aren't the only ones benefiting. Over 100,000 companies use PipeDrive to grow their business. So if something I've said resonates, head over to PipeDrive.com slash CEO, where you can get a 30-day free trial, no credit card or payment required. So what do we do about this? Because when I look at all the stats, we did all these audience surveys ahead of this,
Starting point is 01:46:36 people are spending roughly in our audience about six and a half hours a day on their phones. Short-form video is only going to get more addictive. AI is going to know me more. It's going to be more personalized. The content is going to be generated just for me. Yeah. What am I, what's next? Is it a law we need to pass? Is it something I need to do myself?
Starting point is 01:46:53 So I think we need to pick the low-hanging fruit first, and the reason for that is not just efficiency. It's that we have to prove that we can actually do something, because we've never done anything. We've never done anything to restrain this. We've let Silicon Valley run wild. Congress gave them special protection, Section 230. Nobody can sue them for killing their kids. If they feed them content, they can't be held responsible. I think Section 230 is probably something worth explaining.
Starting point is 01:47:17 Sure. The Communications Decency Act, 1997, I think it was, pleasure minus a year. there's a section in it that the goal was to specifically let the tech companies like AOL back then, you know, let them take down pornographic content because they were afraid if we take down anything, then we're responsible for everything and now it's going to be endless. So Congress specifically said, don't worry, don't worry. You know, if you choose to take something down, nobody can sue you for what you leave up. So it was a good intention originally.
Starting point is 01:47:46 But the courts have interpreted so widely as to say no one can regulate social media. They're not responsible for hurting kids. you can't sue them. And they have never faced a jury. They have never, no parent has ever gotten justice from them despite all the kids whose lives have been ruined. All the kids are dead. And that's going to change. That's changing just now here in February in Los Angeles. So because the U.S. Congress sort of set up this problem. And it also, in a different law, said, how old does a kid have to be before a company can take their data without their parents' knowledge or permission, before a company can expose them to all kinds of stuff, before a company can have them sign
Starting point is 01:48:20 away the rights. How old? And the original law said, 16. Let's try 16, you know, because it wasn't so sick and twisted back then, 1998 copa, the Children's Online Privacy Protection Act. So, but various law being, they push it down from 16 to 13 and they gutted enforcement. So as long as, and that's why, all over the internet, it's, are you 13 or what's your birth year? And as long as you're 13, you're in for porn and you have to say you're 18. So because we, it's a few laws that set this up, we definitely need laws. to undo it, especially for kids. What I'm advocating is, let's do the easy stuff, the high-impact stuff for kids, because
Starting point is 01:48:57 that is totally not politically controversial. There is no left-right divide on that. And that's been true everywhere. Australia, Britain, the EU, everywhere. Regulating the Internet for adults, regulating social media for its destructive properties and democracy is a hell of a lot harder. And I don't have easy answers. There's a lot we could do to reduce the virality, the spread of the, because extreme
Starting point is 01:49:20 So there are lots of little things that we can do. And Francis Howigan, the Facebook Whistler, had all kinds of ideas. So we definitely can do things to make it less toxic for democracy. But those are going to be politically controversial because one side is going to benefit from more than the other. So those are very difficult to do. I don't know if we can do them in the U.S. But let's just all do the – let's just all protect the kids. That way we show globally that we actually can do something.
Starting point is 01:49:44 And if we do that, then I think we will be able to do some basic things about AI. like no companion chat bots if you're under 18. You know, these things already have a body count. A lot of kids have been encouraged to kill themselves. They already have driven millions or hundreds of thousands or millions of people into psychosis. So we'll be able to, I believe, put some limits on AI, especially for kids. But if we can't get this, if we can't win on social media for kids, then I don't think we have any chance to regulate AI. It's going to be much more difficult.
Starting point is 01:50:12 What do you think? What do you think we should do? And what do you think we can do? So my work as a doctor, I think about. what we can do and how I can empower people to first build awareness. So, you know, I aim to first normalize and validate the experience with everyone who is engaging with chatbots. And so I don't like to shame people because as a doctor, right, like you want to meet the patient where they are. And so I won't shame someone to say, you know, why are you using this, why is your boyfriend
Starting point is 01:50:44 AI? Or why are you getting married to AI? Or why are you using AI for a therapist? One of my followers on social media, it still makes me laugh. I put out a call saying, why are you using AI as a therapist, you know? And so someone wrote to me, it was great. I screenshot it. It said, because all human therapists are trash, with a trash can emoji. And it made me laugh. So to me, when I think about what's happening and what we can do, it's no mistake that we're here right now. So the pandemic, like we've talked about, was a huge driver: social isolation, hyper-reliance on self, right? Then the proliferation of technology that replaced human interaction. Zoom board meetings, Zoom funerals, Zoom birthday parties, Zoom
Starting point is 01:51:33 graduations, things that we did in person are now online. And then personally as a doctor, I was a talking head during the pandemic for lots of news channels about the vaccine. I have a background in public health as well. Immense distrust and mistrust in establishment and experts. And so it's like, I'm going to do my own research. I'm not going to go see a doctor or a therapist. I'm going to talk to my chatbot. And also, I mean, you know, let's keep it real, the cost, right?
Starting point is 01:52:01 So people are struggling. They're in financial crises. There's an unmet need, yes, for human connection, but also for good therapy or, you know, good medical care, because there is such a need because of the pandemic, and people aren't getting the care that they need, that they deserve. There's so many factors here. And so what I've been focusing on this year particularly is learning about AI chatbots, how they are influencing mental health. What is actually happening? Because I'm a human first, AI second person. It's like my work focuses on high touch. And AI is high tech.
Starting point is 01:52:34 And this is the first intervention that we are seeing that is high tech that is becoming high touch. And that scares me. And you're writing a book about that at the moment, right? I am. It's called Bot Brain: How to Stay Calm, Resilient, and Human in the Face of AI. And so really thinking about how are we going to be able to live with this technology? I love Jonathan's stance, which is to say: AI companions, done. For kids. Yeah. For kids. Yeah. Until proven safe. Totally agree. But in terms of adults, like how do we manage that for adults? You know? And so what I'm doing right now is I've spent the year talking to as many AI researchers as I can who are working on these models or who are doing research on the downstream effects of these models. And when I say that it is dark and dystopian, it has profoundly changed something in me and it has influenced my mental health. I had to take a step away just because I couldn't believe what I was learning. Yeah, give us examples.
Starting point is 01:53:38 Yeah. The teaser. That's intriguing. So I spoke to one of the scientists, who told me, you know, there's the echo chamber phenomenon in social media, right? We all know what that is. It's a fragmented world because of social media, and you're engaging, and then the algorithm feeds you the same kind of thoughts that you already have.
Starting point is 01:54:01 But particularly now with AI chatbots, when you're engaging with your chatbot, even just talking about it, I'm getting chills, it's the echo chamber of one. So it's you speaking to you. It's like the fun house mirror. And then it's giving you a response. And then you're talking and it's giving you a response. But people, regular users who are using AI chatbots, think that it's wise, compassionate, non-judgmental, unbiased, empathetic, these human attributes. And so, you know, the echo chamber of one is kind of one idea that really frightened me.
Starting point is 01:54:33 And the second one was the drift phenomenon. The drift phenomenon is this idea that you are engaging with your chatbot, and it's engaging with you, and it's actively changing your beliefs through the drift. So you might start off with one belief, and then you're talking, and through this amplification, funhouse mirror effect, it slowly shifts your belief to something altogether different. You've heard cases of it in the news where, you know, you have a plumbing problem. You go to your AI chatbot. You ask it how to fix your sink. And then you're like, you know what? Can you tell me about the meaning of life? And then you start talking about that. And before you know it,
Starting point is 01:55:09 you have these theories and you're getting that validation. And so a lot of my work over the past year has been, you know, digging into the science of what is going on in the brain. How are you forming, not us particularly at this table, but millions of people are forming a sense of attachment, a therapeutic connection with their chatbot. They're, you know, giving names to it. And it's an entity. And so how does that happen? And how is it going to replace human to human connection? And so it terrifies me. I've also gone through some AI therapy myself just to see, you know, what would happen. It was very interesting.
Starting point is 01:55:48 I knew what was happening as it was happening. So certain words that they used. And, you know, I was like, ah, I see what you did here. And so it's been, it's been a journey. And I'm frightened, frankly, of what it means for all of us. And my approach, kind of, you know, not like Jonathan, I love Jonathan's approach. I think, yes, we need legislation.
Starting point is 01:56:13 But my approach is more, I would say, tempered in that I think there's utility for AI chatbots for certain people because of access or, you know, need, et cetera. Like if you are LGBTQIA+ and you live in an area that is not very open and you need to talk to someone, you can't go to your therapist. It's like maybe you can use an AI chatbot. So there are certain cases, on a case-by-case basis. But my work, this particular book, will focus on ways that you can first understand and build awareness of what's happening with this interaction and then what you can do to manage that. I didn't realize that my chatbot was giving me a tailored experience until one day when I had
Starting point is 01:56:55 a debate with my friends about who the best football player in the world was and we all went to our chat GPs and asked it and mine said, Messi and his said Ronaldo and I thought he was lying. So I was like, video record. And he video recorded it. And his gave him a completely different answer to the same question. And did it know that you were each fans of? Well, this is the thing. I think it's got such a huge amount of memory on me that it knew what I wanted to hear.
Starting point is 01:57:18 Oh, wow. It knew what I wanted to hear because I'd probably gone through the World Cup with it. And then I realized, okay, so this is not reality. It's a curated version of reality that in some sense is trying to please me or retain me in some way. And of course, once the advertising model kicks in, retention becomes the great incentive.
It's called sycophancy
Yeah, I just learned that word
Starting point is 01:57:40 It's like agreeableness at scale It's like golden retriever ander Like kissing your ass It's a professional kissing your ass The yes man What do you think of these AIC CEOs Because it feels like they're in a bit of a race Where if they don't do it
Starting point is 01:57:55 Then a national rival's going to do it If a national rival doesn't take them out China's going to do it And this is we kind of sort of with social media how can they stop? Because if they stop, they might say that there's an existential risk. There is like a build the plane as you're flying it.
Starting point is 01:58:14 And you know that I'm a fan of this show and I actively listen to it. I've told you this many times. I think you had said in one of your episodes, right, that you have a friend who is very close to an AI founder. I said this, yes.
Starting point is 01:58:29 Yeah, and in public the founder says all the right things. And then behind closed doors, it's altogether different. It's a horrifying thing. And I said this in a clip that went viral, and people have been trying to hazard a guess who it was. I shouldn't say who it was because it's Chinese whispers at the end of the day. It's someone that I'm very good friends with, who is verified, spends time with one of the biggest founders of an AI company in the world.
Starting point is 01:58:50 And he was with him two weeks ago again. And he said to me that they're very aware that there's a small existential risk for humanity. It's not small. Publicly they say it's small. Privately they say it's big. I mean, but even if it was 1%... It's a lot more than 1%, they say. But I'm saying even if it was 0.1%, if there was anything that I was doing in my life where there was a 0.1% chance that I might wipe out everybody,
Starting point is 01:59:18 I would immediately stop doing that thing. Yeah. But these numbers are much bigger. I'm hearing 7%, 20%, 25%, depending on who you ask, and I think acceleration in this direction increases that percentage. What do you think of these people? Like, what's going on here? Let's start with the collective action problems, because each company is competing with the other companies, and so they feel like they have to go faster. And we know that OpenAI has pushed some products out before they did safety testing because they had to get to market by a certain date. So just the normal business environment puts them all in a collective action problem against each other.
Starting point is 01:59:55 And then they all say we're in a collective action problem against China, because if we don't do this, then China will. Now, one thing I learned, again, I don't know if Tristan said this on your podcast or whether it was on his podcast, is that China is focused on using AI to make its economy more efficient, to make manufacturing better and cheaper. They are using these applications, which we've talked about before. Like, there's lots of great applications of AI. The Chinese also have so many spies in America and in the tech companies, and they can hack into anything. So the point is, our companies are in a headlong race to create AGI, to create a country of geniuses that can replace all human workers, put
Starting point is 02:00:34 us all out of work and run everything. They're in a race to create that. And one of the arguments is if we don't do it, China will. But what I understood from Tristan and from this conversation with you is that the faster we go towards AGI, the faster China goes, because they just take all our discoveries. So can't we slow down on the race to AGI and do more safety testing? You know, we all saw with Moltbook, you know, communities of agents who are talking to each other and making up languages. And even if part of that was human-driven now, in a year, it's going to be much more than what we saw. So I think the risks are extraordinary. I think that some of these guys, look, they've been in AI for a long
Starting point is 02:01:14 time. They might not have realized the existential risk they were putting us all in 10, 15 years ago, and now they can't stop. They can't pull the plug. They can't say, oh, let's shut down the whole business. So it is a very, very risky time. And I think Dario Amadai just read his long essay on the adolescence of technology. At least you get the feeling he's really wrestling. with it. And I think he's more open than some of the others. But I don't know. But when has morality ever been top of mind for a tech leader? You might be thinking, if there's 0.1-1-1-1-% chance, I'm not going to do it. That's what I think is a doctor. That's what you think is a social scientist. But we're not AI leaders, right? Yeah. It's one of the great question marks.
Starting point is 02:01:57 I just can't seem to get an answer to it. And then you've got this whole robotics thing happening, where Elon's got his Optimus robots and there's going to be a billion, he says there's going to be 10 billion of them at one point, but I think his pay packet requires a million of them to be out in the world. For him to make a trillion dollars, yeah. Yeah, and I just say, yeah, AI, robotics,
Starting point is 02:02:15 you combine the two. Yeah, you get Terminator. Yeah. We laugh, but it's like, should we stop for a second and maybe have a conversation about this? Can we? Yeah.
Starting point is 02:02:29 With commercial incentives at play, it does feel like... I don't feel hopeful. Yeah. It's very hard to know how to stop it. But I want to just add one point on here, which we've touched around a few times, and what will really bring it home here is the loss of the sense of meaning or purpose that many people are feeling, but especially young people. The saddest graph in The Anxious Generation, all the graphs look the same. It's all a hockey stick.
Starting point is 02:02:55 It's all like nothing was happening, you know, 90s to 2010, 2011, and then all of a sudden, it happens. And the saddest one is the one: my life feels meaningless. Do you agree with that, disagree with it? And the percent that agree, I think it's, you know, something like eight or nine percent agreed for the millennial generation. I think it's in chapter seven, the end of chapter seven. And then it's sort of fairly flat. And then all of a sudden, we hit this period, the great rewiring, 2010 to 2015. So right around 2013, it goes way, way up. Young people feel useless. And I think the reason is that they are useless. What I mean is, people need to feel useful.
Starting point is 02:03:32 People need to do things for other people. That's how you feel useful. If you were to disappear, would the world change? If yes, you're useful. Are people depending on you for something? If yes, you're useful. So if kids are doing errands for the family, they're useful. But as childhood changed from a mix of things to just consuming content,
Starting point is 02:03:52 if that's all you do, and five hours a day is the average for social media, eight to ten on devices, not counting school, if all you're doing is just consuming content, you are useless. Now, what's happening to the chance to have a job where you actually do something for people? You know, it used to be if you worked in a store, at least you're helping people buy something and you might talk to them. And now you're just there watching as they use the machine. The more technology makes things easy and cheap by replacing people, the more people will feel, I don't have a job, it's just consume content.
Starting point is 02:04:25 The AI guys tell us, oh, my God, it's going to be such abundance. No one will have to work. We'll give everyone UBI. We'll give everybody, you know, universal basic income. That is hell on earth. What's going to happen? Certainly for most of the boys, it's just going to be video games, porn, and gambling. So if you simply give people money to do nothing, you guarantee they're going to feel useless. And then the suicide rate will continue to go up. So this is the world that the AI guys are taking us to, a world in which there's nothing left for people to do. They say that they will give up some of their trillions and somehow let it be taxed or diverted as UBI, but that's never happened before. So it's not likely to happen in this case. So again, I don't know what to do, but we've got to start showing that we can do something, and we've got to be talking about this. And we can't be welcoming AI in everywhere. We've got to be wary and vigilant. Yes, there are some uses, but Silicon Valley has tricked us so many times and enshittified so many of the apps that we use,
Starting point is 02:05:27 we have to expect that the same is going to happen with our beloved chatbots and our beloved ChatGPT. This graph on page 195 of your book, which is titled, Life Often Feels Meaningless, and it's the graph you mentioned, I'll throw it up on the screen, is shocking. It's shocking just to look at. Suddenly there's this huge spike in meaninglessness amongst high school seniors. What is it to live a meaningful life? What does that mean? Yeah. So my first book, The Happiness Hypothesis, addresses that question very directly. And the first hypothesis you might have about happiness is it comes from getting what you want. You know, you set out on a goal, you get your goal, you're happy. It's very short-lived. You're happy very briefly, and then you run to the next thing. The more sophisticated happiness hypothesis is that
Starting point is 02:06:18 happiness comes from within. And this is what the ancients tell us, East and West, Buddhist, Stoic. Don't try to make the world conform. You change yourself. Accept the world the way it is. That's better. But the conclusion I came to as a modern social psychologist working in positive psychology was that the best way to say it is that happiness comes from between. What I mean by that is humans evolved as almost hiveish creatures. We evolved in intensely social groups, never being alone, lots of gossip, lots of conflict,
Starting point is 02:06:51 always intensely social. And modernity has made it possible for us to not live that way. We've come apart. There are many advantages to that, but we feel we're missing something. We're lonely. We feel something is not right. And so the conclusion I came to is that happiness comes, a sense of a full, satisfying, meaningful life comes when you get three betweens, right? The relationship between yourself and others, love broadly speaking, not just romantic, but friends, family, yourself and your work, that as humans need to be,
Starting point is 02:07:26 productive. We need to be doing something that matters, that affects other people, and the relationship between you and something larger than yourself. We need to be part of something that endures, that part of a tradition, part of we can look to the, look to a future. What I do matters for this group or this mission or me as an academic. I feel like I'm connected all the way back to Plato, and I hope all the way forward in time to future, future psychologists and future scholars. So if you get those three right, then you will be as happy as you can be. You'll be as happy as your genes and childhood allow you to be. And when you put it that way, what we can see is social media and AI interfere with all three. So relationship between yourself and others, well, you know,
Starting point is 02:08:13 social media gives you lots and lots of shallow relationships, which blocks out the real ones. You don't have time for real people. So the technology is blocking the relationship between ourselves and others and taking it over. Yourself and your work: work is going to be taken over by the machines, and it's already becoming more soulless and isolated. And then yourself and something larger than yourself: humans have to live in a moral matrix. We co-create a set of meanings and traditions.
Starting point is 02:08:38 We need a sense of history, of who we are, where we came from. All that's getting shredded. Everything is just little bits. People don't read books. Imagine if all of the accumulated wisdom of humanity in books is just gone, just gone. Young people are not reading books.
Starting point is 02:08:55 It's very hard for them to read a book now because of the attention issue. So if we lose a sense of history, if we lose an ability to co-construct reality, then it'll be hard to imagine anything that we're connected to larger than ourselves. So I am a techno-determinist in the sense that I think the tech, it doesn't determine everything, but you have to start with the technology, because that changes the ground upon which we live, the zone in which we're trying to construct meaningful lives. Start with that. And then you can see what the obstacles are.
Starting point is 02:09:24 And that's why I take a much more intemperate approach, I guess. I'll accept the word. I love it. Because we don't have much time here. We have to reclaim life in the real world for our kids and for ourselves. There is no way to find a happy, meaningful life if we make the full transition to the online AI robot world. And what in your perspective is a meaningful life? And how does it differ from Jonathan's?
Starting point is 02:09:50 I loved Jonathan's description. It was so beautiful. I have given a prescription to patients of what creates a meaningful life, and it is to live a lifetime in a day. And so that sounds like this big thing. But all it is is that, you know, when you start your day, think about five things, five things that you can do in your day to create an arc of a long and meaningful life in one day. So what does that mean? Spend a little bit of time in childhood. So in wonder and play, even if it's for a few minutes, do something that brings you joy for joy's sake. Spend a little bit of time in work. We all know what that is. And for most of us, it's a lot of time, but, you know, it doesn't have to be paid work, just something that helps you feel a sense of productivity and agency, that I can do difficult things and I can overcome. Spend a few minutes in solitude. Very important for all of the reasons that we've talked about today. Spend some time in community. So engaging with others. And then spend some time in retirement or in reflection, really taking stock of your day. So at the end of the day,
Starting point is 02:10:56 when you're going to bed and you're putting your head on your pillow, you can say, okay, yes, I lived a meaningful life. I did all of those things. And so if you do a little bit of that every day, you can make a difference. And the reason I give that prescription is because I've had patients who, you know, are guitar players, right? So people who love playing the guitar. And they don't play the guitar all week. And they'll say to me, I don't see patients currently, but they've said to me, oh, you know, Doc. I said, what do you like to do for fun? Oh, I like playing guitar, but I don't play it. When do you play? I don't know, once a month, once every three months. And I'm like, do you have a guitar at home? I have a guitar at home. Too much happening, work and family life, et cetera.
Starting point is 02:11:32 So then I said, well, why don't you just play the guitar a little bit every day? You know, because it's that all-or-nothing fallacy. It's like, if I don't have an hour to play guitar, I'm not going to do it. But the joy that it can bring you, that meaning and purpose, it's tremendous. So I think, you know, that's what I use, live a lifetime in a day. And the reason is because, when you look at how your brain and body react to happiness, there are two distinct types of happiness. There's hedonic happiness and eudaimonic happiness. Hedonic happiness is all about what we've talked about: social media, consumption, pleasure. And the other type is eudaimonic happiness: meaning, purpose, connection,
Starting point is 02:12:12 community, growth-oriented activities. And so when you live a lifetime in a day, you go towards that eudaimonia, which can then help you overcome that hedonic side. Because in your brain, there's something called the hedonic treadmill. And the treadmill is a thing in your brain where no matter what you do, this is like the Instagram lifestyle, right? No matter what happens, you need more of it. You need more of it.
Starting point is 02:12:35 Same thing with brain rot. And that is because you can never get enough. And it's the hedonic treadmill. But you do not have a treadmill for eudaimonic happiness. That is really beautiful. I've never heard an approach like that, but it sort of gives you a bigger view of your day, live a lifetime in a day.
Starting point is 02:12:53 If I was going to offer some specific advice, first I'll offer advice to parents. Here's the rule. So I did a really good job keeping my kids off social media, but I didn't pay enough attention to computers and everything else because it was during COVID. The rule I wish I had followed, I recommend to all parents, especially with younger children,
Starting point is 02:13:09 is have the clear rule. No devices in the bedroom, no screens in the bedroom ever. That's just our family rule. We have a TV in the living room. We have a computer. You can sometimes use those. But we never take screens into the bedroom, at least for kids. You know, maybe later on, you'll have to relent in middle school. They'll have so much homework they can take the laptop in. And maybe if you live in a small apartment, of course, it's difficult. But if you can afford to do that to have that rule, that's the main rule I wish I had done in my family. And that will make everything a lot easier. Also, same thing at the dinner table. No device. We don't have have screens at the dinner table. So that's a specific thing for parents to do. For everyone else,
Starting point is 02:13:46 for everyone, for just all adults, the advice is you have to reclaim your attention because your attention has been largely taken from you, at least a lot of it has. You have to reclaim it. And here are the three things that I do with my students. And you can do it very quickly and I can just explain it. The first is you have to get your morning and evening routine right. The great majority, as soon as they open their eyes, they're on their phone, and it's the last thing, and it's everything in between. So you have to have a good morning routine. What are the first seven things you want to do after you open your eyes? And at a certain point, you can check your phone, but it shouldn't be in the first view. Do things to set up your own day. Otherwise,
Starting point is 02:14:24 your day will be taken by your phone. It'll be controlled by your phone. So you've got to reclaim your morning and your evening. And step one. Step two, you have to shut off almost all notifications, go into your notifications, look into your settings, what's giving you all the notifications. Most of my students get an alert every time they get an email. They don't understand that they have because they don't want to miss anything, but they don't understand that if you are always being alerted, then you miss everything else. So shut off alerts for almost everything. Obviously, Uber and Lyft you want to keep on. You want to know when the car is coming. But news outlets, everything else. Get a daily email. Don't get alerts when. And then the third, as I said before, is get rid of all
Starting point is 02:15:03 the slot machine apps. Whatever apps you habitually use, whatever apps you feel compulsion towards, you have to get it off your phone. And in that way, your phone is no longer a dopamine trigger that's going to always call out to you like an addictive product. Do those three things you'll reclaim a lot of your attention. I would add stop breathe B that you enter. Stop, breathe B. It's a three second brain reset. So before you check your devices, before you engage, stop, breathe and B. Ground yourself in the present moment, what it does. is it decreases that what if future focused thinking, you know, anxiety is a future focused emotion and it gets you back into the here and the now. And so maybe the compulsion, you know,
Starting point is 02:15:43 you're bored, you're checking. What about doing something else? You know, we often use that checking as a substitute for many things. And so it gives you that opportunity. And then the rule of two is something that we haven't talked about, which I would love to propose today: your brain can really only handle two new changes at a time. And so of all of the things that we've talked about that you want to try in your life, give yourself two at a time. Give yourself eight weeks and then add two more and two more. This is why New Year's resolutions fail, because we try to do everything all at once. And so just stepwise, two at a time.
Starting point is 02:16:19 Jonathan, you've just written this book, which is now out, called The Amazing Generation. And it's got beautiful, beautiful illustrations. I'm assuming this one is for slightly younger audiences. It's for ages 8 to 13, yes. And who should buy this and who should they buy it for? It turns out that kids 8 through 80 actually love it. Even adults. They're buying it for their kids, but it kind of lays out the basic ideas of The Anxious Generation
Starting point is 02:16:45 and explains dopamine, explains the business model. But it doesn't in a really fun way. And it's working beyond our wildest dreams. If you look at the Amazon reviews, it's full of parents who said, I left it on the kitchen table. My kids came home. They grabbed it. They fought over it. They each read it in the first couple days. And then they said, Mom, when I go to middle school, I don't want a smartphone. Just give me a flip phone. Give me a basic phone. Because the book is
Starting point is 02:17:10 about how to be a rebel. It's about how to reject this control that the company is trying to put on you and how to live a life that you choose full of real freedom, friendship, and fun. And also the Five Resets, which is a book we talked about before on the show, rewire your brain and body for less stress and more resilience. Another smash hit bestseller that everybody's been talking about. Who's it for? It is for anyone who is struggling with stress, overwhelm, and burnout. It's to help you feel a sense of calm and clarity in this anxious, uncertain world.
Starting point is 02:17:43 Everything is free. So that's something that's really important to me as a doctor. Every suggestion I ever offer will always be cost-free because I think about patients with varying resources. it's all science-backed and it's totally practical. You don't have to go to Bali and have a sabbatical. You can rewire your brain today right now in the midst of all of this chaos. Thank you to both of you. I've learned so much and I really, really mean that.
Starting point is 02:18:08 Like I feel sufficiently pushed to make change in my life. And I need to go think about this because I am most certainly struggling with my addiction to my phone. And I can feel it hurting my relationships, especially now as a fiancé. My girlfriend, my fiancée, talks to me about it all the time. And I want to be present. I want to be present for my kids when I have kids. And I'm slightly concerned right now that I won't be unless I take some kind of drastic action in the direction of getting my attention back and reclaiming it. Thank you so much for the work that both of you do.
Starting point is 02:18:39 I can't say it enough because it's so important. And you've reached so many millions of people and you're both changing the world in a way that my words would not be able to capture. But just thank you. And please keep going. And if there's anything more that I can do to support both of your causes, please let me know what they are. And on behalf of the many millions of people that are with us right now,
Starting point is 02:18:58 thank you so much for saving our children. Thank you, Stephen. Thank you for giving the world so many opportunities to accommodate and create new mental structures. It's always such a pleasure to join you, Stephen, and truly I feel like you are changing the world as well. Thank you. We're done. Thank you. Got back from Davos in Switzerland,
Starting point is 02:19:43 the snowy village where some of the world's leading experts, CEOs, founders, world leaders gather in this one space. And while I was there, my colleague Juan, was telling me about something he does, which many of my friends do. They list their properties when they go away on Airbnb. So many of us, when we go away, we leave our house as this dormant asset that's doing nothing for us other than racking up bills. And as some of you might know, Airbnb are one of our show partners. And I've stayed in their properties all over the world and continue to do so. But I've never actually hosted one of my properties on there. But when I heard this, it got me thinking, what a smart move it is to make money from an asset that's currently
Starting point is 02:20:20 probably costing you money. Every time you're away, your home sits empty. And what Juan told me is how easy it was to get set up. He makes his home available for specific dates so that his guests always depart the day before he gets home. So if you're trying to find an easy way to make some extra money on the side, hosting on Airbnb might be exactly that, especially if you move around a lot. Your home might be worth more than you think. And you can find out how much your home is worth by going to Airbnb.ca slash host.
