The Daily - Introducing 'Rabbit Hole'

Episode Date: April 17, 2020

What is the internet doing to us? Today, we’re sharing the first episode of a new Times audio series called “Rabbit Hole.” In the episode, “Wonderland,” we hear from a young man named Caleb, who finds escape and direction on the internet. We follow his journey into the YouTube universe. “Rabbit Hole,” a New York Times audio series with tech columnist Kevin Roose, explores what happens when our lives move online. You can find more information about it here.

Transcript
Starting point is 00:00:00 Earlier today on The Daily, we played the introduction of our new series, Rabbit Hole. Now, Episode 1. Take a spin, now you're in with the techno set. You're going surfing on the internet. It's another day in the life of the Jamesons, maybe a family a little bit like yours. Except it's not really just another day. Today's the day I'm taking my family surfing around the world on the internet. It's cool, Dad's finally installed the internet on our home computer.
Starting point is 00:00:41 Now I can surf the net anytime I want. Test, test, test. Is this a state line? I just saw the West Virginia Welcome Center. Wild, wonderful West Virginia with Jim Justice. Well, the governor's name is Jim Justice. I feel like the campaign signs, like, make themselves, you know. Okay, so
Starting point is 00:01:13 just, like, tell me the story and I'll poke you along the way. Okay. This is our exit here. So, last year, you and I hopped in a car. Can you just, in the time we have here, like, can you just tell me, like, what are you up to today? And what are you hoping to find out? To meet this guy.
Starting point is 00:01:31 His story is really interesting. So he says that he was radicalized through YouTube videos and spent, you know, several years becoming progressively more extreme in his politics. After this shooting in New Zealand last March, there was a lot of talk about online radicalization. The shooter was a white nationalist who clearly spent a lot of time in far-right internet communities. I've been looking for a good case study of how it happens to just one person,
Starting point is 00:02:03 what that path looks like. And I kept hearing over and over again from sources I was talking to in this world, like, you have to look at YouTube. There are elements of it that seem very familiar to me and some that don't seem so familiar. So we're going to have him walk us through it. I just missed our turn.
Starting point is 00:02:21 I was so excited about this that I missed my turn. So we drove down to West Virginia, where he lives. Hey, how's it going? All right, great. He's standing right there. He's on his headphones. Big smile. Nice to meet you guys.
Starting point is 00:02:36 Hi, Andy. This is Andy. Andy? I'm going to stick a microphone around you. That's fine. He's got a Gorillaz t-shirt on. Hey. He says hi and brings us into his friend's house. He's got a Gorillaz t-shirt on. Hey. He says hi and brings us into
Starting point is 00:02:48 his friend's house. Hi, I'm Kevin. Nice to meet you. Thank you for letting us crash your house here. And the first thing he does is... I'm gonna just set this over here. Can you describe what it is that you're setting down? A Glock 43. Pulls out his gun. Is that the first time
Starting point is 00:03:04 you've had a gun? I've liked firearms my whole life. I'm not against firearms, but yeah, that's... I've always been like, well, I don't really need one. But then the day after I got death threats, went out and bought one. I'll probably never have to use it. These guys usually just send SWAT teams to your house and shit like that,
Starting point is 00:03:20 but, you know, it's just to be safe. Where should we sit? Where's the best place for us to... Eventually, we go over, we sit on the couch. Give me a little test here. Oh, sure. Testing. One, two, three. And he starts telling what begins as, like, a pretty relatable and familiar story. So, can you just start, like, tell us your name and how old you are?
Starting point is 00:03:42 Yeah, my name's Caleb Kane. I'm 26 years old. I was born in Florida, but I grew up here in West Virginia. When did you move to West Virginia? My mother had me with some man that I never met, and then she, like, immediately left Florida, I guess. I don't know all the details there. He had kind of a rough childhood, as he describes it. Got raised by my grandparents.
Starting point is 00:04:03 Didn't really have a lot of friends. What were you like as a kid? Um, really shy and nerdy. Picked on on the bus. Didn't really feel like you fit in. Going to high school was hell. I hated it.
Starting point is 00:04:14 I saw everybody's conformance. Like a lot of teenagers, he got really into video games. So Zelda was a big one, Donkey Kong. And then he discovered... Freshman year of high school was whenever I got high-speed internet. The internet. I don't know what I would have done without the internet. This thing is awesome! It was like an escape.
Starting point is 00:04:30 And the internet is a revolution for him because finally... That's when I started playing a lot of online video games. Oh, shit, in the window of the house! We're loading! He finds people who are like him. People all over the country, all over the world. Oh, that was so good, man. Oh, nice.
Starting point is 00:04:46 I met a lot of friends. They grow up so quickly. And he develops this new routine where, like, every day, he comes home from school, gets on his computer. All right, this is Call of Duty 2. Plays a bunch of video games. And then, later at night, I'd turn on YouTube. He goes on YouTube.
Starting point is 00:05:07 And YouTube at the time, it was mostly viral videos and comedy sketches. This was like, you know, early, early YouTube. Stay up all night and watch, like... And bust up laughing.
Starting point is 00:05:32 If Caleb didn't feel like he fit in in high school, in his sort of physical surroundings... And the categories are potent, potent. You wouldn't know it, but I like people. YouTube was the place that he felt most at home. I like people, but I like them in short bursts. You're speaking to me very deeply here. Yeah, exactly.
Starting point is 00:05:54 I experienced this too. I was not cool in high school, and I remember the internet kind of being a place that you would go to escape. Yeah. And it was sort of nicer than your real life in some cases. You're right, it was nicer back then. Were you political at the time? Political in, like, a very surface-level sense, right?
Starting point is 00:06:16 Like, anti-authority and, like, you know, most of my politics as a teenager came from, like, Dead Kennedys and Michael Moore documentaries. And that influenced me a lot. I mean, I wish that CNN and the other mainstream media would just for once tell the truth about what's going on in this country. So there was very much that punk rock influence inside of me.
Starting point is 00:06:43 He also got really into these new atheist videos. When I say that I think religion poisons everything. I remember watching like Christopher Hitchens on YouTube. I mean to say it infects us in our most basic integrity. You'd get old uploads. For me, what matters is the truth. Of like a Richard Dawkins speech. There is nothing special about the Bible.
Starting point is 00:07:05 Oh, I remember these videos. They felt like kind of scandalous at the time. Right. They felt subversive. They felt like watching people say the uncomfortable thing. God is good and loving and just, and he wanted to guide us morally with a book. Why give us a book that supports slavery? By today's standards, obviously, like this is extremely tame. But at the
Starting point is 00:07:26 time, like this was pretty edgy stuff. So this would have been like early Obama years, right? Yeah, early Obama years. Did you like Obama or what did you feel about him? I liked him. I didn't know much about him because I didn't look into actual politics. So I didn't know what he was doing. But yeah, I liked him. I thought, yeah, we have a black president. That's cool. Like, you know, first black president making progress. Graduated 2011, went to college. And then he goes off to college. And college like just doesn't really take for him. I wanted to go and do like an environmental major. And I didn't have a good time.
Starting point is 00:08:08 Most of the time I'd stay in my room. Even on nice days when people were out on the quad throwing Frisbees, I'd sit in my room and play video games. And then there was also this embarrassing moment where I got in a fight with this kid on campus and kind of got laughed off of campus. And that night I freaking left because I wasn't going to class
Starting point is 00:08:25 anyway. And I just withdrew from my classes and I left and I came back to West Virginia. So he moves back in with his grandparents and there like he doesn't have much to do. He doesn't have a job. He doesn't really have a direction. Just me in a room in a bed. He spent a lot of time. Slept a lot. Feeling depressed. Sit in my basement and just be like so freaking down. At one point, he even loses his gaming computer. My gaming computer got stolen, by the way, which really fucking drove me up a wall. And so now I didn't even have video games. Now I have a crappy little computer that can hardly run anything.
Starting point is 00:08:59 But it can run YouTube. Whoa, that's a full rainbow. And so now he's in his early 20s. He's living at his grandparents' house. And he feels like at this time in his life when he should be, like, finding his way, he should be starting a career and thinking about having a family.
Starting point is 00:09:19 Instead, he's just watching YouTube. All we know is how little we know. What does the fox say? And then... The human brain is a network of approximately 100 billion neurons. I found a documentary called God is in the Neurons. When we grow up, our moral and ethical compass
Starting point is 00:09:41 is almost entirely forged by our environment. About cognitive dissonance and how you can fall into patterns of behavior. And since you have neuroplasticity, you can get out of those patterns of behavior. When we are self-aware, we can alter misplaced emotions because we control the thoughts that cause them. So I got in this mindset of, oh, my brain is just like this tool that I can shape into whatever I want. And then he stumbles into this emerging wing of YouTube.
Starting point is 00:10:08 And so I started just going through self-help content. Self-help videos. What dream or vision do you want to turn into reality? All of our success and failure in life comes from little decisions. Cheesy stuff, to be honest with you. The process of conditioning ourselves actually feels incredible. Like Tony Robbins and stuff. Every mind possesses the potential to be utterly free. All the Zen Buddhism stuff. We want to change our consciousness. They're like people with advice specifically for
Starting point is 00:10:36 guys like Caleb. You continue to do those same behaviors that keep you from making the change. And then... To be truly free is both very easy and very hard. I found Steph. But we can only be kept in the cages we refuse to see. So, Stefan Molyneux is this Canadian libertarian, formerly a historian and an entrepreneur, and then he sort of became like a podcaster guy.
Starting point is 00:11:10 Good morning, everybody. It's Stefan Molyneux from Freedomain Radio. I hope that you're doing very well. Steph just was in the sidebar one day, and I clicked on it. You really have to open the often iron-bound doors of your heart. When YouTube sort of early in its life removed its 15-minute limit on how long videos could be, he just started pumping out hour-long, two-hour-long shows called Freedomain Radio. The approach that we take at Freedomain Radio, the sort of philosophical Socratic approach that we take, can be very, very helpful for you.
Starting point is 00:11:39 Where he would expound on philosophical ideas. Philosophy is the all-discipline. It covers everything. And that's why, to me, it is the most exciting and fundamental. And Stefan Molyneux is telling him things that, you know, make him feel better. He's saying this depression that you're going through, it's not permanent. Things will get better. And that a lot of the disillusionment and pain that young men like Caleb are facing
Starting point is 00:12:02 is not actually their fault. From the perspective of a young man, to take a brief look at society... It's the fault of society. I mean, you get, of course, on-demand pornography. You get video games that are unbelievably realistic, absorbing, and addictive. And what else do they have to look forward to?
Starting point is 00:12:20 Well, they can get themselves involved in higher education and graduate an average of $25,000 in debt to a job market that is pretty stagnant or declining. You've seen real wages... He finds a lot of what he's saying pretty sensible. College students have a damn right to be depressed. Their society is unsustainable because nobody's asking the fundamental questions
Starting point is 00:12:39 about why the society is the way it is, why things are so bad. I was like, yeah, yeah, that's true. And he's not just talking at people. He invites people to send him questions. I have had a number of requests to do a podcast on how to meet a nice girl. He invites them into his life. He would talk about how he grew up. I was interested in morality from a very early age.
Starting point is 00:13:05 I was physically, emotionally, and mentally abused. And he talked about how he was so much better now. I have consistently said, if you have problems with your parents, talk to a therapist. And he had gone to therapy, and his life had improved. Before we met, Christina had spent quite some time working on herself. And he had a wife who would come on stream with him. He has his wife on his channel and they talk about their life together. Talks about how much he loves his daughter.
Starting point is 00:13:44 I was like, I want all that stuff. I want a family like that. Because that's what I wanted my whole life. I just wanted a stable family. And I thought, well, if I just keep watching more and more, I'll be like Steph. Hi, everybody. Stefan Molyneux from Freedomain Radio. Hi, everybody.
Starting point is 00:13:57 This is Stefan Molyneux. Hi, everybody. It's Stefan Molyneux. Hi, everybody. It's Stefan Molyneux from Free Domain Radio. I hope you're doing very well. And for Caleb. You gotta happen happiness without responsibility.
Starting point is 00:14:05 Stefan Molyneux, his voice and his videos become a source of stability for him. There are very specific things that people need to do to be happy. And Caleb says that all this stuff that he's watching on YouTube, like, it's actually helping. Started working at Dairy Queen. He gets a new job. And then I started, like, getting over a lot of my social anxiety
Starting point is 00:14:25 because now I'm forced to interact with people and I'm also hanging out with all these high school kids. He starts to feel like things are picking up for him, finally. And after that,
Starting point is 00:14:35 I mean, it was just more and more of that. So the problem is not that you don't know how to think. I was pretty much always on YouTube. He's watching not only, like, the self-help stuff from Stefan Molyneux.
Starting point is 00:14:50 The Great Pyramid at Giza. He starts getting pretty into Joe Rogan. Joe Rogan, he is the man. The biggest podcast, YouTube talk show guy that there is on the Internet. Thank you very much for coming by, man. This is cool as fuck. But back then he was just starting to experiment with uploading his interviews to YouTube. Anthony Bourdain is with us, ladies and gentlemen.
Starting point is 00:15:13 Hi, everybody. He just keeps watching and watching and watching. If I wasn't at work, any single moment that I had, I was watching YouTube videos. This is episode 500. Probably, at that point, 10, 12, 13, 14 hours a day. Such individuals as bad people.
Starting point is 00:15:46 I sound like a crazy person, but that's what I would do. And when Caleb talks about watching YouTube videos during this period in his life, he talks about experiencing this as the sensation of falling. But the thing that he doesn't even really know to think about is that on the other side of his screen, there's a force that's pulling him in. And that force has to do with a French guy named Guillaume.
Starting point is 00:16:21 Guillaume, hello, this is Kevin. Hey, how are you? Doing well, how are you? Great. Who, even though they've never met... So, yeah, I think everything is ready, yeah. ...is a really important part of Caleb's story. I did a PhD in artificial intelligence, and then I worked at Microsoft. Guillaume Chaslot is a pretty smart guy.
Starting point is 00:16:57 You studied how to make robots, essentially. Exactly. Got his doctorate studying machine learning. And then in 2010, he got his dream job working at Google. And so when I joined Google, I actually didn't know which project I would be put on. And when he gets to Google, he gets this really interesting and exciting assignment. Yeah, it turned out they needed someone immediately on the AI of YouTube, so it was the perfect fit. And the project he's assigned to work on is ultimately what distinguishes YouTube from every other website on the Internet. And that has to do with the recommendations sidebar and the artificial intelligence that makes the whole thing work.
Starting point is 00:17:41 And did this seem to you like a big deal to be given a job at Google working on YouTube recommendations? Yeah, that was really amazing at first to realize that my work was going to be affecting so many people. So I thought it was going to be a good thing. I thought, okay, we can make artificial intelligence to make the world a better place. And when you got there, I assume there was some algorithm that was selecting videos for people. How did that algorithm work? Yeah, so initially when YouTube started, what was best was clicks.
Starting point is 00:18:25 The more people clicked on the videos, the better they thought it was. And then they realized that it led to too many clickbaits. So people would click on the title, then realize that the video was not at all about what was said in the title, and then they would leave the platform immediately. So that would be actually bad for YouTube. So then they switched their measure to
Starting point is 00:18:46 total watch time. And how was the goal of this algorithm explained to you? Like, what did you understand about what YouTube's executives wanted? The idea was to maximize watch time at all costs, to just make it grow as big as possible. It seems like a simple shift, but that shift has radical consequences. YouTube's watch time has gone up by 50%. I'm seeing accelerating usage growth. It produces these numbers that no one has ever seen before. YouTube now pulls in more than a billion dollars a quarter. So it's an incredible size of audience who are consuming ever more video content. It's not so much if you're watching YouTube,
Starting point is 00:19:29 it's how much. How are you feeling about your work at this point? I think we were so excited on working on this project that we didn't really question too much that watch time was a good metric. We were thinking, yeah, I mean, if people are watching longer, they might be happier about what they're watching. So at the time, I felt pretty good about this. Yeah. If I had a moment to stick earbuds in my ear... To reject how we are viewed by others... Any single moment that I wasn't talking to someone, I was consuming content.
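To make the metric shift concrete, here is a minimal sketch, in Python, of the two ranking objectives described above: ranking by predicted clicks versus ranking by expected watch time. The Video fields and the scoring rules are invented for illustration; this is not YouTube's actual code.

    from dataclasses import dataclass

    @dataclass
    class Video:
        title: str
        click_prob: float     # predicted chance the user clicks the thumbnail
        watch_minutes: float  # predicted minutes watched after a click

    def rank_by_clicks(videos):
        # Early objective: clickbait can win even if viewers leave right away.
        return sorted(videos, key=lambda v: v.click_prob, reverse=True)

    def rank_by_watch_time(videos):
        # Later objective: expected watch time = click probability * minutes watched.
        return sorted(videos, key=lambda v: v.click_prob * v.watch_minutes, reverse=True)

    videos = [
        Video("SHOCKING thumbnail!!!", click_prob=0.9, watch_minutes=0.5),
        Video("Two-hour philosophy talk", click_prob=0.2, watch_minutes=80.0),
    ]

    print([v.title for v in rank_by_clicks(videos)])      # clickbait first
    print([v.title for v in rank_by_watch_time(videos)])  # long video first

Under the first objective the clickbait wins; under the second, the two-hour video does, which is one way to see why hour-long talk shows suddenly became exactly what the system wanted to surface.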
Starting point is 00:20:06 Hey, everybody. This episode of the podcast is brought to you by... But then I realized there were some issues. People were noticing that you had a problem with maximizing just watch time, that it creates these filter bubbles. Say more about that. The way I explained it when I was at YouTube at the time was...
Starting point is 00:20:27 Say hi to everybody! When you watch a cat video, then the recommendation engine can say, oh, you watch the cat video, so we are going to give you another cat video, and then another cat video, and then another cat video. And then another cat video. More of the same.
Starting point is 00:20:48 More of the same. More of the same. And at the time, I was really worried about wasting human potential. If you could go on all of YouTube, but then the thing that's going to keep you watching the most is cats, is it the right thing to do to give you, again, cats on cats on cats? And over time, Guillaume realizes that this filter bubble problem he's been noticing, it's actually worse than everyone just watching the same cat videos over and over.
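The filter bubble Guillaume describes follows from a recommender whose only rule is "more of what kept you watching." Here is a toy sketch, again an invented illustration rather than the real YouTube engine:

    from collections import Counter

    def recommend_next(history, catalog):
        # Greedy "more of the same": always serve the user's most-watched topic.
        favorite = Counter(history).most_common(1)[0][0]
        return next(v for v in catalog if v["topic"] == favorite)

    catalog = [
        {"title": "Cat video #42", "topic": "cats"},
        {"title": "Protest clip, side A", "topic": "side-a"},
        {"title": "Protest clip, side B", "topic": "side-b"},
    ]

    history = ["cats"]
    for _ in range(3):
        video = recommend_next(history, catalog)
        print(video["title"])  # Cat video #42, three times in a row
        history.append(video["topic"])

A viewer who starts on one side of a news story gets only that side by the same greedy rule, which is exactly the dynamic described next.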
Starting point is 00:21:22 So at the time, there was a demonstration in Cairo. In Cairo, Egypt? Yes. Violence has erupted in the Egyptian capital Cairo as those... He sees these news videos, these political videos, like this conflict in Egypt that's going on at the time. And he sees that the algorithm is showing people in different groups the same thing over and over. And you would see a video from the side of the protesters, and then it would recommend another video from the side of protesters. So you would only see the side of protesters. You start with the side of the police, you would only see the side of the police. Then you had only one side of reality.
Starting point is 00:22:06 You couldn't see both sides. So these two different realities were created. Once you recognized that there were these filter bubbles, these sort of algorithmic echo chambers, what did you do about it? So the first thing I didn't want to do is complain about it and try to find like the bad example that shows what's wrong with it, because I didn't want to be the grumpy French guy who complains. So what I did is side projects, like I created with another engineer who's still at YouTube,
Starting point is 00:22:46 we created an algorithm that did the exact opposite. It got out of the filter bubble. And did any of these side projects have any impact at YouTube? Like, did they move into testing? Were they implemented? Did managers like them? No, they were always just prototypes, but then they were never even
Starting point is 00:23:08 tested on real users. And why do you think that is? I mean, the way they were saying it is that, okay, it's not our objective. And our objective was to increase watch time. So the problem of sort of political polarization
Starting point is 00:23:25 of giving people only one side of a story, like you're noticing this problem while you're at YouTube and it sounds like you were trying to address it through these side projects, but your bosses are not saying, Guillaume, that's the best idea we've ever heard. Let's put it live on the site right now. How did things unfold for you at YouTube from there?
Starting point is 00:23:47 Yeah, so when I proposed the third project to my manager, he told me if I were you, I wouldn't work on it too much. And then for a few months, I didn't work on it. And then when I started working on it again, then I got fired for bad performance review. Which is true because then I spent so much time on this project that I spent less time on my main project. So Guillaume left Google,
Starting point is 00:24:21 actually left Silicon Valley altogether and moved back to France. And he says that he kind of stopped thinking about the YouTube algorithm altogether. Until one day, like an old French romance, the two meet again. I was in a bus ride from Lyon to Paris. So it was a six-hour bus ride. I was working in Paris, but I had family in Lyon. So I was visiting my family and then coming back to work in Paris.
Starting point is 00:24:57 And there were these new buses with Wi-Fi. So I thought, OK, let's give it a try. So he's sitting there on the bus. He's on his laptop. He's doing some work. And he notices that on the screen next to him, my neighbor was watching YouTube videos for a very long time. The guy was just going from one recommended video to the next recommended video to the next to the next to the next.
Starting point is 00:25:18 Part of me was pretty proud that, well, I worked on this algorithm, so I helped him watch so much content. I was kind of curious which recommendations were so good that he was so captivated by YouTube. And then I saw that he was watching conspiracy theories about a secret plan to kill 2 billion people. So naturally, I tried to make a joke with him. Oh, who wants to kill us? To try to initiate a conversation.
Starting point is 00:25:53 And he told me, oh, there is this secret plan. Look at it because medias are not going to tell you about it. But if you look on YouTube, you'll find all the truth. Then we talked about the videos. I could debunk videos one by one, but I couldn't debunk the plot because he told me like, there are so many videos like that. It has to be true. It was pretty intense because I knew the numbers, so I knew that this was not just one person. It was millions of people who were in these situations. Hi, everybody.
Starting point is 00:26:36 It's Stefan Molyneux from Freedomain Radio. Let's dip into the listener mailbag with questions. So why don't we try to see how far we can go back in the search history. Like, down there? Yeah. Yep, yep. I remember this. Okay, now we're starting. This is a Jewish cultural system designed to market... Why would I want my nation to be populated by and reflect the culture of people who have come from all other places in the world?
