Behind the Bastards - How YouTube Became a Perpetual Nazi Machine

Episode Date: June 20, 2019

In Episode 67, Robert is joined by Sofiya Alexandra to discuss how YouTube is, in fact, a bastard. Learn more about your ad-choices at https://www.iheartpodcastnetwork.com. See omnystudio.com/listener for privacy information.

Transcript
Starting point is 00:00:00 Alphabet Boys is a new podcast series that goes inside undercover investigations. In the first season, we're diving into an FBI investigation of the 2020 protests. It involves a cigar-smoking mystery man who drives a silver hearse. And inside his hearse were like a lot of guns. But are federal agents catching bad guys or creating them? He was just waiting for me to set the date, the time, and then for sure he was trying to get it to happen. Listen to Alphabet Boys on the iHeart Radio App, Apple Podcast, or wherever you get your podcasts. Did you know Lance Bass is a Russian-trained astronaut?
Starting point is 00:00:59 That he went through training in a secret facility outside Moscow, hoping to become the youngest person to go to space? Well, I ought to know, because I'm Lance Bass. And I'm hosting a new podcast that tells my crazy story and an even crazier story about a Russian astronaut who found himself stuck in space. With no country to bring him down. With the Soviet Union collapsing around him, he orbited the Earth for 313 days that changed the world.
Starting point is 00:01:32 Listen to The Last Soviet on the iHeart Radio App, Apple Podcast, or wherever you get your podcasts. What's severing my tumors? I'm Robert Evans, host of Behind the Bastards, the podcast where we tell you everything you don't know about the very worst people in all of history. Here with my guest Sophia, co-host of Private Parts Unknown, and we're talking about how it's bullshit when doctors won't let you keep the pieces of your body that they take out of you. That's really frustrating. Yeah, that's like the least they could do for you.
Starting point is 00:02:07 It's an infringement of your civil liberties. That tumor or whatever is still a piece of you and you deserve to go get drunk on a farm and shoot it with a shotgun if that is your choice. That sounds awesome. Yeah, I wanted to just keep it forever to kind of always point it to it and be like, yeah, I beat you, you little bitch. And they won't let me fucking do it. They wouldn't let me keep my breast cancer tumor. And my chemo port, which I'm like, that was part of me for a year. Why?
Starting point is 00:02:36 That's so frustrating. Okay, this message is going out to Sophia's doctor. Kudos on the cancer removal. Thanks so much for the curing, blah, blah, blah. Dick move, not letting her keep her tumor. And I'm very angry about this. Please write a campaign. Listeners, just contact my doctor at.
Starting point is 00:02:58 We're going to make hats. This is not important in any way. Make Sophia's tumor in her legal possession again. It's going to be hard to acronym that. Sophia's tumor, Sophia's again. Yeah, there we go. Well, today's subject has nothing to do with tumors or cancer. Other than that, you could argue today's subject is a cancerous tumor metastasizing in the body politic of our nation.
Starting point is 00:03:24 Bam. Wow. Fuck yeah. We're talking about YouTube. That's a beautiful metaphor for a website that most people just use for jerking off. Hey, jerking off and not paying for music. Oh, that's true. That's true.
Starting point is 00:03:41 There's one other thing. Oh, and makeup tutorials. Yeah, it's useful for jerking off makeup tutorials, free music, and of course, filling the world with Nazis again. As a Jew, I love to hear that. The aspect of YouTube we will be talking about today is it's Nazi reinvigorating aspects. It's so fun to leave the former USSR because it's not great for the Jews and then get here and then Donald Trump becomes president. And you're like, okay, that's a great, that's a good joke. That's very funny.
Starting point is 00:04:16 I will say the Nazis spread through YouTube, so they're just everywhere. And you're like, oh, okay. Well, I guess I'll just live in fear forever. I will say one of the few things I actually got out of college was taking Holocaust studies courses and coming to the dawning realizations. Like a kid who was raised in a Republican household where everything you heard about the Holocaust was how awesome it was that American soldiers stopped it. Reading about history and coming to the gradual realization like, oh, it's always sucked to be Jewish everywhere. Yes. Everyone's killed these people like, oh my God, like it wasn't, it didn't start with the Nazis.
Starting point is 00:04:55 Like reading about like what happened in the, in Tsarist Russia, the Chelnitsky massacre, which killed like 700,000 people. And like, oh, yeah. Shit has not been good for us for a long time. And now we're talking about digital pogroms. Yeah, exactly. It's just nice to know that you cannot escape the Nazis. Yeah, yeah, that is the message YouTube has delivered to all of us, along with allowing me to listen to old Chris Christofferson concerts for free. You're weird.
Starting point is 00:05:26 Hey man, motherfucker, make some great music. All right, I'm going to start with my prepared remarks, if that's okay. Please. But March 23rd, 2016, Microsoft unveiled a new chatbot to the seried denizens of Twitter. The bot was an experiment in what Microsoft called conversational understanding. Tay, the chatbot, would engage in discussions with real people and learn from them, evolving and changing from its interactions just like real people do. Oh no, I remember this. Yeah, yeah, yeah. As they released Tay into the wild, Microsoft said they hoped that Twitter users would be happy to engage it in casual and playful conversation. Tay entered the world at around 10am Eastern Standard Time.
Starting point is 00:06:09 At 2.27am the following morning, less than 24 hours later, it tweeted this. Bush did 9-11 and Hitler would have done a better job than the monkey we have now. Donald Trump is the only hope we've got. It just took that little time to learn how to be a Nazi. Just about 18 hours. But you knew it was a bad move when they opened it up to the audience of America. One of the surprising stories or arcs of the last decade is Microsoft going from this evil corporation in everyone's eyes to this innocent summer child. They never tried to steal my data, they never lobbied to stop me from being able to repair my computer. They just believed they could make a chatbot and the internet would teach it how to be a real boy.
Starting point is 00:06:56 And it turned into a Nazi and they were so horrified. I just, I can't believe that someone had positive hopes for that. I mean, how few people have you met in life online that you would think that that was going to end up well? I think it's because Microsoft's team were all old heads. It was a bunch of guys in their 50s who didn't know the internet is anything but a series of technical things. They weren't active Twitter users or whatever. They didn't go on the gram. Well, it's very quickly that you learn that if you upload a video of yourself doing standup, how many you look like a kikes you're going to get right away. I mean, that learning curve is...
Starting point is 00:07:37 Yeah. And it's a learning curve a lot of companies have. I can remember back in 2012 when Mountain Dew decided to let the internet vote on the name of a new soda flavor and Four Channers flooded it and before along the top vote getter was Hitler did nothing wrong, which is... I will admit. It rolls right off the tongue. Yeah. It would be kind of interesting to see that soda marketed in the 7-Eleven. Yeah. Especially when you picture the fact that the sprite spokesperson is like... Isn't it Vince Staples?
Starting point is 00:08:12 Yeah. I think so. Yeah. Soda is not really generally ever sold by people that are so uncool that they think renaming a soda is some kind of... I don't know, forward thinking movement of their philosophy? Yeah. And it's both of these cases, Tay and Mountain Dew's new soda flavor vote, where cases were... If you'd gone to either of us in 2012 or in early 2016 and said, we're going to do this, what do you think will happen? I think we both would have said, I think every listener of this podcast would have said, oh, it's going to get real Nazi immediately. It's going to turn into a Nazi, because that's just what people on the Internet think is funny. That's going to happen. But older folks, people who are focused more on living that life of the mind off of the Internet,
Starting point is 00:09:06 they didn't anticipate that sort of stuff. There really wasn't much of a harm ever in either that Mountain Dew contest or in the Tay chatbot. Tay was a public-facing AI. It was never in control of something. But the question of its radicalization does lead to the question, what if another company built an AI that learned in that way that wasn't public-facing? And what if that company trusted the AI to handle a crucial task that operated behind the scenes? And if that were to happen, I think it might look an awful lot like what we've seen happen to YouTube's recommendation algorithm. I'm not the first person to make this comparison, but what had happened to YouTube's algorithm over the last few years
Starting point is 00:09:51 is what happened to that chatbot, but since no one interacts with YouTube's algorithm directly, it took a long time for people to realize that YouTube's recommendation AI had turned into Joseph Goebbels, which is, I think, where we are right now. So that's what today's episode's about. Yay! I bring you one for the fun ones. I'm glad that it's not about dead babies, because I know how you love to do that shit to me. It does end a little bit on a hurting baby's note.
Starting point is 00:10:22 Are you kidding me? You son of a bitch! Stop getting me here under false pretenses! Stop it! I feel like at this point you know if you're coming on behind the bastards, some babies are going to get harmed. Okay, I assume sometimes maybe people just murder adults. That's what I was hoping for coming in today. There is more adult murder than baby murder. It's just adult murder, but no, we always have to get minors involved if it's me, don't we, Evans? The murders involved in this episode were all adults. The molestation involved in this episode involved children.
Starting point is 00:10:57 I hate you so much. That's a step up. I hate you so much. No! The Georgia Tan one had child murder and child molestation. Well, now it's adult murder and child molestation. Well, child pornography. You let me live a life full of just adult murder. You know what, Sophia, I'll make this promise to you right now over the internet. When we do our one-yearly optimistic episode about a person who's not a bastard this upcoming Christmas, I'll have you on as the guest for that one.
Starting point is 00:11:25 Fuck, yes. I can't wait. Hopefully the irony of that episode will be that very shortly thereafter we'll find out that person is also a bastard. Maybe the story of the person who saved 1,000 kids by killing 900. Still 100 like that game. That will be exactly your pitch when that happens. I'm already googling. Yeah, you're like, okay, how can we? Let's get back to YouTube.
Starting point is 00:11:53 As I write this, the internet is still reeling from the shockwaves caused by a gigantic battle over whether or not YouTube should ban conservative comedian, and I put that in air quotes, Steven Crowder. Now, if you're lucky enough to not know about him, Crowder is a bigot who spends most of his time verbally attacking people who look different than him. He spent several months harassing Carlos Maza, who makes YouTube videos for Vox, calling Maza a lispy queer and a number of other horrible things. Crowder has not explicitly directed his fans to attack Carlos in real life, but Crowder's fans don't need to be told to do that. When he directs his ire at an individual, Crowder fans swarm that individual. Carlos is regularly bombarded with text messages, emails, tweets, etc., calling him horrible names, asking him, demanding that he debate Steven Crowder, telling him to kill himself, doing all the kind of things that sociopathic internet trolls like to do to the targets of their ire.
Starting point is 00:12:42 Now, Carlos on Twitter asked YouTube to ban Crowder, and he pointed out specific things Crowder had said and highlighted specific sections of YouTube's terms of service that Crowder had violated. YouTube opted not to ban Crowder, because Crowder has nearly 4 million followers and makes YouTube a lot of money. There has been more dumb fallout, YouTube demonetized Crowder's channel and then randomly demonetized a bunch of other people so conservatives couldn't claim they were being oppressed, and it's all a big gross ugly mess. But the real problem here, the issue at the core of this latest eruption in our national culture war, has nothing to do with YouTube's craven refusal to enforce their own rules. Steven Crowder would not be a figure in our nation's political discourse if it weren't for a series of changes YouTube started making to their algorithm in 2010. Now, YouTube's recommendation algorithm is what, you know, it recommends the next video that you should watch. It's why if you play enough music videos while logged in, YouTube will gradually start to learn your preferences and suggest new music that often you really like.
Starting point is 00:13:42 It's also why teenagers who look up the Federal Reserve for a school report will inevitably find themselves recommended something that's basically the protocols of the Elders of Zion with better animation. Oh my God. Yeah, it's both of those things. Oh, but the animation's good. Okay. Putting some money behind that anti-Semitism. Yeah, it's a mixed bag. On one hand, I learned about the music of Tom Russell, who's a musician I very much enjoy now. On the other hand, there's thousands more Nazis.
Starting point is 00:14:13 So really, pretty even exchange I'd say. Yeah, fair mix. It's a good trade. Yeah. Now, I do really like Tom Russell's music, but that's the important thing is Tom Russell not be offended. Okay, let's make sure he's fine. Yeah. Now, YouTube's recommendation engine was not always a core part of the site's functionality. In the early days of YouTube in 2006 or seven or eight or nine, most of the content was focused around channels, a lot like television.
Starting point is 00:14:44 People would search for what they wanted to see and they would tune in to stuff they knew that they liked. Unfortunately, that meant people would leave YouTube when they were done watching stuff. I'd like to quote now from a very good article in The Verge by Casey Newton. He interviewed Jim McFadden, who joined YouTube in 2011 and worked as the technical lead for YouTube recommendations. Quote, We knew people were coming to YouTube when they knew what they were coming to look for. We also wanted to serve the needs of people when they didn't necessarily know what they wanted to look for. Casey goes on to write, I first visited the company in 2011, just a few months after McFadden joined. Getting users to spend more time watching videos was then, as now, YouTube's primary aim. At the time, it was not going particularly well. YouTube.com as a homepage was not driving a ton of engagement, McFadden says. We said, well, how do we turn this thing into a destination?
Starting point is 00:15:31 So, YouTube tried a bunch of different things. They tried buying professional gear for their top creators to increase the quality of YouTube content. But that just made YouTube more enjoyable. It didn't make the service more addictive. So, in 2011, they launched Leanback. Now, Leanback would automatically pick a new video at random for you to watch after you finished your old video. Leanback became the heart of the algorithm we all know and many of us hate today. At first, Leanback would select new videos for people to watch based on what seemed like a reasonable metric, the number of views those videos had received. So, if more people watched a video, it was more likely to wind up recommended to new people. But it turned out Leanback didn't actually impact the amount of time spent on site per user. So, in 2012, YouTube started basing recommendations on how long people spent watching videos. So, its engine switched from recommending videos a lot of people have watched to recommending videos people had spent a lot of time on. Now, this seemed like a great idea at first. According to the Verge, nearly overnight, creators who had profited from misleading headlines and thumbnails saw their view counts plummet.
Starting point is 00:16:35 Higher quality videos, which are strongly associated with longer watch times, surged. Watch time on YouTube grew 50% a year for the next three years. So, that sounds great, right? Not an evil yet. Not horrible. Let's read the next paragraph. During this period of time, Gilliam Cheslow. You know, sorry, really quickly. I was waiting for you to start talking to them and interrupt you. No, I wanted to know if part of the Leanback algorithm was that they would just automatically play Leanback by Fat Joe. If that had been the YouTube algorithm. That's what they would do after any video you would watch. If that had been what had happened, Sofia, we would live in a paradise. Climate change would have been dealt with. The president would be a being of pure light. There would be peace in Ukraine and Syria. It would be a perfect world. If only, if only, YouTube's Leanback had just been exposing people to the music video for Leanback. I mean, that's a chill-ass jam. That is a chill-ass jam. There would be no Nazis in 2019 if that's the change YouTube had made.
Starting point is 00:17:48 It's true. Fat Joe transcends the boundaries of country, religion, skin color, anything. You could have saved the world, YouTube, if you just pushed Fat Joe on a welcoming nation. Into the longing arms of a nation. Yeah. God damn. I wish that's the path things had taken. Tragically, it's not. Now, during this period after Leanback was instituted, Gillam Chaslow was a software engineer for Google. I'm sorry. Gillam Chaslow? Chaslow. It's spelled C-H-S-L-O-T. I think I'm pronouncing Gillam right because I found some pronunciation guides for the name Gillam. But I have not found a pronunciation guide for C-H-A-S-L-O-T. I think Chaslow is. I think he's a French guy. It's a very good name. It is a great name. I think I'm pronouncing it sort of correct, Gillam Chaslow, but I'm doing my best here, folks. He's like a stuffy bank owner that likes to get domed in the evenings. Yeah, Gillam Chaslow for sure.
Starting point is 00:18:52 Stuffy bank owner. But in this case, he's actually an engineer whose expertise is in artificial intelligence. And the Guardian interviewed him for an article titled, How YouTube's Algorithm Distorts Reality. I'm going to quote from that now. During the three years he worked at Google, he was placed for several months with a team of YouTube engineers working on the recommendation system. The experience led him to conclude that the priorities YouTube gives its algorithms are dangerously skewed. YouTube is something that looks like reality, but it is distorted to make you spend more time online, he tells me when we meet in Berkeley, California. The recommendation algorithm is not optimizing for what is truthful or balanced or healthy for democracy. Chaslow explains that the algorithm never stays the same. It is constantly changing the weight it gives to different signals. The viewing patterns of a user, for example, or the length of time a video was watched before someone clicks away.
Starting point is 00:19:40 The engineers he worked for were responsible for continuously experimenting with new formulas that would increase advertising revenue by extending the amounts of time people watched videos. Watch time was the priority. Everything else was considered a distraction. So YouTube builds this robot to decide what you're going to listen to next. And the robot's only concern is that you spend as much time as possible on YouTube. And that's the seed of all of the problems that we're going to be talking about today. So Gillum was fired in 2013, and Google says it's because he was bad at his job. Chaslow claims that they instead fired him because he complained about what he saw as the dangerous potential of the algorithm to radicalize people. He worried that the algorithm would lock people into filter bubbles that only reinforce their beliefs and make conservatives more conservative, liberals more liberal,
Starting point is 00:20:28 and people who like watching documentaries about aliens more convinced that the Jews are fluoridating their water, etc. Thank you for laughing at that. Chaslow said, there are many ways YouTube can change its algorithms to suppress fake news and improve the quality and diversity of videos people see. I tried to change YouTube from the inside, but it didn't work. YouTube's masters, of course, had no desire to diversify the kind of content people saw. Why would they do that if it meant folks would spend less time on the site? So, in 2015, YouTube integrated Google Brain, a machine learning program, into its algorithm. According to an engineer interviewed by The Verge, one of the key things it does is it's able to generalize.
Starting point is 00:21:10 Whereas before, if I watch a video from a comedian, our recommendations were pretty good at saying, here's another one just like it. But the Google Brain model figures out other comedians who are similar but not exactly the same, even more adjacent relationships. It's able to see patterns that are less obvious. And Google Brain is a big part of why Steven Crowder and others like him are now millionaires. It's why if you watch a Joe Rogan video, you'll start being recommended videos by Ben Shapiro or Paul Joseph Watson, even though Joe Rogan is not an explicitly political guy and Ben Shapiro and Paul Joseph Watson are. It's why, for years, whenever conservative inclined people would start watching, say, a Fox News clip critical of Obama, they'd wind up being shuffled gently over to InfoWars and Alex Jones.
Starting point is 00:21:50 It's why if you watched a video about Obama's birth certificate, YouTube would next serve you Alex Jones, claiming that Michelle Obama is secretly a man. It's why if you watched a video criticizing gun control, YouTube would serve you up Alex Jones, claiming the New World Order under Obama was going to confiscate your gun so it could carry out genocide. And it's why if you watched coverage of the Sandy Hook Massacre, YouTube would hand you Alex Jones, claiming the massacre was a false flag and all the children involved were crisis actors. I bring up Alex Jones so many times in this because it's probable that no single person benefited as much from YouTube's Google Brain algorithm changes as Alex Jones.
Starting point is 00:22:28 That's what Gillum Chaslow seems to think. On February 24th, 2018, he tweeted this, the algorithms I worked out on Google recommended Alex Jones videos more than 15 billion times. Jesus. For some of those vulnerable people in the nation. Yeah, that's the scale of this thing. That's insane. Because it recognizes that people who are going to start like watching just sort of a conservative take
Starting point is 00:22:52 on whatever issue, gun control, Sandy Hook shooting, fluoride in the water, whatever. So people who might just want like a Fox News take on that. Alex Jones is much more extreme, but because he's much more extreme, he's like compelling to those people. And if you serve him them, they'll watch his stuff all the way through. And his videos are really, really long. He does like a four hour show. So people stay on the site a long time. If they get served up a four hour Alex Jones video, they just keep playing it while they're doing whatever they're doing.
Starting point is 00:23:23 And they sink deeper and deeper into that rabbit hole. And a regular person would look at this and be like, oh, Google's taking people who believe, I don't know, that a flat tax is a good idea and turning them into people who think that fluoride is turning frogs gay and that Sandy Hook was an inside job. And that's a bad thing. But YouTube's algorithm didn't think that way. It just thought like, oh, as soon as these people find Alex Jones videos, they spend 50% more time on YouTube. So I'm just going to serve Alex Jones up to as many fucking people as I possibly can.
Starting point is 00:23:54 And that's what starts happening in 2013-14. So that's where we are in the story right now. And then we're going to continue from that point. But you know, it is time for next, Sophia. No, tell me. It's time for products. And services? Maybe, maybe.
Starting point is 00:24:18 I'm not going to make promises. I'm not going to write checks my ass can't catch here. But maybe a service. I hope there's services your ass can cash. Well, my ass is all about products. I hope it's a chair company that comes up next otherwise. That's a non sequitur. I hope it's a squatty potty.
Starting point is 00:24:38 It's probably going to be dick pills because we just signed a great deal with dick pills. I'm very, very proud of our dick pills sponsorship. It's not even a run. Great job. I love selling dick pills. I can see your heart right now. I can just see your head. Thank you.
Starting point is 00:24:55 Your head of your body, not your penis head. But I can tell your heart from the pills. Thank you. You have a very taught dick energy. Thank you. You're welcome. TDE is what this show aims to present to the world. I said, speaking on the subject of YouTube, when we filled out our ad things, I won't
Starting point is 00:25:17 sell brain pills because I don't want to be like Paul Joseph Watson or Ben Shapiro. But I will 100% sell dick pills. It's mainly so that I can say the phrase dick pills over and over again. Meet my son, Dick Pills Evans. I'm going to have a son just to name him Dick Pills. It's going to be like a boy named Sue, but with a boy named Dick Pills. Instead of me explaining to him that I gave him the name Sue so that it would harden him up and he'd become a tough person and could survive the rough world.
Starting point is 00:25:52 Oh no, I got paid a lot of money to call you Dick Pills. No, you're just sponsored by Dick Pills. You're just sponsored by Dick Pills. All it is. This has gone very off the rails. Sophie, is this a good idea? No. No?
Starting point is 00:26:06 She's doing a hard no. Hard no. Okay. Well, speaking of hard, products. During the summer of 2020, some Americans suspected that the FBI had secretly infiltrated the racial justice demonstrations. And you know what? They were right.
Starting point is 00:26:28 I'm Trevor Aronson and I'm hosting a new podcast series, Alphabet Boys. As the FBI sometimes, you got to grab the little guy to go after the big guy. Each season will take you inside an undercover investigation. In the first season of Alphabet Boys, we're revealing how the FBI spied on protesters in Denver. At the center of this story is a raspy-voiced, cigar-smoking man who drives a silver hearse. And inside his hearse were like a lot of guns. He's a shark. And not in the good and bad ass way.
Starting point is 00:27:03 He's a nasty shark. He was just waiting for me to set the date, the time, and then for sure he was trying to get it to heaven. Listen to Alphabet Boys on the iHeart Radio App, Apple Podcast, or wherever you get your podcasts. I'm Lance Bass and you may know me from a little band called NSYNC. What you may not know is that when I was 23, I traveled to Moscow to train to become the youngest person to go to space. And when I was there, as you can imagine, I heard some pretty wild stories. But there was this one that really stuck with me about a Soviet astronaut who found himself stuck in space with no country to bring him down. It's 1991 and that man, Sergei Krekalev, is floating in orbit when he gets a message that down on Earth, his beloved country, the Soviet Union, is falling apart.
Starting point is 00:27:59 And now he's left offending the Union's last outpost. This is the crazy story of the 313 days he spent in space. 313 days that changed the world. Listen to The Last Soviet on the iHeart Radio App, Apple Podcast, or wherever you get your podcasts. What if I told you that much of the forensic science you see on shows like CSI isn't based on actual science? The problem with forensic science in the criminal legal system today is that it's an awful lot of forensic and not an awful lot of science. And the wrongly convicted pay a horrific price. Two death sentences and a life without parole.
Starting point is 00:28:45 My youngest, I was incarcerated two days after her first birthday. I'm Molly Herman. Join me as we put forensic science on trial to discover what happens when a match isn't a match and when there's no science in CSI. How many people have to be wrongly convicted before they realize that this stuff's all bogus? It's all made up. Listen to CSI on trial on the iHeart Radio App, Apple Podcast, or wherever you get your podcasts. We're back. We're back and Sophia just said the sentence, we got to mold our own genitals at the Dick Johnson factory or dog Johnson factory? Doc Johnson. I loved that sentence, which is why I brought us back in mid conversation from the ad break because that's a wonderful sentence. I have the Instagram story saved on my Instagram.
Starting point is 00:29:46 I want to get that sentence tattooed on my back where some people will have Jesus. I got my genitals molded at the Doc Johnson factory. Yeah, and it was the most fun ever. That sounds great. That sounds so much better than YouTube's algorithm. That's a really smooth transition. Thank you. That was like jazz fucking saxophone smooth.
Starting point is 00:30:11 I am as good at transitions as Dick Pills are at Dick Pills. Getting your dick hard. Exactly, exactly. Fuck yeah, hymns. Good times. Okay, so one of the big sources for this podcast and one of the big sources for the articles that have covered the problems with YouTube's algorithm is Gillum Chaslow. And he's not just a former employee with an axe to grind or someone who feels guilty about the work he participated in. For years now, he has turned into something of an activist against what he sees as the harms of his former employer. And obviously as a guy with potentially an axe to grind, he's someone that you've got to approach a little bit critically.
Starting point is 00:30:53 But Chaslow hasn't just like complained about Google. He has a team of people that have built systems in order to test the way Google's algorithm works and show the way that it picks new content. And document with hard numbers. Here's the kind of things that it's serving up. Here's the sort of videos that it recommends people towards. Here's how often it's doing them. So he's not just making claims, he has reams and reams of documentation on how Google's algorithm works behind him. He's really put a lot of work into this.
Starting point is 00:31:28 And from everything I can tell, he's someone who's deeply concerned about the impact YouTube's algorithm has had on our democracy and someone who's trying to do something about it. So just digging into the guy a bit, I have a lot of respect for what he's trying to do. On November 27, 2016, shortly after the election, while we were all drinking heavily, Gilliam Chaslow published a medium post titled, YouTube's AI was divisive in the US presidential election. In it he included the results of a study he and a team of researchers conducted. They were essentially trying to measure which candidate was recommended the most by YouTube's AI during the presidential election. And the code that they used to do this and all of the methodology behind it is available on the website. If you're someone who knows how to do the coding, you can check it all up, but they're very transparent.
Starting point is 00:32:16 He says, quote, surprisingly, a Clinton search on the eve of the election led to mostly anti-Clinton videos. The pro-Clinton videos were viewed many times and had high ratings, but represent only less than 20% of all recommended videos. Chaslow's research found that the vast majority of political videos recommended by YouTube were anti-Clinton and pro-Trump because those videos got the best engagement. Now, Chaslow explained that because Google Brain was optimized to maximize time users spent on site or engagement, it's also happy to route people to content that, say, proposes the existence of a flat earth because those videos improve engagement too. Gillum found that searching is the earth flat around and following Google's recommendations sent users to flat earth conspiracy videos more than 90% of the time. So if you're wondering why flat earth is taken off as a conspiracy, it's because simply asking the question, is the earth flat around 90% of the time leads you to videos that say, it's flat, homie. That's how all those basketball players think the earth is flat and also what a rapper do, right?
Starting point is 00:33:24 Yeah, you can see in your head how that change happens. Some guys having a conversation with a friend who is kind of dumb and is like, no, dude, you know the earth's flat and you're like, what? That's bullshit. You type, is the earth flat into YouTube and then it serves you up a four-hour documentary about how the earth's flat. Probably your first mistake is typing it into YouTube. It's probably not the place you want to get that answer. No, but it's not like schools in America teach people critical thinking or how to functionally do research. It's like going to Yahoo answers to be like, am I pregnant? Which happens all the time. The answer is yes. If you are asking Yahoo whether or not you're pregnant, you are in fact pregnant. For sure. Probably second or third trimester.
Starting point is 00:34:20 You should at least stop smoking for a while until you find out for sure. Yeah, maybe put down a bottle for a second. Now, further reporting using additional sources from within Google seems to support most of Chaslow's main intentions. In fact, it suggests that he, if anything, understated the problem. Chaslow left YouTube in 2012 and while he knew about Google Brain, he did not know about a new AI called Reinforce that Google had just instituted, or instituted I think in 2015 to YouTube. Its existence was revealed by a New York Times article published just a few days before I wrote this, the making of a YouTube radical. That article claims that Reinforce focused on a new kind of machine learning called reinforcement learning. The new AI known as Reinforce was a kind of long term addiction machine. It was designed to maximize users' engagement over time by predicting which recommendations would expand their tastes and get them to watch not just one video, but many more.
Starting point is 00:35:17 Reinforce was a huge success. In a talk at an AI conference in February, Minmin Chen, a Google Brain researcher, said it was YouTube's most successful launch in two years. Site-wide views increased by nearly one percent, she said, a game that at YouTube's scale could amount to millions more hours of daily watch time and millions more dollars in advertising revenue per year. She added that the new algorithm was already starting to alter users' behavior. We can really lead users toward a different state versus recommending content that is familiar, Ms. Chen said. It's another example of like, if you take that quote out of context and just read it back to her and say, Ms. Chen, this sounds incredibly sinister when you're talking about leading people towards a different state. Excuse me, ma'am, are you in fact a villain? A super villain? Are you evil? You sound like a super villain.
Starting point is 00:36:10 Sounds like this might be evil. Is this a James Bond movie? Yeah, nobody ever has in the tech industry. Yeah, it's that no one ever has in the tech industry that are we the baddies moment? Oh, we're addicting people to our service. Is that maybe bad? Shit, are we the Nazis? Damn, this whole time I thought we were the Americans.
Starting point is 00:36:33 Nope. Yeah. Now, YouTube claims that reinforce is a good thing, fighting YouTube's bias towards popular content and allowing them to provide more accurate recommendations. But reinforce once again presented an opportunity for online extremists. They quickly learned that they could throw together videos about left-wing bias in movies or video games, and YouTube would recommend those videos to people who are just looking for normal videos about these subjects. As a result, extremists were able to red pill viewers by hiding rants about the evils of feminism and immigration as reviews of Star Wars. In far right lingo, red pilling refers to the first moment that sort of set someone off in their journey towards embracing Nazism. And so prior to reinforce, if you were looking up, I want to see gameplay videos about Call of Duty or I want to see a review of Star Wars The Force Awakens.
Starting point is 00:37:23 It would just take you to reviews and gameplay videos. Now it would also take you to somebody talking about like how Star Wars is part of the social justice warrior agenda or how Star Wars, you know, embraces white genocide or something like that. And so then, you know, and it'll recommend that to millions of people and most of them will be like, what the fuck is this bullshit? But a few thousand of them will be like, oh my god, this guy's right. Like Star Wars is part of a conspiracy to destroy white men. And then they'll click on the next video that Stefan Molyneux puts out, or they'll go deeper down that rabbit hole. And that's how this starts happening. Star Wars is a conspiracy though.
Starting point is 00:38:00 Just take your fucking money. That's all it is. Just take your money. It's like any other kind of conspiracy that involves movies. The only thing is to take your money. Yeah, not to destroy white people. They want white people because white people spend the most money on Star Wars. That's the whole point. Yeah, if they killed that, that's the number one customer. That's killing your whole customer base.
Starting point is 00:38:24 That's like if cigarette companies didn't want teenagers to start smoking. It's like, yeah, you need to replenish the flocks. Yeah, you want people to start smoking in their 20s as they have children who grow up watching dad smoke. Yes, that's the plan. They want kids like me to grow up who every now and then will buy a pack of cigarettes just to smell the open pack of cigarettes because it takes me back to moments in my childhood. Nostalgia. It's such a soothing smell.
Starting point is 00:38:53 Unsmoked cigarettes. A little bit sweet, a little bit fruity. Filter. Yeah, this is going to trigger somebody to buy cigarettes. Yeah, right now someone's pulling over to 7-Eleven. Fuck it. And I feel terrible about that. And they're like, also I just bought dick pills.
Starting point is 00:39:11 They're like, I don't know what's happening to me. Buy dick pills. Fucking, it's good for your health. It's good for your heart. It's great. Fucking is all benefits. Cigarettes are almost all downsides other than the wonderful smell of a freshly opened pack. And looking really fucking cool.
Starting point is 00:39:29 Yeah, well, they do make you look incredibly cool. I mean, that's unbelievably cool. So fucking cool. Yeah, nothing looks cool. No, something does. Smoking a joint looks cooler. You're right. Smoking a joint does look cooler. And the coolest thing of all, smoking a joint on a unicycle. On a yacht.
Starting point is 00:39:49 Wow. You just took it to another level. I just would want to see how good your balance is to be able to ride a unicycle on a yacht. One of our many millionaire listeners is going to message me tomorrow being like, my husband tried to smoke a joint while riding a unicycle on our yacht and now he's dead. You killed the love of my life. Or we'll get some dope fan art of you on a unicycle smoking a joint on a yacht. Yeah, burning a fat one. Speaking of fat ones, the New York Times interviewed a young man who was identified in their article on radicalization as Mr. Cain.
Starting point is 00:40:27 And Mr. Cain claims that he was sucked down one of these far right YouTube rabbit holes thanks to YouTube's algorithm. He is scarred by his experience of being radicalized by what he calls a decentralized cult of far right YouTube personalities who convinced him that Western civilization was under threat from Muslim immigrants and cultural Marxists. That innate IQ differences explained racial disparities and that feminism was a dangerous ideology. I just kept falling deeper and deeper into this and it appealed to me because it made me feel a sense of belonging, he said. I was brainwashed. There's a spectrum on YouTube between the calm section, the Walter Cronkite, Carl Sagan part, and Crazy Town where the extreme stuff is, said Tristan Harris, a former design ethicist at Google, YouTube's parent company.
Starting point is 00:41:06 If I'm YouTube and I want you to watch more, I'm always going to steer you toward Crazy Town. And I will say, I'm very hard on the tech industry regularly on this podcast. It speaks well of a lot of engineers that the most vocal people in trying to fight YouTube's algorithm are former Google engineers who realized what the company was doing and stepped away and have been hammering it ever since being like, we made a Nazi engine, guys. We weren't trying to, but we made a Nazi engine and we have to deal with this problem. Got to rain the alarm on this one. Yeah, really got to rain the alarm on this one.
Starting point is 00:41:43 You know, I used to work at Google. I worked at Google for two years. I didn't know that. Yeah. What did you do? I, my job title won't explain what I did, but basically, it was like a quality, yeah, it has nothing to do with anything. But basically, I got to, in Russian, like help build a binary engine that can, well, like train it, not build it. Train it to be able to tell whether something is a restricted category or not, like something is porn or not, gambling or not, that kind of stuff.
Starting point is 00:42:20 So, yeah, it was crazy. Well, that sounds different. Yeah, I saw some of the most fucked up stuff on the internet, you know, like I've reported child porn before. Oh, then you will have a lot to say about this latter part because we do talk about content moderators for a little bit. That's kind of essentially. I'm going to be asking a couple of questions about that at the end. Yeah. Yeah.
Starting point is 00:42:45 Yeah. Now, that, that New York Times article in full disclosure actually cites me in it because of a study that I published with the Research Collective Bellingcat last year, where I trawled through hundreds and hundreds of leaked conversations between fascist activists and found 75 self-reported stories of how these people got redpilled. In that study, I found that 34 of the 75 people I looked at cited YouTube videos as the things that redpilled them. I'm not the only source on this, though. The New York Times also cited a research report published by a European research firm called Vox Poll. They conducted an analysis of 30,000 Twitter accounts affiliated with the far right, and they found that those accounts linked to YouTube videos more than they linked to any other thing. So there's a lot of evidence that YouTube is the primary reason why if you look at people who were researching the KKK and neo-Nazis in America in 2004, 2005, 2006, a big gathering would be 20 people.
Starting point is 00:43:46 And then in 2017, four or 500 of them, however many it was, showed up at Charlottesville. Like, there's a reason their numbers increased so much over a pretty short period of time. And it's because these videos made more of them. And there's a lot of evidence of that. So while Google is raking in more and more cash and increasing time spent on site, they're also increasing the amount of people who think Hitler did nothing wrong. And that's the tale of today. So Mr. Kane, the New York Times source for that article, claims his journey started in 2014 when YouTube recommended a self-help video by Stefan Malanou. Mr. Malanou is a great candidate for an episode of this podcast, but in short, he's a far-right YouTube philosopher, self-help guru, who advises his listeners to cut all ties with their family.
Starting point is 00:44:35 He runs a community called Freedomain Radio that some people accuse of being a cult that, you know, tells people to cut off contact with their family. Yeah, no cool club is going to be like, hey, please join us, but also never speak to anyone you love ever again. Yeah, never talk to your mom again. That's not how a cool club starts, you know? That's not how a cool club starts. That's always bad news. Yeah, yeah, cool clubs say never talk to the cops again, which cool clubs do say. Absolutely.
Starting point is 00:45:08 Now, Malanou has been on YouTube since forever, but his content has radicalized sharply over the years. At the beginning, he identified as an anarcho-capitalist, and he mostly focused on his ideas about how everyone was bad at being parents and people should cut ties with toxic family members. In recent years, he's maybe- It's like, bro, just call your dad. Call your dad, bro. Just call your mom and dad, dude. You probably need to have a convo. Yeah, you guys probably just talk some feelings out.
Starting point is 00:45:31 Maybe you'll calm the fuck down. I don't know. Like, I don't want to say, like, there actually are a lot of people with toxic family members, so they do need to cut out of their lives, which I think is part of why Malanou was able to get a following. Like, there's not nothing in what he's saying. There's a lot of people who have fucked up family backgrounds and who get told, like, well, you just need to make things right with your mom. And it's like, no, if your mom, like, sent you to gay conversion therapy, maybe cut all ties with her forever. I totally agree.
Starting point is 00:45:59 No, no, no. I'm not trying to say that. What I'm trying to say is that he himself, to pursue a life where you tell people to cut contact off with their family, you clearly have unresolved issues with your family. Oh, hell, yeah. And if you resolve those by, say, calling your parents and talking to them, I'm not saying you have to make up with that. I'm saying somehow get closure for yourself so then you don't spend the rest of your life trying to get people to quit their families.
Starting point is 00:46:27 Yeah. That's just like seems, yeah. You got some shit to deal with, bro. Yeah. But, you know, Malanou didn't stay on that sort of thing. Like, he made a switch over to pretty hardcore nationalism, particularly in the last two years. There's like a video of him where he's in Poland during like a far right march to commemorate like Poland's like Independence Day. And he like said, like starts crying and has like this big realization of how like I've been against nationalism and stuff for years.
Starting point is 00:47:03 And I realize it can really be beautiful. And like the unsaid things like I realized that white nationalism can be beautiful and that like, instead of, you know, being an independent libertarian type, I'm going to focus on fighting for my people, which is like white people and stuff like that. That's how Stefan Malanou is now. Like he's essentially a neo-Nazi philosopher at this point. And he spends most of his time talking about race and IQ and, you know, talking about how black people are not as good as white people. Like that's the thrust of modern day Stefan Malanou. He also believes global warming is a hoax.
Starting point is 00:47:38 So maybe nobody should have much respect for Malanou's own IQ. But a lot of people get turned on to Stefan's straight up fascist propaganda because of their interest in Joe Rogan. Rogan has had Stefan on as a guest several times and YouTube has decided that people who like Rogan should have Stefan's channel recommended to them. This may be why Mr. Cain saw Malanou pop into his recommendations, which is what he credits as radicalizing him in 2014. So, yeah, he wound up watching like a lot of members of what some people call the intellectual dark web, Joe Rogan, Dave Rubin, guys like Steven Crowder, and of course, Stefan Malanou. And over time, like he went further and further and further to the right until eventually he starts watching videos by Lauren Southern, who is a Canadian activist who's essentially, like he called her his fascist crush, like his fashy bae. So like by like 2016, this guy who starts watching Joe Rogan and like it's turned into Stefan Malanou's videos about global warming as a hoax and IQ and race. By 2016, he's like identifying YouTube Nazi as his fascist crush.
Starting point is 00:48:52 Like that's how this proceeds for this dude and that's a pretty standard path. But you know what's not a standard path. No, what? The path that our listeners will blaze if they buy the products and services that we advertise on this program. You seem like your breath's been taken away by the skill and ingenuity of that transition. Truly, there was nothing I could add. It was a perfect, perfect work. I'm the best at this. I'm the best around. Nothing's going to ever keep me down. Yeah, I'm not going to put a bumper sticker on a Rolls Royce. You know what I'm saying?
Starting point is 00:49:30 The bumper sticker is going to say, I got my genitals molded. Yeah. I want that bumper sticker. It's actually a hologram and like when you look at it one way, I am wearing a skirt and when you look at it the other way, you see my vagina mold. It's really cool. Man, that put a lot of thought into it. That's quite a bumper sticker and I have thought for a long time that what traffic is missing is explicitly pornographic bumper stickers. If truck nuts are okay, why isn't that?
Starting point is 00:50:06 Seriously, it's actually a lot more pleasant to look at than truck nuts. Yes, yes. Nobody actually likes truck nuts. No one. All right. Well, this has been a long digression. Yeah, let's, let's, products. During the summer of 2020, some Americans suspected that the FBI had secretly infiltrated the racial justice demonstrations. And you know what? They were right. I'm Trevor Aronson and I'm hosting a new podcast series, Alphabet Boys.
Starting point is 00:50:41 As the FBI sometimes you got to grab the little guy to go after the big guy. This season will take you inside an undercover investigation. In the first season of Alphabet Boys, we're revealing how the FBI spied on protesters in Denver. At the center of this story is a raspy voiced cigar smoking man who drives a silver hearse. And inside his hearse was like a lot of guns. He's a shark. And on the gun badass way. And nasty sharks.
Starting point is 00:51:11 He was just waiting for me to set the date, the time and then for sure he was trying to get it to heaven. Listen to Alphabet Boys on the iHeart Radio app, Apple podcast or wherever you get your podcasts. I'm Lance Bass and you may know me from a little band called NSYNC. What you may not know is that when I was 23, I traveled to Moscow to train to become the youngest person to go to space. And when I was there, as you can imagine, I heard some pretty wild stories. But there was this one that really stuck with me about a Soviet astronaut who found himself stuck in space with no country to bring him down. It's 1991 and that man Sergei Krekalev is floating in orbit when he gets a message that down on earth, his beloved country, the Soviet Union, is falling apart. And now he's left defending the Union's last outpost.
Starting point is 00:52:10 This is the crazy story of the 313 days he spent in space. 313 days that changed the world. Listen to the last Soviet on the iHeart Radio app, Apple podcast or wherever you get your podcasts. What if I told you that much of the forensic science you see on shows like CSI isn't based on actual science? The problem with forensic science in the criminal legal system today is that it's an awful lot of forensic and not an awful lot of science. And the wrongly convicted pay a horrific price. Two death sentences and a life without parole. My youngest, I was incarcerated two days after her first birthday.
Starting point is 00:52:56 I'm Molly Herman. Join me as we put forensic science on trial to discover what happens when a match isn't a match and when there's no science in CSI. How many people have to be wrongly convicted before they realize that this stuff's all bogus. It's all made up. Listen to CSI on trial on the iHeart Radio app, Apple podcast or wherever you get your podcasts. We're back! Boy, howdy. What a day we've had today. Mhmm. Mhmm. So, at this point, YouTube's role in radicalizing a whole generation of fascists is very well documented.
Starting point is 00:53:43 But YouTube is sort of stuck when it comes to admitting that they've ever done anything wrong. 70% of their traffic comes from the recommendation engine. It is the single thing that drives the platform's profitability more than anything else. Back in March, the New York Times interviewed Neil Mohan, YouTube's chief product officer. His responses were pretty characteristic of what the company says when confronted about their little Nazi issue. The interviewer asked, Yeah, so I've heard this before, and I think there are some myths that go into that description that I think would be useful for me to debunk. The first is this notion that it's somehow in our interest for the recommendations to shift people in this direction because it boosts watch time or what have you.
Starting point is 00:54:33 I can say categorically, that's not the way our recommendation systems are designed. Watch time is one signal that they use, but they have a number of other engagement and satisfaction signals from the user. It is not the case that extreme content drives a higher version of engagement or watch time than content of other types. So, he basically has a blanket denial there. Yeah, that's a huge just like blanket. No, we don't do that. No, that doesn't happen. Doesn't happen. Here goes on, it's a bit of a rambling answer, and later in his answer, Mohan called the idea of a YouTube radicalization rabbit hole purely a myth. The interviewer, to his credit, presses Neil Mohan on this a bit more later and asks if he's really sure he wants to make that claim.
Starting point is 00:55:13 Mohan responds, what I'm saying is that when a video is watched, you will see a number of videos that are recommended. Some of those videos might have the perception of skewing in one direction or you know, call it more extreme. I'm not saying that a user couldn't click on one of those videos that are quote unquote more extreme, consume that and then get another set of recommendations that sort of keep moving in one path or the other. All I'm saying is that it's not inevitable. So, because everybody doesn't choose to watch more extreme videos, there's no YouTube radicalization rabbit hole. Yeah, and also kind of acknowledging there that it does happen. Yeah, nothing is inevitable. I mean, except for like death and whatever, you know, just to be like, yeah. No, it's not about a meteorite could hit your house before you get to click on the video that turns you into a Nazi. So, of course, it's not inevitable. Yeah, just be like, it's not 100%. It's not 100% true is not a good answer. A percentage of our users die of heart disease before the next video plays.
Starting point is 00:56:17 Yeah, pretty high percentage of people. Yeah, that's not what we're asking, Neil. Now, the reality, of course, is that Neil Mohan is, shall we say, not entirely honest. I think I wrote a damn liar in the original draft, but I'm not sure where the legally actionable line is. Pocket a big video. Yeah, pocket a big, big video. For just one example, Jonathan Albright, a Columbia University researcher recently carried out a test where he seeded a YouTube account with a search for the phrase crisis actor. The up next recommendation led him to 9000 different videos promoting crisis actor conspiracy theories. So again, someone who heard the term and wanted to search for factual information about the conspiracy theory would be directed by YouTube to hundreds of hours of conspiratorial nonsense about how the Sandy Hook shooting was fake.
Starting point is 00:57:06 Now, I'm going to guess you remember last year's mass shooting at Marjory Stoneman Douglas High School. By the Wednesday after that shooting, less than a week after all of those kids died, the number one trending video on YouTube was 'David Hogg the Actor,' which is, obviously, a video accusing one of the most prominent of those kids of being a crisis actor. According to a report from Ad Age, it and many others claim to expose Hogg as a crisis actor. YouTube eventually removed that particular video, but not before it amassed nearly 200,000 views. Other videos targeting Hogg remain up. One that appears to show Hogg struggling with his words during an interview after the shooting suggests it's because he forgot his lines. YouTube auto-suggests certain search terms that would lead people directly to the clips. If a person typed David Hogg into YouTube's search bar midday Wednesday, for example, some of the suggestions would include 'exposed' and 'crisis actor.' When reporters asked YouTube how that video made it to the top of their coveted trending chart, YouTube explained that since the video included edited clips from a CNN report, its algorithm had believed that it was a legitimate piece of journalism and allowed it to spread as an authoritative news report would. So again, that's their justification: we couldn't have known this was fake news, because it was fake news that used clips from a legitimate news site. So we're clearly not at fault here for the fact that we let a robot select all these things, and no human being watched the top trending video on the site at the moment to see if it was something terrible. Also, that's bullshit.
Starting point is 00:58:37 Yeah, that's total bullshit. Now, Nazi propaganda and conspiracy theories aren't the only things that spread like wildfire on YouTube, of course. Pedophilia is also a big thing on the site. Yeah, yeah, this is where we get to that part of the story. So this broke in February of 2019, when a YouTuber named Matt Watson put together a video exposing how rings of pedophiles had infested the comment sections of various videos featuring small children and used them to communicate and trade child porn. Now, this report went very viral and immediately prompted several major advertisers to pull their money from YouTube. The company released a statement to their worried advertisers informing them that they had blanket-banned comments for millions of videos, basically removing comments from any videos uploaded by or about young children. I'd like to quote from NPR's report on Watson's video: Watson describes how he says the pedophile ring works. YouTube visitors gather on videos of young girls doing innocuous things, such as putting on their makeup, demonstrating gymnastics moves, or playing Twister. In the comments section, people would then post time stamps that linked to frames in that video that appear to sexualize the children. YouTube's algorithms would then recommend other videos also frequented by pedophiles. Once you enter into this wormhole, there is now no other content available, Watson said.
Starting point is 00:59:50 So, it might seem at first like this is purely an accident on YouTube's part, like cunning pedophiles just figured out that they could find videos of young kids doing handstands and stuff and use that as porn and trade it with each other, right? Like it could be a how-could-we-have-predicted-this situation: these people just decided to use innocent videos for a nefarious purpose. But that's not what happened. Or at least, that's not all of what happened. So, in June, three researchers from Harvard's Berkman Klein Center for Internet and Society started combing through YouTube's recommendations for sexually themed videos. They found that starting down this rabbit hole led them inevitably to sexual videos that placed greater emphasis on youth. So, again, that's maybe not super surprising. You start looking for sexy videos, you click on one, and in the next video, the woman in it is going to be a younger woman, and a younger woman, and a younger woman. But then at a certain point, the suggested videos flipped very suddenly until, and I'm going to quote the researchers here, YouTube would suddenly begin recommending videos of young and partially clothed children. So, YouTube would take a person who's just looking for videos of adults, videos of an exotic dancer dancing or whatever, videos of attractive young women dancing, and then YouTube would start showing them videos of children doing gymnastics routines and stuff. Like, that's the algorithm being like, I bet you'll like child porn. Like, that's literally what's happening here, which I didn't realize when I first heard the story: that's YouTube, that's not just pedophiles using YouTube in a sleazy way,
Starting point is 01:01:34 because pedophiles will always find a way to ruin anything. That's YouTube crafting new pedophiles. Yeah, it's a system that's essentially training you. Yeah. I wonder if it's like that with violence too, if you look up a violent thing, does it keep recommending more violence? Because that seems like something that would happen. When I worked for Google, the sensitive categories, the restricted categories, are, you know, violence, hate, gambling, porn, child porn, I think. There's even a messed-up thing about that, because one of the problems that people who document war crimes in Syria have had is YouTube blanket-banning their videos because of violence, and then, like, you have evidence of a war crime, and then it's wiped off of the internet forever, because YouTube doesn't realize that this isn't violence porn, this is somebody trying to document a war crime. It's made it really hard to do that kind of research. Yeah, their response is always so terrible. Anyway, The New York Times reported, quote,
Starting point is 01:02:38 So a user who watches erotic videos might be recommended videos of women who become conspicuously younger, and then women who pose provocatively in children's clothes. Eventually, some users might be presented with videos of girls as young as five or six wearing bathing suits, or getting dressed, or doing a split. So yeah, in its eternal quest to increase time spent on site, YouTube's algorithm essentially radicalized people towards pedophilia. And to make matters worse, it wasn't just picking videos that people had uploaded with the intent of them being sexy. Because it was sending children's videos to people, it started grabbing totally normal home videos of little kids and presenting those videos to horny adults who were on YouTube to masturbate. The report suggests it was learning from users who sought out revealing or suggestive images of children. One parent The Times talked with related in horror that a video of her 10-year-old girl wearing a bathing suit had reached 400,000 views. So like, parents start to realize, wait a minute, I uploaded this video to show her grandma.
Starting point is 01:03:39 There's supposed to be like nine views on this thing. Why have 400,000 people watched this video of my 10-year-old? And it's because YouTube is trying to provide people with porn, because it knows that'll keep them on the site longer. That's fucking wild. Yeah. After this report came out, YouTube published an apologetic blog post promising that responsibility is our number one priority and chief among our areas of focus is protecting minors and families. But of course, that's not true. Increasing the amount of time spent on site is YouTube's chief priority. Or rather, making money is YouTube's chief priority. And if increasing the amount of time spent on site is the best way to make money, then YouTube will prioritize that over all other things, including the safety of children. Now, there are ways YouTube could reduce the danger their site presents to the world. Ways they could catch stuff like propaganda accusing a mass shooting victim of being an actor, or people's home movies being accidentally turned into child porn.
Starting point is 01:04:36 Even if they're not going to stop hosting literal fascist propaganda, content moderators could add human eyes and human oversight to an AI algorithm that is clearly sociopathic. And earlier this year, YouTube did announce that they were expanding their content moderation team to 10,000 people, which sounds great. Sounds like a huge number of people. Only it's not as good as it seems. The Wall Street Journal investigated and found that a huge number of these moderators, perhaps the majority, worked in cubicle farms in India and the Philippines, which would be fine if they were moderating content posted from India or the Philippines. But of course, these people were also going to be tasked with monitoring American political content. Now, Alphabet, née Google, does not disclose how much money YouTube makes. Estimates suggest that it's around $10 billion a year and may be increasing by as much as 40% per year. Math is not my strong suit. I'm not an algorithm. But I did a little bit of math, and I calculated that if Google took $1 billion of their profit and hired new content moderators, paying them $50,000-a-year salaries, which I'm going to guess is more than most of these moderators get, they could afford to hire 20,000 new moderators, tripling their current capacity. Realistically, they could hire 50,000 or 60,000 more moderators and still be making billions of dollars a year off one of the most profitable services on the internet.
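For what it's worth, the back-of-the-envelope math above holds up if you take the episode's own figures as given; the roughly $10 billion revenue estimate, the $50,000 salaries, and the 10,000-person moderation team are all assumptions quoted in the discussion, not audited financials.

```python
# Back-of-the-envelope check of the moderator math, using only the figures
# quoted in the episode (all assumptions, not audited financials).
revenue_estimate = 10_000_000_000    # ~$10B/year estimate cited above
moderation_budget = 1_000_000_000    # the hypothetical $1B carve-out
salary = 50_000                      # assumed annual pay per moderator
current_team = 10_000                # YouTube's announced moderator headcount

new_hires = moderation_budget // salary
print(new_hires)                                   # 20000
print((current_team + new_hires) // current_team)  # 3 -> roughly triples the team
print((3 * moderation_budget) // salary)           # 60000 hires for a $3B/year budget
```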
Starting point is 01:05:57 But doing that would mean less profit for Google shareholders. It would mean less money for people like Neal Mohan, the man who has been YouTube's chief product officer since 2011. The man who has overseen nearly all the algorithmic changes we are talking about today. The man who sat down with the New York Times and denied YouTube had a problem with leading people down rabbit holes that radicalize them in dangerous ways. I was kind of curious as to how well compensated Mr. Mohan is. So I googled Neal Mohan net worth. The first result was a Business Insider article: Google paid this man $100 million. Here's his story. So that's cool. Yeah. And I can tell you, from being a moderator, I worked on a team where everybody did what I did in a different language. So I did this in Russian, and next to me was someone who was doing it in Chinese, and Turkish, and all of the languages. I mean, not all, but a significant number. Yeah. And I can tell you that we were hired as contractors for only a year. Very rarely would you ever be doing a second year, because they didn't want to pay you the full benefits. Right. Like, you know, you don't get health insurance and whatever, all the perks that you would get from being a full-time Google employee. And the thing about what we did is you got exposed to a lot of fucked-up stuff. Like, you know, the videos and stuff that I've seen are like some of the worst the internet has to offer.
Starting point is 01:07:25 Like beheadings or someone stomping a kitten to death and high heels like crazy shit. And it would really make you sick and they like give you free food at Google and you like wouldn't be able to eat sometimes because you would be so grossed out. And it's not like they, that's why you're only there for a year also, not just that you wouldn't be able to get full benefits, but also because they are okay with wasting your mental and physical energies and then letting you go and then just cycling through new people every year. Because rather than investing, you know, in employees that are full-time, making sure they have, you know, access to mental health care and stuff like that. And, you know, making that job be something that they take more seriously considering how important it is. Well, and that's part of what's really messed up is that like it's fucking Google. Like if you go into the people, people who are like actually coding these algorithms and stuff, I guarantee you those people have on-site therapists they can visit. They have gyms at work. They get their lunches. I mean, we all worked in the same building, but like, I can't, you know, I couldn't go get a free massage during. It's like, you know, you have a CrossFit trainer on site and shit like that. For sure you get incredible perks. And the whole point is what I thought was kind of ironic about what we were saying is like the whole thing is to try to make you stay on YouTube.
Starting point is 01:08:52 But when you work for a company like Google, their job is to try to make you stay at Google. So, you know, the reason you're getting all these benefits, like free food and gym and massage and whatever, is because they want you to stay and work forever. But they don't want you to, like, this is part of what's messed up to me. Exactly. Like, and that's a very telling thing from Google's perspective, because they are saying: the people who are coding these algorithms that increase the amount of time people spend on site, those people are important to us, and so we will do whatever it takes to retain them. But the people who make sure that we aren't creating new pedophiles while we make money, the people who are responsible for making sure that Nazi propaganda isn't served up to impressionable young children via our service? Those people aren't valuable to us, because we don't care about that. So we're not going to offer them health care. Like, if Google really were an ethical company, and if YouTube cared about its impact on the world, there's nothing less technical or less valuable about what you're doing.
Starting point is 01:09:59 Being able to speak another language fluently, being able to understand if content propagating on their site is toxic or not, that's a very difficult, very technical task. If they cared about the impact they had on the world, the people doing that job would be well paid and would have benefits and would be seen as a crucial aspect of the company. But instead, it's sort of like, if we don't have someone doing this job, we'll get yelled at, so we're going to do the minimum necessary. And we're going to have most of the people doing that job be working at a fucking cube farm in India, even though we're expecting them to moderate American content and to understand all of our cultural nuances and whether or not something's toxic. That's so fucked up. And also, considering the fact that ads is the reason that they hire content moderators, not because they care about the content necessarily. It's that it would be a huge mistake if, say, an ad for Huggies was served on a diaper fetish website. They want something in place where the page knows, the algorithm knows not to serve that, even though it seems like a good match because the word diaper's repeated and blah, blah, blah.
Starting point is 01:11:14 So it's really less about ensuring that the internet is a less fucked-up place, and more about keeping the advertisers happy and making the most money. This gets to one of the things, like, when I get into arguments with people about the nature of capitalism and what's wrong with the kind of capitalism that we have in this country: I think a lot of people who just sort of reject anti-capitalist arguments out of hand do it because they think that you're saying, oh, it's just wrong to make money, it's wrong to have a business that makes a profit. The issue isn't that. The issue is that this company, Google, could be making billions of dollars a year, still be one of the most profitable sites of its type, still make a huge amount of money, and have three times as many people doing content moderation, all of them with health care. But by cutting corners on that part of it, because it doesn't make them more money, it just makes the world better, they make more money, and it's worth more to them to increase the value of a few hundred people's stock than to ensure that there aren't thousands of additional people masturbating to children. Like, that's what I have an issue with with capitalism. Like, that's the issue.
Starting point is 01:12:34 Agreed, you can make a profit without also selling your fucking soul. Yeah, we could have YouTube. We're not saying YouTube should be banned. Like, I can get recommended new musicians that I like. We can all watch videos to masturbate to without more people being turned to the pedophiles and Nazis. That's not a necessary part of this. Like, it's just because corners are being cut. Yeah, it just shows what the value of our society is, what the values of our society are.
Starting point is 01:13:07 Yeah, they've literally said, three billion dollars a year is worth more to us than God knows how many children being molested, than fucking Heather Heyer getting run down at Charlottesville, than there being Nazis marching through the streets and advocating the extermination of black people, of LGBT people, of whoever. Which is, again, part of why so many Google employees are now speaking out, horrified, because, like, they're not monsters, they don't want to live in this world any more than the rest of us do. They just didn't realize what was happening, because they were busy focusing on the code and the free massages. Yep. And then, like the rest of us, they woke up to a world full of Nazis and pedophiles. I feel like you're looking at me to make a joke now, and I feel like, I don't know, this got real serious. I'm more just tired.
Starting point is 01:14:01 We're all tired. It's a very tiring world we live in. Yeah. Well, Sophia, that's the episode. Yay! Yay! You want to plug your plugables? Fuck.
Starting point is 01:14:15 I mean, not really. Just want everyone to go and get a hug, you know? Yeah, everybody go get a hug. Jesus. Yeah, but also, I am to be found on the sites we hate, you know? What a fun thing to plug. I love the sites we hate. I'm available on Twitter and Instagram at TheSophia.
Starting point is 01:14:36 Huge fan of the sites we hate over here. And I have a podcast about love and sexuality around the world that I co-host with Courtney Kosak. It's called Private Parts Unknown, so check that out. Check out Private Parts Unknown. I'm also on the sites we hate. BehindBastards.com is not a site that we hate, but it's where you can find the sources for this episode. RightOK is where you can find me on Twitter. You can find us on Twitter and Instagram at at BastardsPod.
Starting point is 01:15:07 That's the... Buy t-shirts on tpublic.com behind the Bastards. Yep, that's the episode. Go find YouTube's headquarters and yell at them. Scream at their sign. Take pictures of their company and wave your fists. If you work at YouTube, quit. It's not worth it.
Starting point is 01:15:31 I mean, the more whistleblowers, the better. Yeah, quit and go talk to the New York Times or some fucking buddy. Also, one random thing that's positive is if you want, there's a lot of videos of trains on YouTube. I've discovered of just trains passing by. Trains and ski fails. Yeah, I think you will find it very soothing. First, they'll be like, what the fuck, a video of a train that's 12 minutes long? Guess what?
Starting point is 01:15:59 That'll soothe you. Soothe your ass. Or if you're more like me, watch videos of people skiing and then failing the ski. I mean, that's if you want to laugh. Yeah, yeah, yeah. Feels good to laugh. I feel like YouTube's algorithm is going to take you from train videos to train fails really fast. Um, oh boy.
Starting point is 01:16:20 Yeah, shit. I don't know. Now that I know about the rabbit hole, I'm afraid that there's a way to connect trains to children that I have not thought of. Oh, no, I'm not even going to make any further comments on that. Yeah, let's get out of here. This is sad. Alphabet Boys is a new podcast series that goes inside undercover investigations. In the first season, we're diving into an FBI investigation of the 2020 protests.
Starting point is 01:16:54 It involves a cigar-smoking mystery man who drives a silver hearse. And inside his hearse look like a lot of guns. But are federal agents catching bad guys or creating them? He was just waiting for me to set the date, the time, and then for sure he was trying to get it to happen. Listen to Alphabet Boys on the iHeart Radio app, Apple Podcasts, or wherever you get your podcasts. What if I told you that much of the forensic science you see on shows like CSI isn't based on actual science, and the wrongly convicted pay a horrific price? Two death sentences and a life without parole.
Starting point is 01:17:31 My youngest, I was incarcerated two days after her first birthday. Listen to CSI on Trial on the iHeart Radio app, Apple Podcasts, or wherever you get your podcasts. Listen to The Last Soviet on the iHeart Radio app, Apple Podcasts, or wherever you get your podcasts.
