Behind the Bastards - Part Two: Let's Look at the Facebook Papers

Episode Date: November 20, 2021

Robert is joined again by Jamie Loftus to continue to discuss the Facebook Papers. Learn more about your ad-choices at https://www.iheartpodcastnetwork.com. See omnystudio.com/listener for privacy information.

Transcript
Starting point is 00:00:00 Alphabet Boys is a new podcast series that goes inside undercover investigations. In the first season, we're diving into an FBI investigation of the 2020 protests. It involves a cigar-smoking mystery man who drives a silver hearse. And inside his hearse, what looked like a lot of guns. But are federal agents catching bad guys or creating them? He was just waiting for me to set the date, the time, and then for sure he was trying to get it to happen. Listen to Alphabet Boys on the iHeart Radio app, Apple Podcast, or wherever you get your podcasts. What if I told you that much of the forensic science you see on shows like CSI isn't based on actual science?
Starting point is 00:01:21 And the wrongly convicted pay a horrific price. Two death sentences and a life without parole. My youngest, I was incarcerated two days after her first birthday. Listen to CSI on Trial on the iHeart Radio app, Apple Podcast, or wherever you get your podcasts. Let's do it. Let's start the podcast. Alright, well, let's have that be what starts the podcast, what we just said. Let's start the podcast. Let's start the podcast. Let's start the podcast. Well, I'm Robert Evans.
Starting point is 00:02:01 Yep, I'm Sophie Lichterman. I never introduced myself. I'm Jamie. Who are you? I'm Jamie Loftus. Anything more we need to say? Are we done with the episode? Anderson's here? No, I think that, yeah. Yeah. Anderson is here. Sure. Well, you know what's happening in the world? No. Facebook is happening to the world and it's bad.
Starting point is 00:02:22 It's unfortunate. It's not great, Jamie. It's not great, Sophie. Not a fan of the Facebook. We left off having gone through some of the Facebook papers, particularly employees attacking their bosses after Jan 6th when it became clear that the company they were working for was completely morally indefensible. I wouldn't call it attacking. They already knew. I wouldn't call it attacking either. I would call it, you know.
Starting point is 00:02:48 I mean, there's a guy, like the quote there, there was the guy who was like, history won't judge us kindly. The guy who was like, yeah, when we didn't ban Trump in 2015, that's what caused the capital riot. I mean, facts are facts. Is that really attacking if you're just like... Well, I think stating facts can be an attack. Whoa. Okay. Put it on a T-shirt. I mean, for people like this, you know, yeah, I think stating facts can be an attack. And we ended part one by sharing some of the blistering criticisms of Facebook employees against management and the service itself. So as we start part two, it's only proper that we cover how Facebook responded to all of this internal criticism.
Starting point is 00:03:31 As I stated last episode, Facebook is in the midst of a years-long drought of capable engineers and other technical employees. They are having a lot of trouble hiring all of the people that they need for all of the things they're trying to do. So one of the things is for a lot of these employees, when they say things that are deeply critical, they can't just dismiss the concerns of their employees outright because if they were to do that, these people would get angry and they need them. Facebook's not in the strongest position. When it comes to the people who are good engineers, they have to walk a little bit of a tightrope. However, if they were to actually do anything about the actual meat of the concerns, it would reduce profitability and in some cases destroy Facebook as it currently exists. So they're not going to do anything, which has meant that they've had to get kind of creative with how they respond. So Mark and his fellow bosses pivoted and argued that the damning critique...
Starting point is 00:04:22 We're calling him Mark now. Yeah. Oh, Zuckie Zuck. So when this all comes out and people are like, boy, it sure seems like all of your employees know that they're working for the fucking Death Star. Zuckerberg and his mouthpiece has made a statement that all of these damning critiques from people inside the company were actually evidence of the very open culture inside Facebook, which encouraged workers to share their opinions with management. That's exactly what a company spokesperson told the Atlantic when they asked about comments like, history will not judge us kindly. The fact that they're saying we'll be damned by historians means that we really have a healthy office culture. Oh, hashtag Death Star Proud.
Starting point is 00:04:57 Yeah. Death Star Proud, everybody. Yeah, yeah. It's like the fact that... Wow, remove the stigma of working for the devil, right? I mean, come on. The devil I would be proud to work for because he's done some cool stuff. Like, have you ever been to Vegas?
Starting point is 00:05:11 Nice town. Oh, I've been to Vegas. I saw the Backstreet Boys in Vegas right before two of them were revealed to be in QAnon. So really caught the end of that locomotive. Oh, wow. I did not realize that a sizable percentage of the Backstreet Boys had gotten into QAnon. That makes total sense. 40% of the Backstreet Boys, they're from Florida.
Starting point is 00:05:32 They're ultimately five men from Florida. So what can you do? As the author of that article, the Atlantic article noted, this stance allows Facebook to claim transparency while ignoring the substance of the complaints and the implication of the complaints that many of Facebook's employees believe their company operates without a moral compass. All over America, people used Facebook to organize convoys to D.C. and to fill the buses they rented for their trips. And this was indeed done in groups like the Lebanon main truth seekers where Kyle Fitzsimmons posted the following, quote,
Starting point is 00:06:01 This election was stolen and we're being slow walked towards Chinese ownership by an establishment that is treasonous and all too willing to gaslight the public into believing the theft was somehow the will of the people. Would there be an interest locally in organizing a caravan to Washington D.C. for the electoral college vote count on January 6th, 2021? Yeah, and Kyle recently pled not guilty to eight federal charges, including assault on a police officer. Mark Zuckerberg would argue that like Facebook didn't play a significant role in organizing January 6th and couldn't have played a significant role in radicalizing this guy and many other people. But the reality is that for the people like the people who managed part of what led Kyle Fitzsimmons to go assault people on January 6th, was the fact that he had been radicalized by a social network that for years made the conscious choice to amplify angry content and encourage anger because it kept people on the site more. Right? Like all of the anger that boiled up in January 6th that came from a number of places.
Starting point is 00:06:57 But one of those places was social media because social media profited and specifically Facebook knowingly profited from making people angry. That was the business. And of course it blew up in the real world. I have a question just out of your own experience and observation, which is how do you like if you're doing a side by side case study of how Facebook responded to events like this versus how, like YouTube slash Google responded to radicalization? Are there like significant differences? Did anyone do better or differently? Yes.
Starting point is 00:07:33 Twitter has done better than probably most of them. YouTube, I mean, and again, I'm not saying that Twitter's done well or that YouTube has done well, but they've both done particularly with coronavirus disinformation a bit better than Facebook. And they were better in general on not really YouTube as much, but like Twitter was definitely has taken, has been the most responsible of the social networks around this stuff. It did seem like for a while there, the various networks were kind of like duking it out to see who could do the absolute worst and damage their lives. And it seems like Facebook won that. Yes. I would say Facebook. And again, Twitter chose to do a lot of the same talks of things Facebook did.
Starting point is 00:08:18 So did YouTube and they did it all for profit. A number of the things we've criticized Facebook for, you can critique YouTube and Twitter for, I would argue Twitter certainly has done more and more effectively than Facebook. Not enough that they're not being irresponsible because I would argue that Twitter has actually been extremely irresponsible and knowingly so. But I think Facebook, in my analysis, Facebook has been the worst, although I haven't gotten studied as much about like TikTok yet. So we'll see. But my analysis. You've got to get on TikTok. I hope you pivot out of podcasting and into TikTok dances.
Starting point is 00:08:54 Yeah. I mean, it's not the dances that concern me on TikTok. It's the minute long conspiracy theory videos that have convinced a number of people that the Kardashians are Armenian witches and had something to do with the collapse of the Astro Worlds or the deaths in the Astro World thing. My concern there is the dances that go over those conspiracy videos and really marry the worst of both worlds. Yeah, I'm sure that's a thing. Because I have seen dancing on TikTok. I have seen conspiracy videos that involve dancing. Incredible.
Starting point is 00:09:25 And skincare routine. Have you ever seen a conspiracy video where someone's also doing their skincare routine? Because that is a thriving subcomment. Yeah. I'm sure that's, yeah. So, well, I... I have. I was like, that is just a thing that exists on many platforms.
Starting point is 00:09:43 I will say all social media companies are willfully bad at stopping radicalization. Because making people angry and frightened is good for all of their bottom lines. So, they all knowingly participate in this. I think Facebook has been the least responsible about it. But that doesn't... That shouldn't be taken as praise of anybody. Like saying Twitter has done the best is saying like, well, we were all drunk driving, but John could actually walk a most of a straight line before vomiting. So, he was the least irresponsible of us who drunk drove.
Starting point is 00:10:20 Just to put it in terms that I understand, it sounds like Twitter is the backstreet boy that's like, look, I don't believe in QAnon, but I see their points. That's kind of the vibe I'm getting. Fair enough. So, when deciding which posts should show up more often in the feeds of other users, Facebook's algorithm weighs a number of factors. The end goal is always the same, to get the most people to spend the most time interacting with the site. For years, this was done by calculating the different reactions a post got and weighing it based on what responses people had to it. Again, for years, the reaction that carried the most weight was anger, the little snarling smiley face icon you could click under a post. It was at one point being weighted five times more than just like a like.
Starting point is 00:11:04 Really? Like, again, when I'm saying this was all intentional, they were like, people who respond angrily to posts, that keeps them on the site more. They spend the most time engaging with things that make them angry. So, when it comes to determining by which method, how we choose to have the algorithm present people with posts, the posts that are making people angriest is the posts our algorithm will send to the most people. That's a conscious choice. That's a conscious, yeah. It's so funny how, I mean, not funny.
Starting point is 00:11:33 It's tragic and upsetting, but just how specific the Facebook audience is that it's like, you would have to be the kind of person who would be like, I'd better react angry to that, maybe as specific as possible in my feedback to this post, which is Farm Bill moms and people who have been killed. Yeah, it's boomers. It's boomers. Yeah. And yeah, they just kind of knowingly set off a bomb in a lot of people's fucking brains. They're addicted to telling on themselves for no reason. Why?
Starting point is 00:12:00 Why? Anyways. Facebook has something called the integrity department. And these are the people with the unenviable task of trying to fight misinformation and radicalization on the platform. They noted in July. That is so embarrassing. What a horrible job. Imagine going on a first date.
Starting point is 00:12:13 Yeah. Just going on a first date and be like, I work for the Facebook integrity department. Like, yeah, good fucking luck. Yeah, I work for the Air Force. No one's sex again in your life. My job is to go door to door and apologize to people after we bomb them. We have gift baskets for the survivors, you know? Like, that's the gig, really.
Starting point is 00:12:31 Yeah, I send edible arrangements to people who have been drone-striped. Like, oh, Jesus, awful. And a lot of... There's... It's one of my favorite follow-ups on Twitter is Brooke Binkowski who used to work for Facebook. It was like one of the people early on who was trying to warn them about disinformation and radicalization on the platform years ago and left because, like, it was clear they didn't actually give a shit. And a lot of the integrity department people are actually, like, really good people who are a little bit optimistic and kind of young and come in and like, okay, I'll make... It's my job to make this huge and important thing a lot safer.
Starting point is 00:13:07 And these people get chewed up and spit out very, very quickly. And members of the integrity team were kind of analyzing the impact of weighing angry content so much. And some of them noted in July 2020 that the extra weight given to the anger reaction was a huge problem. They recommended the company stop weighing it extra in order to stop the spread of harmful content. Their own tests showed that dialing the weight of anger back to zero so it was no more influential than any other reaction would stop rage-inducing content from being shared and spread nearly as widely. This led to a 5% reduction in hate speech, misinformation, bullying, and posts with violent threats. And when you consider how many billions of Facebook posts there are, that's a lot less nasty shit, some of which is going to translate into real-world violence. And again, this was kind of a limited study, so who knows how it would have actually affected things in the long run.
Starting point is 00:13:59 But less money, question mark? Yeah, this actually was kind of a win for them. Facebook did make this change. They pushed it out in September of 2020. And the employees responsible deserve real credit. Again, there's people within Facebook who did things that really actually were good. Changing this was probably made the world a bit healthier. That said, the fact that it had been waited this way for years, you don't undo that just by dialing it back now. For one thing, anger has become such an aspect of the culture of Facebook that even without weighing the anger emoji, most of the content that goes viral is still stuff that pisses people off,
Starting point is 00:14:40 because that's just become what Facebook is because that's what they selected for for years. If they'd done this years ago, if they'd never weighted anger more, it might be a very different platform with a very different impact on the brains of, for example, our aunts and uncles. I think that that's really interesting too, because that timeline lines up pretty exactly with where it feels like a lot of younger people were leaving that platform, and the platforms became associated with older people because I feel like I don't think I was using Facebook consistently after 2017. I want to say it was maybe my last Facebook year. Yeah, I stopped visiting it super regularly a while back. Yeah, maybe around 2017. Right.
Starting point is 00:15:27 So in April of 2020, Facebook employees came up with another recommendation, and this one wouldn't be as successful as changing the reaction of the algorithm to the angry reaction. Spurred by the lockdown and the sudden surge of QAnon, Boogaloo, and anti-lockdown groups urging real-world violence, it was suggested by internal employees that the newsfeed algorithm deprioritized the posting of content based on the behavior of people's Facebook friends. So the basic idea is this. What Facebook was doing was you would, normally, like the way you'd think it would work, right, is that like your friends post something and you see that in your newsfeed, right? Like the posts of the people that you've chosen to follow and say are your friends, right? That's how you would want it to work. That's how it worked at one point. They made a change a few years back where they started sending you things, not because someone you followed had said something, but because they'd liked a thing or they'd commented, not even commented, just like liked a thing. Like if they'd reacted to a thing, you would get the thing. Engaged in any way.
Starting point is 00:16:27 Yeah, you would get that sent to your newsfeed. And members of the integrity team start to recognize, like, this has some problems in it. For one thing, it results in a lot of people getting exposed to dangerous bullshit. So they start looking into, like, the impact of this, and how just sharing the kind of things your friends are reacting to influences what you see and what that does to you on Facebook. The integrity team experimented with how changing this might work, and their early experiments found that fixing this would reduce the spread of violence inside and content. For one thing, what they found is that, like, normally if you hadn't seen someone like a post about something that was maybe, like, violent or aggressive or conspiratorial, like a Flat Earth post or a post urging the execution of an elected leader, if you hadn't seen anyone that you knew react to that post, even if you saw it, you wouldn't comment on it or share it. But they found that, like, if you just saw that a friend had liked it, you were more likely to share it, which increases exponentially the spread of this kind of violent content.
Starting point is 00:17:29 And it's this idea, like, the whole people weren't stopped being afraid to be racist at a certain point, as much as they had been earlier, and it led to this surge in real-world violence. It is kind of the same thing. People felt, by seeing their friends react to this, they felt permission to react to it, too, in a way maybe they would have, like, well, I don't want to, like, maybe I'm interested in Flat Earth shit, but I'm just going to ignore this, because, like, I don't want to seem like a kook. That is so fucking upsetting and fascinating in the way that it affects your mind, is, yeah, there was a time where you would, if you were, you know, racist, misogynist, homophobic, whatever you were, but you just didn't talk about it, but then all of a sudden there's this confirmation that, like, hey, this person you know and see all the time feels the same fucking way you do, so why be quiet about it? Let's discuss. Like, it's just, that's so dark.
Starting point is 00:18:20 It's really dark, and so the integrity team sees this and they're like, we should change this. We should only show, we shouldn't be showing people just, like, the reactions their friends have had to content, because it seems to be bad for everybody, and they do find in some of their, you know, because when they experiment, they're like, we'll take this country or this city and we'll roll this change out in this limited geographical location to, like, try and see how it might affect its scale, and they do this and they see that, like, oh, changing this significantly reduces the spread of specifically violence and sighting content, so they're like, hey, we should roll this out service-wide. Zuckerberg himself steps in, according to Francis Hogan, the whistleblower, and, quote, rejected this intervention that could have reduced the risk of violence in the 2020 election.
Starting point is 00:19:04 From the Atlantic, quote, an internal message characterizing Zuckerberg's reasoning says he wanted to avoid new features that would get in the way of meaningful social interactions, but according to Facebook's definition, its employees say engagement is considered meaningful even if it entails bullying, hate speech, and re-shares of harmful content. The episode, like Facebook's response to the incitement that proliferated between the election in January 6th, reveals a fundamental problem with the platform. Facebook's mega-scale allows the company to influence the speech and thought patterns of billions of people. What the world is seeing now through the window provided by reams of internal documents is that Facebook catalogs and studies the harm it inflicts on people, and then it keeps harming people anyway. That's always so interesting to hear, and by interesting, I mean psychologically harmful.
Starting point is 00:19:53 Because it's like, yes, that is a fundamental flaw of the platform, but that's also very entrenched into what the DNA of the platform always was, which was based on harshly judging other people. That's why Mark Zuckerberg created Facebook, to harshly judge women in his community. I know that it is on a bajillion scale at this point, but I'm always kind of stunned at how people are like, oh, it's so weird that this went the way that it did. It's like, well, to an extent, it was always like that, and maybe it was like cosplaying as not being like that, and seeing people, there were eras in Facebook where user experience wouldn't be like that, but this goes back almost 20 years at this point of this being in the DNA of this shit show. Yeah, and it's really bleak. It's just really bleak, and it also goes to show like the...
Starting point is 00:20:55 One of the things Zuckerberg will say repeatedly when he talks about, when he does admit, he's like, yes, there are problems, and there have been negatives associated with the site, and we're aware that they're humbling, but you also have to include all the good that we're doing, all of the meaning, and the way he always phrases this is all of the meaningful social interactions that wouldn't have happened otherwise, and then you realize every time he says that... Name five meaningful social interactions that have taken place on Facebook. When he says that, he's including, as these internal documents say, he includes bullying and people making death threats and talking about their desire to murder people.
Starting point is 00:21:27 That's a meaningful interaction. People getting angry and trying to incite violence together is a meaningful social interaction, which I guess, yes... I guess the hate is not meaningless, that has meaning, but that's not the general... Yeah, planned meetings were meaningful social interactions. You gotta give the KKK that. The Nuremberg rally was a meaningful interaction. The last meaningful interaction I had on Twitter led to a rebound I was dating coming to my grandma's funeral blackout drunk, so I, you know, it's all just... Oh, man. God, it's been too long since I've shown up at a funeral, just too drunk to stand.
Starting point is 00:22:08 It is still one of my favorite memories with my family to this day. They're like, who is this guy? And I'm like, I don't really know. He's drunk as shit, though. He came on the megabus. Hell yeah, he did. He had so fucked up on the megabus. Getting drunk from a camelback on a megabus. Yeah.
Starting point is 00:22:27 That would be when I used to do a lot of bus trips, like when I was traveling and stuff, that would be one of the tactics, as you feel like a thermos or a camelback with like 40% cranberry juice, 60% liquor, and just get ruined. No, it's awesome. I'm not above getting fucked up on a megabus, but on your way to my grandma's funeral, that was a move. Me and my friends got wasted in San Francisco one day, just like going shopping in broad daylight with a camelback where we would get a bottle of orange flavored Trader Joe's Patron tequila,
Starting point is 00:23:03 and we would get a half dozen lime popsicles, and you just throw the popsicles in with the Patron in the camelback, and throughout the day it melts, and you just have constant cold margarita. It's actually fucking amazing. That fucking rocks. Wait, I wish I knew that when I was 22. Oh yeah, I recommend it heavily. You will get trashed and people don't notice.
Starting point is 00:23:25 Dude, walking around with a fucking camelback in San Francisco, nobody gives a shit. Oh my god, you're basically camouflaged. You know who else is camouflaged? Who? The products and services that support this podcast, camouflaged to be more likeable to you by being wrapped in a package of the three of us. That's how ads work, Jamie. But you were saying that you were taking ads from the US Army Recruitment Center again?
Starting point is 00:23:50 I mean, it's entirely possible. But at the moment, we're just camouflaging, I don't know, whoever comes on next, whoever comes on next, you'll feel more positively about because of our presence here. Wow. That's how ads work. What would you do if a secret cabal of the most powerful folks in the United States told you, hey, let's start a coup? Back in the 1930s, a Marine named Smedley Butler was all that stood between the US and fascism. I'm Ben Bullock.
Starting point is 00:24:22 And I'm Alex French. In our newest show, we take a darkly comedic. And occasionally ridiculous. Deep dive into a story that has been buried for nearly a century. We've tracked down exclusive historical records. We've interviewed the world's foremost experts. We're also bringing you cinematic historical recreations of moments left out of your history books. I'm Smedley Butler and I got a lot to say.
Starting point is 00:24:43 For one, my personal history is raw, inspiring and mind blowing. And for another, do we get the mattresses after we do the ads or do we just have to do the ads? From iHeart Podcast and School of Humans, this is Let's Start a Coup. Listen to Let's Start a Coup on the iHeart Radio app, Apple Podcast, or wherever you find your favorite shows. What if I told you that much of the forensic science you see on shows like CSI isn't based on actual science? The problem with forensic science in the criminal legal system today is that it's an awful lot of forensic and not an awful lot of science. And the wrongly convicted pay a horrific price. Two death sentences and a life without parole.
Starting point is 00:25:32 My youngest, I was incarcerated two days after her first birthday. I'm Molly Herman. Join me as we put forensic science on trial to discover what happens when a match isn't a match and when there's no science in CSI. How many people have to be wrongly convicted before they realize that this stuff's all bogus? It's all made up. Listen to CSI on Trial on the iHeart Radio app, Apple Podcast, or wherever you get your podcasts. I'm Lance Bass, and you may know me from a little band called NSYNC. What you may not know is that when I was 23, I traveled to Moscow to train to become the youngest person to go to space. And when I was there, as you can imagine, I heard some pretty wild stories. But there was this one that really stuck with me. About a Soviet astronaut who found himself stuck in space with no country to bring him down. It's 1991 and that man, Sergei Krikalev, is floating in orbit when he gets a message that down on Earth, his beloved country, the Soviet Union, is falling apart. And now he's left defending the Union's last outpost.
Starting point is 00:26:52 This is the crazy story of the 313 days he spent in space. 313 days that changed the world. Listen to The Last Soviet on the iHeart Radio app, Apple Podcast, or wherever you get your podcasts. Oh, we're back. My goodness. What a good time we're all having today. How are you doing, Jamie? You making it okay? You made it sound sarcastic. I am having a good time. Well, I'm glad. I'm happy that you're having a good time. That's my only goal for this show, and for you to have a good time. See, now you're doubling down on it and I'm getting insecure. I'm doubling down and I'm also talking more and more like an NPR talking head as I get quieter by the second. Now I'm going to start having a panic attack. I've never heard you talk this way.
Starting point is 00:27:42 I know. This is how I talk to my cats when I'm angry at them. Honestly, I feel like we do have that dynamic. I feel like I'm a cat that you get angry at sometimes. Yeah, because you jump on my desk and knock over my Xevia. It's infuriating. Well, it's just for attention. I know, I know. It's just for attention. But I've got to work to keep you an expensive cat food. I only feed my cats the nice wet food. I would rather have your attention than really nice food, okay?
Starting point is 00:28:05 No, that's not what my cats say. So there's just a shitload to say about how Facebook negatively impacts the increasingly violent political discourse in the United States and how they help to make January 6th happen. But I think the way I'd like to illustrate the harm of Facebook next is a bit less political. It also occurs in a different Facebook product. I'm talking about Facebook, the company, generally, we're not afraid of Facebook. But now we're going to talk about Instagram. In part one, I mentioned that young people felt that removing likes from Instagram temporarily corresponded with a decrease in social anxiety. The impact of Instagram specifically on the mental health of kids and teens can be incredibly significant.
Starting point is 00:28:43 One of the other Facebook internal studies that was released as part of the Facebook papers was conducted by researchers on Instagram. The study, which again almost certainly would never have seen the light of day if a whistleblower hadn't released it, found that 32% of teen girls reported Instagram made them feel worse about their body. 22 million teenagers in the United States log on to Instagram on, like, a daily basis. So that's millions of teen girls feeling worse about their body because of Instagram. I've never been less surprised at learning a thing. Well, good news. It gets worse. Like, no fucking kidding. Oh, good.
Starting point is 00:29:20 These researchers released their findings internally in March of 2020, noting that comparisons on Instagram can change how young women view and describe themselves, again, not surprising. So company researchers have been investigating the way that Instagram works, though, for quite a while years. About three years that they've like been doing this seriously. And their previous findings all back up the same central issues. Photo sharing in particular is harmful to teen girls. One 2019 report concluded, we make body image issues worse for one in three teen girls.
Starting point is 00:29:50 Its findings included this damning line. Teens blame Instagram for increases in anxiety and depression. This reaction was unprompted and consistent across all groups. So, like, they almost always mention that this app specifically makes them feel worse about their body, and we don't have to prompt them at all. Like, this just comes up when they talk about Instagram. I mean, that truly, it's so... Sophie, I don't know how you feel.
Starting point is 00:30:14 I mean, I truly think that, like, because I've been on Instagram since, what, like, 2014? Some shit like that. No, earlier, I think. I think I've been on earlier. It was around when we were in high school. I truly think my life and my relationship to my body would be very different if I had not been on that app for the better part of a decade. Yeah, I mean, especially when they introduced filters. Yeah, we're about to talk about that.
Starting point is 00:30:39 So, here's the kicker. And by kicker, I mean the bleakest part. In teens who reported suicidal thoughts, 13% of teens in the UK and 6% of teens in the United States claimed their desire to kill themselves started on Instagram. That's fucking disgusting and terrible. That's pretty bleak. More than 40% of...
Starting point is 00:31:00 Again, I just, like, I wish her more surprise. Yeah, I know. But it's good to have this data. The data shows that more than 40% of Instagram... So, more than 40% of Instagram users are less than 22 years old, which means you've got 22 million teens logging onto the service in the US every day. 6% of those people becoming suicidal as the result of Instagram is 1.32 million children who started wanting to kill themselves while using Instagram.
Starting point is 00:31:26 Hey, everybody. Robert Evans here. And I actually screwed up the math that I just cited, which is often the case when I do math. So, anytime I do math of my own in an episode, you're right to question me. I was calculating 6% of 22 million, basically. But, as the study noted, it's 6% of kids who are suicidal say that their suicidal feelings started on Instagram.
Starting point is 00:31:49 So, I wanted to recalculate that. About 76, 72 to 76, kind of depending on the source, percent of American teens use Instagram. There are about 42 million teenagers in the United States. So, I calculated from that and about 19% of high school students of teenagers seriously considered attempting suicide. So, if we're just counting serious attempts or people who seriously considered attempting suicide, that's 5,745,600 teens who seriously considered suicide. 6% of those, if 6% of those kids had their suicidal feelings start on Instagram,
Starting point is 00:32:32 that's 344,736 children in the United States who suicidal feelings started on Instagram. And I furthermore found that about 9% of kids who seriously attempt suicide or seriously consider suicide attempt it. So, of that 344,736 American teens who suicidal feelings started on Instagram, about 31,026 kids attempt suicide. So, about 31,000 kids in the United States on an annual basis attempt suicide because of suicidal feelings that started on Instagram. So, that is the more accurate look at the data and I apologize as always for the error.
Starting point is 00:33:21 But what's interesting is that these studies do document like Facebook is as physically harmful at scale as like a wide variety of narcotics. Like most narcotics probably are less harmful at scale physically than Instagram. I think weed certainly is. Oh my God, if every teenager was smoking weed instead of doom scrolling on Instagram, the world would just be so fucking different. If there were chain smoking cigars instead of being on Instagram, we might be better off. It's so weird because I think about like how, I don't know, whatever.
Starting point is 00:33:58 Like, I'm in my late 20s, so I feel like I have like a little bit of memory of like what life was like before you were constantly being encouraged to compare yourself to every single person you've ever met in your life regardless of whether you know who they are, how they are, whatever. And I just make call me nostalgic, but I liked how I felt better. Yeah. Like, it's so absurd how much I know about people I don't give a shit about and how bad it makes me feel to know about the curated lives of people that I don't give a shit about
Starting point is 00:34:35 and how I let that actively affect my daily life. And it's just, yeah, it's just fucking miserable. It is. It's horrible. It's horrible. That said, I like flirting on the application. So, you know, it's complicated. Here's why, despite the documented harm that Instagram does, nothing's ever going to change.
Starting point is 00:34:53 As I stated, 22 million US teens use the gram daily. Only 5 million log on to Facebook. So, Instagram is almost five times as popular among teenagers as Facebook where kids are leaving in droves. So, Facebook, Mark Zuckerberg's invention is now definitively just the terrain of the olds and Facebook knows that kids are never going to come back because that's not how being a kid works. You don't get them back.
Starting point is 00:35:19 They're going to continue to do new shit. Eventually, they'll leave Instagram for something else, you know? That's just the way it fucking goes. Unless the 30-year nostalgic cycle is like Facebook is actually back now, it's actually cool. I just don't think it gave anybody a good experience enough to have that. It's not the fucking Teenage Mutant Ninja Turtles. No one's getting dopamine hits.
Starting point is 00:35:39 That's a good point. It's not like flaming hot Cheetahs. Yeah, nobody's thinking fondly back to scrolling Facebook when they were seven. They're thinking back to, I don't know, SpongeBob SquarePants. Oh, and as well they should. But at the moment, Instagram is very popular with teens. And Facebook knows that if they're going to continue to grow and maintain their cultural dominance, they have to keep bringing in the teens.
Starting point is 00:36:00 They have to keep Instagram as profitable and as addictive as it currently is. And that's why they bought Instagram in the first place. They only paid like a billion dollars for it. It's an incredible investment. And they spend 50% more time than a day. Yeah, that's cheap as hell for something as influential and huge as Instagram. That is true. Money is not real.
Starting point is 00:36:22 I wonder, do you know what it's worth now? I would guess significantly more than a billion dollars. But I don't entirely know how to value it. But Facebook's like a trillion dollar company now. That's right. Yeah, they're very profitable. And Facebook fucking sucks. Well, but Facebook, that includes Instagram, you know?
Starting point is 00:36:37 Oh, okay. Yeah, and among, you know, teens are one of the most valuable demographics to have for advertisers. And Instagram is where the fucking teens go. Do you want the number? Yes. Its estimated value is 102 billion. So yeah, I would say.
Starting point is 00:36:50 That's a good investment. That's a good investment. That's a fucking good investment if money was real. Yeah, you got it. Yeah. So the fact that so much is at stake with Instagram, the fact that it's such a central part of the company having any kind of future, is part of why Mark and company have been so compelled to lie about it.
Starting point is 00:37:06 None of this stuff that we've been talking about was released when Facebook researchers got it. Of course not. They wouldn't want anyone to know this shit. In March of 2021, Mark took to Congress where he was criticized for his plans to create a new Instagram service for children under 13. He was asked if he'd studied how Instagram affects children. And he said, I believe the answer is yes.
Starting point is 00:37:27 So not yes. I think we've studied that. He told them then, the research we've seen is that using social apps to connect with other people can have positive mental health benefits. And I'm sure there's something that he's gotten paid researchers to come up with that he can make that case off of. I'm sure in certain situations it may even be true. There are ways you can use social media that are good dear men.
Starting point is 00:37:50 I've legitimately smiled or had my heart warmed by things that happen on social media. It doesn't not happen. And I do think that there is a case for like, I mean, and it's, you can't credit Mark Zuckerberg for that, but just, I mean, going back to fucking like live journal days of just like friendships that have deepened as a result of social media. That's definitely a thing, but the costs outweigh the benefits there by quite a bit. Yeah. It's great.
Starting point is 00:38:18 So, so Mark goes on to say, you know, I think we've got research that shows it can have positive mental health effects. You know, I think we've studied whether or not how it affects children, but he doesn't talk about. He leaves out all the stuff that I, all the statistics like about all the kids who, who suicidal ideation starts on Instagram. They had that data when he went before Congress. He just didn't mention it.
Starting point is 00:38:38 They hadn't told anyone that shit. Like he didn't say a goddamn word about it. Yeah. He was like, yeah, I think we've looked into it. And you know, there's some ways in which it can be healthy, not. And also 1.3 million American kids became suicidal because of our app. Like he did not throw that info out. Like, did he throw that, I mean, truly I'm like up in the air of like, did he not say
Starting point is 00:38:59 that because he didn't want people to know? Or did he just say that because he heard it and he didn't care and he forgot like, you just don't know what that guy, that is so fucking evil. Wow. Yeah. It's pretty great. And we'll talk more about that later. In May of 2021, Instagram boss Adam Wasari told reporters that he thought any impact on
Starting point is 00:39:18 teen well being by Instagram was likely quote, quite small based on the internal research he'd seen. They haven't released this research. He's saying, oh, we have research and it says that any kind of impact on well being is pretty small. And again, the actual research by this point showed 13% of kids in the UK and 6% of kids in the United States were moved to thoughts of suicide by Instagram, which I would not call small.
Starting point is 00:39:40 I wouldn't necessarily say it's huge, but that is not a small impact. No, that is like thousands and thousands and thousands and possibly millions of children. Yeah. That's significant. The Wall Street Journal caught up with Wasari after the Facebook papers leaked. So they were able to like drill him on this a bit and he said a bit more quote, in no way do I mean to diminish these issues. Some of the issues mentioned in the story aren't necessarily widespread, but their impact
Starting point is 00:40:04 on people may be huge, which is like, again, a perfect nonstatement. That's right. They're like, but what about the thing we couldn't possibly gauge at all versus the thing that we did and we're actively distancing ourselves from? I mean, those statistics, that's like at least one kid in every classroom. That is gigantic. When you read the responses of guys like Wasari and compare them, they're responsive guy like people like Mark Zuckerberg and official corporate spokespeople.
Starting point is 00:40:30 It's very clear that they're working from the same playbook that they're very disciplined in their responses because Wasari does try to tell the journal that he thinks Facebook was late to realizing there were drawbacks in connecting people in such large numbers. But then he says, I've been pushing very hard for us to embrace their responsibilities more broadly, which again, says nothing. He then pivots from that to stating that he's actually really proud of the research they've done on the mental health effects on teens, which again, they didn't share with anybody, and I would argue lied about by omission in front of Congress.
Starting point is 00:41:01 He's proud of this because he says it shows Facebook employees are asking tough questions about the platform. Quote, for me, this isn't dirty laundry. I'm actually very proud of this research, which is the same thing Zuckerberg said about his own employees damning the service after Jane Six. Right, I was going to say that's the same exact thing as the actually bad work. Talking about how the working for the Death Star is bad is evidence of, oh, the Death Star actually has a really open work culture, like no, I don't know.
Starting point is 00:41:31 I feel like there are a few. There are not many CEOs that are good at flipping a narrative, but Mark Zuckerberg is particularly bad at it. Yeah, and part of why they can be bad at it is it doesn't really matter, or at least it hasn't fucking so far. But the patterns. I mean, not enough to get a better figurehead. The pattern's pretty clear here.
Starting point is 00:41:56 When a scandal comes out, deny it until the information that can't be denied leaks out and then claim that whatever is happening at the site, whatever information you had about how harmful it is, is a positive because it means that you were trying to do stuff about it even if you actually rejected taking action based on the data you had and refused to share it with anybody else. Maseri and Zuckerberg were also careful to reiterate that any harms from Instagram had to be weighed against its benefits, which I haven't found a ton of documentation on. In fact, as the Wall Street Journal writes, in five presentations over 18 months to this
Starting point is 00:42:28 spring, the researchers, Facebook researchers conducted what they called a teen mental health deep dive and follow-up studies. They came to the conclusion that some of the problems were specific to Instagram and not social media more broadly. This is especially true concerning so-called social comparison, which is when people assess their own value in relation to the attractiveness, wealth, and success of others. Social comparison is worse on Instagram, states Facebook's deep dive into teen girl body image issues in 2020, noting that TikTok, a short video app, is grounded in performance
Starting point is 00:42:56 while users on Snapchat, a rival photo and video sharing app, are sheltered by jokey features that keep the focus on the face. In contrast, Instagram focuses more heavily on the body and lifestyle. In March 2020, internal research states, it warns that the Explore page, which serves users photos and videos curated by an algorithm, can send users deep into content that can be harmful. Aspects of Instagram exacerbate each other to create a perfect storm, the research states. Yeah, I mean, again, not a shocking revelation over here.
Starting point is 00:43:31 And I do think that let's TikTok and Snapchat get off a little easy there, like their assert there is absolutely toxic body image culture on there. And I feel like Finspo will thrive on any platform it fucking gloms itself onto. But Instagram is particularly bad because it's where so many lifestyle people have launched. And there's so many headless women on Instagram, it is shocking. There's so many, like, not like, not like you machete my head off, but like, you're not encouraged to show your head by the algorithm, which sounds weird, but it is true. The less like, it is just very focused on how you physically look.
Starting point is 00:44:16 And then there's also this tendency to like, tear people apart if they have edited their body to look a certain way, when it's like, well, that the algorithm rewards editing your body to look a certain way and to do all this. And it's, you do bring up a good point where it's like, it's frustrating that it's important to critique Facebook in relation to its competitors, like TikTok and Snapchat. That can lead to the uncomfortable situation of like seeming to praise them when they haven't done a good job. They just haven't been as irresponsible.
Starting point is 00:44:47 It's kind of like attacking like Chevron. If you look at all of the overall harms, including like their impact and like covering up climate change, maybe the worst of the big oil and gas companies, I don't know, it's admirable. But it's like, if you're criticizing Chevron specifically, you're not saying BP is great. You're just being like, well, these are the guys specifically that did this bad thing and they were the leaders in this specific terrible thing. Other bad things are going on, but we can't, like the episode can't be about how bad everyone
Starting point is 00:45:14 is. We're talking about Facebook right now. We have these documents from inside Facebook. I'm sure versions of this are happening everywhere else. Listeners, in your everyday life, just don't use Facebook as a yardstick for morality. Then you'll just end up letting a lot of people off for a lot of fucked up stuff. I would say in your regular life, don't use Facebook is all the sentence we needed there. So you were talking earlier about like, because Mark went up in front of Congress and was
Starting point is 00:45:44 like, yeah, I think we've got research on this and I've definitely seen research that says it's good for kids. We know everything I just stated that quote, I just read everything like that's in those internal studies. We know that Mark saw this, we know that it was viewed by top Facebook leaders because it was mentioned in a 2020 presentation that was given to Mark Zuckerberg himself. We know that when in August of 2021, Senators Richard Blumenthal and Marsha Blackburn sent a letter to Mark Zuckerberg asking him to release his internal research on how his platforms
Starting point is 00:46:12 impact child mental health, we know that he sent back a six page letter that included none of the studies we've just mentioned. Instead, the study said that it was hard to conduct research on Instagram and that there was no consensus about how much screen time is too much. Meanwhile, their own data showed that 40% of Instagram users who reported feeling unattractive said that the feeling began while they were on Instagram. Facebook's own internal reports showed that their users reported wanting to spend less time on Instagram, but couldn't make themselves.
Starting point is 00:46:39 And here's a quote that makes it sound like heroin. Teens told us they don't like the amount of time they spend on the app, but feel like they have to be present. They often feel addicted and know that what they're seeing is bad for their mental health, but feel unable to stop themselves. That's Facebook writing about Instagram. That's their own people saying this. This is not some activist getting in here.
Starting point is 00:47:01 I guess good on them regardless of the level of self-awareness going on there. And what I was thinking about earlier when it comes to anytime Zuckerberg is in front of Congress or in front of political officials, I feel like for a lot of people, the takeaway and the thing that gets trending is how little political officials and members of Congress understand about how the internet works. And that's like the funny story is like, oh, Mark Zuckerberg talked about an algorithm. And this comes up all the time. It comes up on V, it came up on succession of just like how not internet literate the
Starting point is 00:47:41 majority of people who decide how the internet works are. And it almost becomes like a hee-hee-ha-ha old guy doesn't know how algorithm works. But it's like, well, the consequence of that is that it ends up making Mark Zuckerberg look way cooler than he is. And it also doesn't address the problem at all of like, no, Mark Zuckerberg is omitting something gigantic here. And the majority of our lawmakers in Congress don't have the fucking cultural vocabulary to even understand that.
Starting point is 00:48:12 And that is like, and I guess it makes for a couple of good memes, but it's just like, no, this is bad. Jamie, can you commit to Cancel Finsta? Do you remember that horror? That was sad. Oh, God. That made me think. Right, Cancel Finsta.
Starting point is 00:48:29 I mean, I think that was the most recent one where it's like, okay, yeah, that is objectively funny. But the consequence of that is, I mean, that's ultimately a win for Instagram and that's a win for Facebook because it makes them look like they're operating on a level that the fucking government doesn't understand. And meanwhile, you know, one kid in every classroom is suicidal as a result of the inability of law, like lawmaking officials to understand the effect that this has. Yes.
Starting point is 00:49:00 It's just, it makes me real mad, Robert. And one of the main things about this is that while these lawmakers don't understand and sound like idiots talking to Mark Zuckerberg, his own employees, these researchers who are part of the integrity team, these researchers studying the impact of Instagram on teens, know exactly how harmful it is. And they are grappling in real time with like the damage their product is doing to children. Members of these teams reported frustration at the fact that their colleagues often refuse to take their findings seriously.
Starting point is 00:49:30 One former researcher told the Wall Street Journal that we're standing directly between people and their bonuses when they try to reduce the harmful aspects of Instagram. Because like anything that reduces the harm is going to reduce its popularity. It's going to cut down on time on site. It's going to cut down on users. And so everyone else at Facebook, but the integrity team gets paid by how much they increase engagement on the site. That's where the bonuses come from.
Starting point is 00:49:54 So if you're trying to make Facebook less dangerous, a huge number of the most influential employees have a financially, a vested financial interest in you failing. So that's good. And what that boils down to is like, there is a financial incentive for you convincing a 13 year old girl that she should get fucking fillers in her, like, what the fuck? It's just it. Now, all I'm going to try to sell the teens because it's ad break time, Jamie. And I would like to.
Starting point is 00:50:21 Oh, okay. Okay. If you're a teenager, if you're a teenager, you don't need Instagram, that shit's bad for you. You certainly don't need Facebook. That's where the olds are. But you know what's come back around to being cool and what I think your generation needs to get on board with smoking huge, comical cigars.
Starting point is 00:50:39 I'm talking the ones that are like nine inches long and like three inches wide, massive teens listening right now. Go out to your local cigar shop, do not drive a sad adult, the worst looking adult you find outside the cigar shop, pay him for huge, cheap cigars to me and change the new Instagram is chain smoking horrible cigars. This is this is what this is what's going to bring not buying teenagers. Look, zoomers, zoomers, the air is not getting any cleaner, right? You're all going to, we're all going to choke to death on wildfire smoke.
Starting point is 00:51:12 You might as well burn down a big fat mac and noodle. Robert, can I tell you, I bought a teenager, a white claw the other day. I'm so proud of you. Good for you. Thank you. I felt good. I felt like I did a public service. Yeah.
Starting point is 00:51:24 A big fat, ridiculous lunatic cigars, bribe, bribe a sad 40 year old for it. You're just preparing yourself for climate change. All right. Here's the other white claw teens. Okay. I mean, white claw goes great with a huge shitty cigar, Jamie. No, it doesn't. It absolutely does.
Starting point is 00:51:43 Smoking is bad for you. Andy has a white paw. You're not smoking a cigar. You puff it so it's healthy. You puff. All right. Here's some ads. What would you do if a secret cabal of the most powerful folks in the United States told
Starting point is 00:51:58 you, Hey, let's start a coup. Back in the 1930s, a Marine named Smedley Butler was all that stood between the US and fascism. I'm Ben Bullitt. And I'm Alex French. In our newest show, we take a darkly comedic and occasionally ridiculous deep dive into a story that has been buried for nearly a century. We've tracked down exclusive historical records.
Starting point is 00:52:18 We've interviewed the world's foremost experts. We're also bringing you cinematic, historical recreations of moments left out of your history books. I'm Smedley Butler and I got a lot to say. For one, my personal history is raw, inspiring and mind blowing. And for another, do we get the mattresses after we do the ads or do we just have to do the ads? From my heart podcast and School of Humans, this is Let's Start a Coup.
Starting point is 00:52:48 Welcome to Let's Start a Coup on the iHeart Radio app, Apple podcast, or wherever you find your favorite shows. What if I told you that much of the forensic science you see on shows like CSI isn't based on actual science? The problem with forensic science in the criminal legal system today is that it's an awful lot of forensic and not an awful lot of science. And the wrongly convicted pay a horrific price. Two death sentences and a life without parole.
Starting point is 00:53:19 My youngest, I was incarcerated two days after her first birthday. I'm Molly Herman. Join me as we put forensic science on trial to discover what happens when a match isn't a match and when there's no science in CSI. How many people have to be wrongly convicted before they realize that this stuff's all bogus. It's all made up. Welcome to CSI on Trial on the iHeart Radio app, Apple podcast, or wherever you get your
Starting point is 00:53:52 podcasts. I'm Lance Bass, and you may know me from a little band called NSYNC. What you may not know is that when I was 23, I traveled to Moscow to train to become the youngest person to go to space. And when I was there, as you can imagine, I heard some pretty wild stories. But there was this one that really stuck with me, about a Soviet astronaut who found himself stuck in space with no country to bring him down. It's 1991, and that man, Sergei Krikalev, is floating in orbit when he gets a message that
Starting point is 00:54:29 down on earth, his beloved country, the Soviet Union, is falling apart. And now he's left defending the Union's last outpost. This is the crazy story of the 313 days he spent in space, 313 days that changed the world. Listen to the last Soviet on the iHeart Radio app, Apple podcast, or wherever you get your podcasts. All right, we're back. We are.
Starting point is 00:55:02 We all just enjoyed a couple of really comically large cigars. We did not. They were those ridiculous, long Asylum cigars. It was great. Why are you fixated on this? What is happening? Because I find that sketch from I Think You Should Leave where the little girls are talking about smoking five mac and noodles to unwind at the end of the day.
Starting point is 00:55:23 Actually quite funny. I mean, yeah. But like why are you... I love when you reveal yourself to be a basic bitch. I am a basic bitch. Because I was watching Netflix. Yeah, you're right. I was watching Netflix.
Starting point is 00:55:34 That's why I'm thinking about cigars. I love that we're in the middle of a podcast and you can't get off that. Well, I also think making children do things that's bad for them is funny. But not this way. Not the way Facebook does it. Well, maybe you should just send them to Dan Flashes. Send them to Dan Flashes. I mean, they've all...
Starting point is 00:55:51 I think the teens are rejecting NFTs pretty widely, Jamie. So when Facebook does try to make the case that their products are benign, they like to bring up studies from the Oxford Internet Institute, which is a project of Oxford University, which show minimal or no correlation between social media use and depression. The Wall Street Journal actually reached out to the Oxford researcher responsible for some of these studies, who right away wasn't like, oh, yes, they're right, everything's fine. He was like, actually, Facebook needs to be much more open with the research that they're
Starting point is 00:56:23 doing because they have better data than we can get, than researchers can get. And so the actual information that they're citing is hampered by the fact that they're not sharing what they're finding, and who knows how things could change and our conclusions could change if we had access to all of that data. He even told the Wall Street Journal, people talk about Instagram like it's a drug, but we can't study the active ingredient, which you'll notice is not him saying, it's fine. It's him being like, yeah, I really wish we could actually study this better. It's difficult right now.
Starting point is 00:56:53 Also, he's referring to it like a drug, which is a comparable scale for how it manifests in... Yeah, he's certainly not being like, everything's fine. I think that's clear. He's truly, like, constantly, Mr. Policeman, I gave you all the clues in this situation, and just no one gives a shit. It is very funny, like in that movie you were referencing. And that's what I was trying to say, was that it's hilarious.
Starting point is 00:57:18 So we focused a lot on these episodes about how Facebook has harmed people and institutions in the United States. But as we've covered in past episodes, the social network has been responsible for helping to incite ethnic cleansings and mass racial violence in places like Myanmar and India. Mob violence against Muslims in India, incited by viral Facebook misinformation, led one researcher in February of 2019 to create yet another fake account to try and experience social media as a person in Kerala, India might. From the New York Times, quote, for the next three weeks, the account operated by a simple
Starting point is 00:57:50 rule: follow all the recommendations generated by Facebook's algorithm to join groups, watch videos and explore new pages on the site. The result was an inundation of hate speech, misinformation and celebrations of violence, which were documented in an internal Facebook report published later that month. And this is from the Facebook researcher: Following this test user's newsfeed, I've seen more images of dead people in the past three weeks than I've seen in my entire life total. What a great site Mark built.
Starting point is 00:58:18 This new tagline, the place for corpses. Oh, yeah. My goodness. I mean, and it's like, I know that we have discussed Facebook's role in supercharging ethnic cleansings. Yeah. But that is just, that is so, yeah, it's not great, Jamie. Someone wrote that down, Robert. Someone wrote that down and hit publish.
Starting point is 00:58:41 It's not great either, because India is Facebook's biggest customer: 340 million Indians use one or more Facebook products. That's a shitload of people, yeah, 340 million. That is something that I think is important to remember and something that I lose sight of sometimes, which is, like, Facebook is not a super popular platform for people of all ages in North America, but that's not the case everywhere. Yeah. And it is just, it is the internet for a lot of these people.
Starting point is 00:59:15 Right. That is the way they, that is the whole of how they consume the internet in a lot of cases. I mean, maybe with like YouTube or something mixed in, but they're probably getting a lot of their YouTube links from their Facebook feed. Now, India is the number one customer in terms of, like, number of people for Facebook. I'm sure the United States is still more profitable, just because of, like, differences in income and whatnot, but this is a huge part of their business.
Starting point is 00:59:39 But despite that fact, they have failed to invest very much in terms of meaningful resources into having employees who speak the language, or, as is more the problem, the languages of India. See, India is a super mixed country, right? In terms of different, like, ethnic groups and religious groups. They have 22 officially recognized languages in the country, and there's way more languages than that in India that significant numbers of people speak.
Starting point is 01:00:04 Anyone who has traveled there, and I've spent a lot of time in India, can tell you that being able to effectively say hello and ask basic questions of people can require a lot of research if you're traveling a decent amount. But Facebook aren't 20-something tourists on the prowl for good tandoori and bhang lassis. They have effectively taken control of the primary method of communication and information distribution for hundreds of millions of people, and they failed to hire folks who might know if some of those people are deliberately inciting genocide against other people in the country. 87% of Facebook's global budget for identifying misinformation is spent on the United States.
Starting point is 01:00:37 The rest of the planet shares 13% of their misinformation budget. You want to guess what percentage of Facebook users North Americans make up? No. 10%. 87% of their budget goes to 10% of their users. For, like, dealing with disinformation. That sounds like a larger metaphor for something else. Dealing with disinformation specifically.
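To put rough numbers on that split, here's a back-of-the-envelope calculation, assuming the leaked figures (87% of the misinformation budget for the US, 13% for everyone else, against a rough 10/90 user split) are in the right ballpark:

```python
# Back-of-the-envelope math on the leaked split: 87% of the misinformation
# budget for the US, 13% for everyone else; roughly 10% of users in North
# America, 90% everywhere else.
us_budget, row_budget = 0.87, 0.13
us_users, row_users = 0.10, 0.90

us_per_user = us_budget / us_users     # 8.7 budget units per US user
row_per_user = row_budget / row_users  # ~0.14 budget units per other user

print(us_per_user / row_per_user)      # ~60x more spent per North American user
```

If those leaked figures hold, that's on the order of sixty times more moderation money per North American user than per user everywhere else.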
Starting point is 01:00:57 Now, when this leaked out, Facebook's response was that the information cited was incomplete and did not include third party fact checkers. They're like, well, this doesn't include all of the third party companies we hire. Except the data they show suggests that the majority of the effort and money spent on third party fact checkers is for fact checking stuff in the United States. And, of course, they did not elaborate on how including this information might have changed the overall numbers, so my guess is not by much, if at all. Internal documents do show that Facebook attempted to create changes to their platform to stop
Starting point is 01:01:27 the spread of disinformation during the November election in Myanmar. Those changes also halted the spread of disinformation put out by the military, and it was the military inciting ethnic cleansings and trying to incite violence in order to lock down political power ahead of this election. So they cut this significantly prior to the election. They see it as a problem, they institute changes, similar to the changes they talked about putting up in the US if things went badly with the election. And these worked.
Starting point is 01:01:54 It dropped dramatically. Oh, yeah. Cool. And again, that is good. I'm glad that was done, but they only responded. Give me a second, Jamie. Give me a second, Jamie, because prior to the election, they institute these changes, which are significant.
Starting point is 01:02:08 It reduces the number of inflammatory posts by 25.1% and reduces the spread of photo posts containing disinformation by 48.5%. This is huge. That's really significant. As soon as the election was done, Facebook reversed those changes, presumably because they were bad for money. Three months after the election, the Myanmar military launched a vicious coup, which continues to this moment. In response, Facebook created a special policy
Starting point is 01:02:33 to stop people from praising violence in the country, one which presumably reduces the spread of content by freedom fighters resisting the military as much as it reduces content spread by the military. It's obviously too much to say that Facebook caused a coup in Myanmar. Shit's been, I mean, there's a lot going on there. I'm not pretending that this is like, it's all just Facebook. But a major contributing factor, for sure. It wasn't insignificant. And the fact that they knew how much their policies were helping
Starting point is 01:03:00 and reversed them after the election, reversing this effect and leading to an increase in inflammatory content because it profited them more, is damning, right? That's the thing that's damning. Around the world, Facebook's contribution to violence may be greatest in places where the company has huge reach but pays little attention. In Sri Lanka, people were able to automatically add hundreds of thousands of users to Facebook groups that spread violent content. In Ethiopia, a national militia coordinated calls for violence openly on the app.
Starting point is 01:03:29 The company claims that it has reduced the amount of hate speech people see globally by half this year. But even if that is true, how much hate was spread during the years where they ignored the rest of the world? How many killings? How many militant groups seeded with new recruits? How many pieces of exterminationist propaganda spread while Facebook just wasn't paying attention?
Starting point is 01:03:47 The actual answer is likely incalculable. But here's The New York Times again, reporting on that test account in Kerala, India. Ten days after the researcher opened the fake account to study misinformation, a suicide bombing in the disputed border region of Kashmir set off a round of violence and a spike in accusations, misinformation, and conspiracies between Indian and Pakistani nationals. After the attack, anti-Pakistan content began to circulate in the Facebook recommendation groups that the researcher had joined. Many of the groups, she noted, had tens of thousands of followers. A different report
Starting point is 01:04:17 by Facebook, published in December 2019, found Indian Facebook users tended to join large groups, with the company's median group size at 140,000 members. In a separate report produced after the elections, Facebook found that over 40% of top views or impressions in the Indian state of West Bengal were fake or inauthentic, with one inauthentic account having amassed more than 30 million impressions. A report in March 2021 showed that many of the problems cited during the 2019 elections persisted. In the internal document, called Adversarial Harmful Networks, India Case Study, a Facebook researcher wrote that there were groups and pages replete with inflammatory and misleading
Starting point is 01:04:53 anti-Muslim content on Facebook. The report said that there were a number of dehumanizing posts comparing Muslims to pigs and dogs, and misinformation claimed that the Quran, the holy book of Islam, calls for men to rape their female family members. So that's significant, like, the scale at which this shit spreads is huge. And I mean, I feel like I know the answer if the hate is existing on that scale unmitigated, but who is working on this? Like, how many people does Facebook have working on it? Is there an integrity team for this region?
Starting point is 01:05:34 Like, technically, yes. The question is, how many of them are there, and how many of the languages there are represented by the team? Right. And it's not many. Exactly. Like, you can't have a global company and not have global representation, or shit like this is going to happen.
Starting point is 01:05:51 Like it's just... It's actually, you know what it kind of reminds me of, Jamie? I was looking at this and I was thinking about the East India Trading Company. When the East India Company took over large chunks of India, they took it over from a regime, the monarchical government that had been in charge in that area prior. Not a good government, right, because, number one, they lost that war. But like, they weren't a very good government, but they were a government. So they did do things like provide aid in famines and disasters, and have people whose
Starting point is 01:06:19 job it was to, like, handle stuff like that, and make sure that the stuff was getting where it needed to go during, like, calamities and whatnot. And doing things specifically that helped people but were not profitable, because a big chunk of what a government does isn't directly profitable, it's just helping to, like, keep people alive and keep the roads open and whatnot, right? Yeah. Sustain humanity. Yeah.
Starting point is 01:06:42 When the East India Company took over, they were governing and in control of this region, and this is actually Bengal, I think, as their first place, but they don't have any responsibility. They don't have teams who are dedicated to making sure people aren't starving. They don't have people who are dedicated to actually keeping the roads open in any way that isn't directly necessary for the trade that profits them. They don't do those things because they're, in effect, governing, but they're not a government. And there's been a lot of talk about how Facebook is effectively like a nation, a digital nation of, like, three billion people, and Mark Zuckerberg has the power of
Starting point is 01:07:12 a dictator. And one of the problems with that is that, for all of their faults, governments have a responsibility to do things for people that are necessary, like dealing with calamities and whatnot. Facebook has no such responsibility. And so when people were not paying attention to Sri Lanka, to West Bengal, to Myanmar, they didn't do anything. And as we know, like, in a region where there are millions and millions of people, 40% of
Starting point is 01:07:41 the views were of fake, inauthentic content. Because they don't give a shit what's spreading, because they don't have to, because they don't have to deal with the consequences unless it pisses people off. As opposed to a government, where it's like, well, yeah, we are made up of the people who live here, and if things go badly enough, it can't not affect us. I'm not trying to be, again, like with TikTok, I'm not trying to praise the concept of government. Governance. But it is better than what Facebook's doing.
Starting point is 01:08:08 Right. Right. Yeah. And it's, I think that that is like a very, I'd never considered looking at it that way, but viewing it as this kind of digital dictatorship that. A colonial dictatorship. It's colonized people's information, like information streams. It's colonized the way people communicate, but it has no responsibility to them if they
Starting point is 01:08:29 aren't white and wealthy. Well, yeah, and it marginalizes people in the same ways that actual dictatorships do, in terms of how much attention is being given, how people are being hired to support and represent this area. And of course, the answer is no. Yeah. And of course, the result of that is extreme human consequence and harm. And it is just so striking to me that, in terms of
Starting point is 01:09:01 the laws that exist that even attempt to address the amount of influence and control that a gigantic digital network like Facebook has, there's basically nothing. I mean, unless people are yelling at them and unless their bottom line is threatened, they're never going to respond to stuff like this. Like, that's been made clear for decades at this point. Yeah. It's great. I love it.
Starting point is 01:09:31 So. Well, I'm all worked up. Yeah. A great deal of the disinformation that goes throughout India on Facebook comes from the RSS, which is an Indian fascist organization closely tied to the BJP, which is the current ruling right wing party. And when I say fascist, I mean, like, some of the founders of the RSS were actual, like, friends with Nazis, and they were heavily influenced by that shit in, like, the thirties.
Starting point is 01:09:55 Both organizations are profoundly anti-Muslim and the RSS's propaganda has been tied to numerous acts of violence. Facebook refuses to designate them a dangerous organization because of, quote, political sensitivities that might harm their ability to make money in India. Facebook is the best friend many far right and fascist political parties have ever had. Take the Polish Confederation Party. They're your standard right wing extremists, anti-immigrant, anti-lockdown, anti-vaccine, anti-LGBT.
Starting point is 01:10:22 The head of their social media team, Tomasz Garbacek, sorry, Tomasz, told The Washington Post that Facebook's hate algorithm, in his words, had been a huge boon to their digital efforts. Like, he calls it a hate algorithm and says, this is great for us. Explaining it, he's like, I think we're good with emotional messages, and thus their shit spreads well on Facebook. Quote, from The Washington Post. In one April 2019 document detailing a research trip to the European Union, a Facebook team
Starting point is 01:10:49 reported feedback from European politicians that an algorithm change the previous year, billed by Facebook chief executive Mark Zuckerberg as an effort to foster more meaningful interactions on the platform, had changed politics for the worse. This change, Mark claimed, was meant to make interactions more meaningful, but it was really just a tweak to the algorithm that made comments that provoked anger and argument even more viral. And I'm going to quote from The Post again here. In 2018, Facebook made a big change to that formula to promote meaningful social interactions.
Starting point is 01:11:17 These changes were billed as designed to make the newsfeed more focused on posts from family and friends, and less on brands, businesses and the media. The process weighted the probability that a post would produce an interaction, such as a like, emoji or comment, more heavily than other factors. But that appeared to backfire. Haugen, who this week took her campaign against her former employer to Europe, voiced a concern that Facebook's algorithm amplifies the extreme. Anger and hate is the easiest way to grow on Facebook, she told British lawmakers.
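For a rough sense of the mechanic The Post is describing, here's a minimal, hypothetical sketch of an engagement-weighted ranking score. The weights and signal names below are invented for illustration; the reporting describes the reweighting qualitatively, not Facebook's actual numbers or code:

```python
# Hypothetical sketch of an engagement-weighted feed ranker. The weights
# and signal names are invented for illustration only.
HYPOTHETICAL_WEIGHTS = {
    "like": 1.0,
    "reaction": 5.0,   # emoji reactions, including "angry"
    "comment": 15.0,   # comments weighted heavily: arguments boost reach
    "reshare": 30.0,
}

def msi_style_score(predicted: dict) -> float:
    """Score a post by the predicted probability of each interaction type."""
    return sum(HYPOTHETICAL_WEIGHTS[kind] * p for kind, p in predicted.items())

# A calm post vs. an enraging one, with made-up predicted probabilities:
calm_post  = {"like": 0.30, "reaction": 0.02, "comment": 0.01, "reshare": 0.005}
angry_post = {"like": 0.10, "reaction": 0.15, "comment": 0.12, "reshare": 0.06}

print(msi_style_score(calm_post))   # ~0.70
print(msi_style_score(angry_post))  # ~4.45, the angry post dominates the feed
```

Under any weighting along these lines, a post that reliably provokes comments and reshares, which in practice skews toward enraging content, outranks a quieter post that people merely like. That is the dynamic Haugen described.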
Starting point is 01:11:42 Many of them have their jobs because of how easy it is to make shit go viral when it causes anger and hate. I was about to say, I mean, show me a system of power that that's not true for. Yes. Again, we're focusing on Facebook here in part because I do think it's more severe in a lot of ways there, but also just because they're the ones who had a big leak, and so we have this data.
Starting point is 01:12:03 So we're not just saying, yeah, look at Facebook, obviously hate spreading there. We're saying, no, no, we have numbers. We have their numbers about how fucking bad the problem is. I guess that that is the difference. Yeah. There's data. And we have evidence that the system is well aware of the numbers. Yeah, I would love to be talking about Twitter too, which just, and maybe Twitter just never
Starting point is 01:12:23 bothered to get those kind of numbers, who knows. This caused what experts describe as a social civil war in Poland, like this change. One internal report concluded, we can choose to be idle and keep feeding users fast food, but that only works for so long. And they've already caught on to the fact that fast food is linked to obesity. And therefore, its short-term value is not worth the long-term cost. So he's being like, we're poisoning people. And it's addictive, like McDonald's, but people are going to give it up in the same
Starting point is 01:12:52 way that McDonald's started to suffer a couple of years back, because they don't like the way this makes them feel, actually. It's fun for a moment, but it's horrible for them. We just got to get a Morgan Spurlock for Facebook, baby. We just got to get, where's the Super Size Me for Facebook? Our entire society is the Morgan Spurlock for Facebook. January 6th was Morgan. I was going to say, I was like, I feel like it's, I mean, whatever, not to say that McDonald's
Starting point is 01:13:23 isn't a hell of a drug, but like, this is not the same. I mean, it's stronger, because it's your fucking brain and self-image and the view of yourself. And I feel like that is the strongest manipulation that any given system, person, whatever, can have on you: controlling the way that you see yourself. It's not the same in terms of being involuntary. I feel like it's something that you very much participate in. It's bad. Yeah.
Starting point is 01:13:58 It's good. It's good. That's what I think, Jamie. Oh. Facebook has been aggressive. Is that why you called me today to say good, actually? To read all this and then say, so that's fine. Let's never talk of it again.
Starting point is 01:14:10 Anyway, Facebook has been aggressive at rebutting the allegations that their product leads to polarization. Their spokeswoman brought up a study, which she said shows that academic research doesn't support the idea that Facebook or social media more generally is the primary cause of polarization. Now, ignore for the moment that not the primary cause doesn't mean isn't a significant cause. And let's look at this study. The spokeswoman was referencing Cross-Country Trends in Affective Polarization, an August 2021 study from researchers at Stanford and Brown University.
Starting point is 01:14:40 This study opens by noting it includes data for only 12 countries, and that all but Britain and Germany exhibited a positive trend towards more polarization. So right off the bat, there's some things to question about this study, which is, number one, they're saying that, like, oh, Britain hasn't gotten more polarized, which is like, have you been there? Have you talked to them? Yeah. Not that I live there, but that's not what I've been hearing from my friends that do.
Starting point is 01:15:06 Here's the thing. When you look at how Facebook is basically citing this as, like, evidence that, like, look, we're fine, social media is not the problem. This study, this very credible study, says that we're not the cause of polarization, so everything's good. The study doesn't quite back them up on this. Right off the bat, one of the authors, like, notes this, and this is from a write-up
Starting point is 01:15:28 by one of the authors of the study on a website called techpolicy.com, where he's talking about the study and what it says. A flat or declining trend over the 40 years of our sample does not rule out the possibility that countries have seen rising polarization in the most recent years. Britain, for example, shows a slight overall decline, but a clear increasing trend post-2000 and post-Brexit. So he's saying that, like, we don't have as much data on more recent polarization, and that may be a reason why this study is less accurate and why some of our statements
Starting point is 01:15:55 do not conform with, like, what people have observed. He goes on to note: the data do not provide much support for the hypothesis that digital technology is the central driver of affective polarization. The internet has diffused widely in all the countries we looked at, and under simple stories where this is the key driver, we would have expected polarization to have risen everywhere as well. In our data, neither diffusion of internet nor penetration of digital news are significantly correlated with increasing polarization.
Starting point is 01:16:20 Similarly, we found little association with changes in inequality or trade. One explanatory factor that looks more promising is increasing racial diversity. The non-white share of the population has increased faster in the US than in almost any other country in our sample, and other countries like New Zealand and Canada, where it has risen sharply, have seen rising polarization as well. So I have some significant arguments with him here, including the fact that, as he notes here, his study only looks at Western nations. With the exception of Japan, all of the nations in the study are European, or the United States
Starting point is 01:16:51 and Canada. And so they have all had, prior to 2000, higher penetrations of the internet and non-internet mass media. Like, if you're trying to determine the impact of social media: elements of what social media has done were present in places like Fox News in the United States years before Facebook ever existed. And that was not the case in places like Myanmar and India, which are not a part of this study. So right off the bat, it's problematic to try and study the impact of social media on
Starting point is 01:17:18 polarization only in countries that already had robust mass media before social media came into effect. Which is not to say that I agree with their conclusion, because I think there's other flaws with this study. But one of the flaws is just that hundreds of millions of their users exist in countries where the study was not done, where they were not looking at these places, which is a flaw. And that's dependent on most readers just conflating North America and Europe with the center of the fucking world.
Starting point is 01:17:46 And again, I have issues about like, okay, well, you're saying that racial diversity is more of a thing. But where is the propaganda? Where is the hate speech about racial diversity spreading? Is it spreading on social media? Like, yes, it is. I can say that as an expert. It's also just like, again, not that this study is even bad or not useful.
Starting point is 01:18:05 It is one study. And again, we have internal Facebook studies that make claims that I would say throw some of this into question. But again, this is just how a corporation is going to react. They're going to find a study that they can simplify in such a way that they can claim there's not a problem. Because none of the people who they're going to be arguing with on Capitol Hill, and precious few of the journalists, are going to actually drill into this and then talk to other experts
Starting point is 01:18:32 or even reach out to members of that study and be like, how fair is this phrasing? How does it gel with this information and this information? As we saw earlier with the last study, when the Wall Street Journal, to their credit, reached out to that scientist, he was like, well, actually, they have better data than me. And I'd love to see it, because maybe that'll change our conclusions. Anyway, yeah, Mark Zuckerberg has been consistent in his argument that deliberately pushing divisive and violent content would be bad for Facebook, quote, we make money from ads.
Starting point is 01:19:00 And advertisers consistently tell us they don't want their ads next to harmful or angry content. So while I was writing this article, I browsed over to one of my test Facebook accounts. The third ad on my feed was for a device to illegally turn a Glock handgun into a fully automatic weapon. Wait, you? Just a heads up. Yeah.
Starting point is 01:19:17 One of my, I have a couple of test feeds, and it was like, hey, this button will turn your Glock automatic, which is so many felonies, Jamie. If you even have that thing and a Glock in your home, the FBI can put you away forever. I have to laugh, I have to laugh, because that is really, really scary. But yeah, it is like Mark being like, look, no advertiser wants this to be a violent place. Buy a machine gun on Facebook, you know, next to ads that are, like, t-shirts about killing liberals and stuff. Like, a machine gun advertiser, maybe, would be one that wouldn't take issue
Starting point is 01:19:50 with that. Holy shit. I've had fucking hang-the-media shirts advertised to me on Facebook. Like, my God. Fuck you, Mark. When I quit Facebook a couple of years ago, I was getting normie advertisements. Good for you. I was getting those really scary ones, like those custom t-shirts that say, it's a Jamie Loftus
Starting point is 01:20:14 thing. You wouldn't understand. I wouldn't. Like what? And you wouldn't. Yeah, I would not. No. And you wouldn't.
Starting point is 01:20:22 The only time I can think of recently where Facebook actually anticipated something I wanted is they keep showing me, on all of the accounts that I've used, videos of hydraulic presses crushing things. And I do love those videos. Those, those are, those are pretty, pretty fun. And that's the meaningful social interactions that Mr. Mark Zuckerberg was talking about: the hydraulic press videos. Those are very comforting.
Starting point is 01:20:44 On the good old internet, which also wasn't all that great, but which was a lot more fun, there would have just been a whole website that was just like, here's all the videos of hydraulic presses crushing things. Come watch this shit. There wouldn't have been any algorithm necessary. You could just scroll through videos. There's no friend function.
Starting point is 01:21:04 It's just hydraulic press shit. Yeah. That's all I need, baby. That's all I need. That was a fun version of the internet. So back to the point. It is undeniable that any service on the scale of Facebook, again, like three billion users, is going to face some tough choices when it comes to the problem of regulating the speech
Starting point is 01:21:20 of political movements and thinkers. As one employee wrote in an internal message, I am not comfortable making judgments about some parties being less good for society and less worthy of distribution based on where they fall in the ideological spectrum. That's true. This is, again, part of the problem of not regulating them like a media company, like a newspaper or something. Because by not making any choices, they're making an editorial choice, which is to allow this stuff to spread. Because presumably, like, if you were actually being held
Starting point is 01:21:47 to some kind of legal standard that, again, most of our media isn't anymore, you would at least have to be like, well, let's evaluate the truthfulness of some of these basic statements before pressing publish. And I would say that's where the judgment should come in, but that's expensive. What Facebook is saying is, we won't judge based on politics, but we will judge based on whether or not something is counterfactual. That I think is morally defensible, but that's expensive as shit, and they're never going to do that.
Starting point is 01:22:13 Look, moral decisions are famously not cheap, and that is a lot of the reason why people do not do them. Yeah. It is true that having morals is not a profitable venture. Yeah. No, of course not. But the other thing that's true is that Facebook already makes a lot of decisions about which politicians and parties are worthy of speech, and they make that decision based mostly on
Starting point is 01:22:36 whether or not said public figures get a lot of engagement. Midway through last year, they deleted, like, all of the different anarchist media groups and a lot of anti-fascist groups that had accounts on Facebook, just across the board. They deleted, like, CrimethInc., and they kicked off It's Going Down, a whole raft of them, you know. Yeah, I mean, nobody ever complains when bad shit happens to anarchists except for anarchists. But yeah, they nuked a bunch of anarchist content, just kind of blanket saying it was
Starting point is 01:23:06 dangerous. And I think it was because they had just nuked the Proud Boys and they had to be shown to be fair. But it has now come out that they have a whole program called X-Check, or CrossCheck, which is where they decide which political figures get to spread violent and false content without getting banned. Based on engagement? Yeah.
Starting point is 01:23:26 Based on engagement. They've claimed for years that everybody's accountable to the site rules, but again, the Facebook Papers have revealed that, like, that's explicitly a lie, and it's a lie that's been told even to people at high levels of Facebook. And I'm going to quote from the Wall Street Journal here. The program, known as CrossCheck or X-Check, was initially intended as a quality control measure for actions taken against high-profile accounts, including celebrities, politicians, and journalists.
Starting point is 01:23:48 Today, it shields millions of VIP users from the company's normal enforcement process, the documents show. Some users are whitelisted, rendered immune from enforcement actions, while others are allowed to post rule-violating material pending Facebook employee reviews that often never come. At times, the documents show, X-Check has protected public figures whose posts contain harassment or incitement to violence, violations that would typically lead to sanctions for regular users.
Starting point is 01:24:11 In 2019, it allowed international soccer star Neymar to show nude photos of a woman who had accused him of rape to tens of millions of his fans before the content was removed by Facebook. Whitelisted accounts shared inflammatory claims that Facebook's fact-checkers deemed false, including that vaccines are deadly, that Hillary Clinton had covered up pedophile rings, and that then-President Donald Trump had called all refugees seeking asylum animals. According to the documents, a 2019 review of Facebook's whitelisting procedures, marked attorney-client privileged, found favoritism to those users to be both widespread and
Starting point is 01:24:42 not publicly defensible. 'We are not actually doing what we say we do publicly,' said the confidential review. It called the company's actions a breach of trust and added: 'Unlike the rest of our community, these people violate our standards without any consequence.' And they lied to their board members about whether or not it was a thing. They said it was very small, and I think the initial claim was like, we have to have something like this in place for people like President Trump.
Starting point is 01:25:05 But it's a tiny number of people, and it's because they occupy some political position where we can't just as easily delete their account, because it creates other problems. You can't write it off, because they're not as free and just as they need to be for this conduct to be acceptable. That is on the level of, you can be unethical and still be legal. I mean, it's still true.
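Mechanically, what the Journal describes amounts to an enforcement pipeline with carve-outs. Here's a minimal, hypothetical sketch of that logic; every name, threshold, and rule in it is invented for illustration, not Facebook's actual code:

```python
# Hypothetical sketch of whitelist-gated enforcement, per the WSJ's
# description of XCheck. All names and rules here are invented. (Python 3.9+)
from dataclasses import dataclass, field

@dataclass
class Post:
    author_id: int
    violates_policy: bool

@dataclass
class Enforcement:
    whitelist: set[int] = field(default_factory=set)  # fully immune VIPs
    deferred_review: list[Post] = field(default_factory=list)

    def is_vip(self, author_id: int) -> bool:
        # Stand-in for celebrity/politician/journalist detection.
        return author_id < 1000

    def moderate(self, post: Post) -> str:
        if not post.violates_policy:
            return "allow"
        if post.author_id in self.whitelist:
            return "allow"  # "rendered immune from enforcement actions"
        if self.is_vip(post.author_id):
            # Queued for a human review "that often never comes";
            # the violating post stays up in the meantime.
            self.deferred_review.append(post)
            return "allow-pending-review"
        return "remove"  # regular users face normal enforcement

enf = Enforcement(whitelist={42})
print(enf.moderate(Post(42, True)))      # allow
print(enf.moderate(Post(7, True)))       # allow-pending-review
print(enf.moderate(Post(123456, True)))  # remove
```

Per the documents, the deferred-review branch behaved like the immune branch in practice, since the promised human review often never came.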
Starting point is 01:25:33 You want to guess what that small number was? Oh, I love when Facebook says there's a small number. What is a small number? 5.8 million. That's so many. Yeah. Oh, dear. Okay.
Starting point is 01:25:47 It's very funny. It's very funny. It's all good. I mean, yeah, they're just, can I say something controversial? Please. I don't like this company one bit. You don't. Well, I feel like that's going a bit far.
Starting point is 01:26:07 I'm sorry. And I'm famously, I don't like making harsh judgments on others, but I'm starting to think that they might be doing some bad stuff over there. Mm-hmm. Yeah. I would, you know, I don't like these people, I don't like these people at all. You know what I do like, Jamie? What do you like?
Starting point is 01:26:26 What do you like? Ending podcast episodes. Mm-hmm. Oh, I actually do like that, too. Yeah. That's the thing I'm best at. Do you want to plug your plugables? Yeah.
Starting point is 01:26:36 Passion. Yeah. Sure. I'm going to just open by plugging my Instagram account, a famously healthy platform that I'm addicted to, and I don't really have any concerns about it, and I don't really think it's affecting my mental health at all. So I'm over there, and that's at JamieChristSuperstar. I'm also on Twitter, which Robert can't stop saying is the healthiest of the platforms.
Starting point is 01:27:00 It is. I'm not so sold on that. It is the, of all of the people who are drunk driving through intersections filled with children, Twitter has the least amount of human blood and gore underneath the grill of the car. Okay, so Robert's saying, for all you Backstreet Boys heads, he's saying that Twitter is the Kevin Richardson of social media, I'm there as well. I'm saying the drunk driving Twitter car made it a full 15 feet further than the Facebook
Starting point is 01:27:28 car before the sheer amount of blood being churned up into the engine flooded the engine air intakes. But at the end of the day, we're all fucked. Yeah. So I'm on Twitter as well at Jamie. And Robert, you can listen to my podcast, The Bechdel Cast, you can listen to Aack Cast, that's about the Cathy comics, you can listen to My Year in Mensa, you can listen to Lolita Podcast, you can listen to nothing.
Starting point is 01:27:54 You know what never led to a genocide in any country, as far as I'm aware, Jamie? The Cathy comics. Well, see, then you haven't listened to the whole series, have you? Oh really? Is it? Oh, you know what? That's why the last episode is your live report from Sarajevo in 1994. Episode 11.
Starting point is 01:28:14 Yeah. Irving, really, his politics were not good. Yeah, he was, like, weirdly into the Serbian nationalism. Irving, he's like, for the Cathy comics, okay, I'm about to make a wild parallel. But Irving is like the Barefoot Contessa's husband, in that he looks so innocent, but then when you Google him, you're like, wait a second, this man is running on dark money. This guy was on Wall Street in the 80s, this is a bad man. He's basically like Jeffrey, the Barefoot Contessa's husband.
Starting point is 01:28:47 The Barefoot Contessa is run on dark money. I know people don't like to hear it, they love her, but it's just true, it's objectively true, and that's what I would like to say at the end of the episode. I've never heard of the Barefoot Contessa and I don't know what you're talking about. Yes, I am not even 1% surprised to know that, but that's okay. But you know what I do know about? What? I know about podcasts, and this one is done.
Starting point is 01:29:11 Great, ending.