Behind the Bastards - Part Three: Mark Zuckerberg: The Worst Person of the 21st Century (So Far)

Episode Date: January 17, 2019

In Part Three, Robert is joined again by Maggie Mae Fish and Jamie Loftus to continue discussing Mark Zuckerberg, Facebook and fake news. Learn more about your ad-choices at https://www.iheartpodca...stnetwork.com. See omnystudio.com/listener for privacy information.

Transcript
Starting point is 00:00:00 Alphabet Boys is a new podcast series that goes inside undercover investigations. In the first season, we're diving into an FBI investigation of the 2020 protests. It involves a cigar-smoking mystery man who drives a silver hearse. And inside his hearse was, like, a lot of guns. But are federal agents catching bad guys or creating them? He was just waiting for me to set the date, the time, and then for sure he was trying to get it to happen. Listen to Alphabet Boys on the iHeart Radio app, Apple Podcast, or wherever you get your podcasts. Did you know Lance Bass is a Russian-trained astronaut?
Starting point is 00:00:59 That he went through training in a secret facility outside Moscow, hoping to become the youngest person to go to space? Well, I ought to know, because I'm Lance Bass. And I'm hosting a new podcast that tells my crazy story and an even crazier story about a Russian astronaut who found himself stuck in space. With no country to bring him down. With the Soviet Union collapsing around him, he orbited the Earth for 313 days that changed the world.
Starting point is 00:01:32 Listen to The Last Soviet on the iHeart Radio app, Apple Podcast, or wherever you get your podcasts. What's up, people? That's not how I open the show. You turn into a tech bro. You guys are both really positive about this, but Sophie's giving me the thumbs down and looks livid. I'm loving Casual Evans over here. We're in hour 3 of the Zupcast. His pants are 11. The Podmark.
Starting point is 00:02:03 The Podmark. I'm Robert Evans. This is Behind the Bastards, the show where we tell you everything you don't know about the very worst people in all of history. And again, this is part 3, so listen to the other two episodes first. Or if you're a renegade, like Mark Zuckerberg. Like that song about the Jeeps? Yeah, play by the rules. As you can tell, we are all deep into bags of Doritos.
Starting point is 00:02:28 We are deep into bags of Doritos in order to handle the stress of being in a hardcore zuck hole. Yeah, a zuck hole. Fully in the zuck vortex. Fully zucked. In the ear. Zucked in the ear. In the asterisk ear. We're getting Zed in the E.
Starting point is 00:02:47 Thankfully, eating a nice D helped me deal with that. The D meant Dorito. Get the D in the M. The D in the M. That's what busy people say. Time to say Doritos every time. Too much going on in this work of day world. In March of 2010, some asshole named Robert Evans published an article on Cracked.com.
Starting point is 00:03:13 Title, five reasons the internet could die at any moment. I am not proud of the title. 2010 was a different time, but one of the entries on that ultra-clickable listicle is relevant to our current topic. It was about the worry that something called the strip mall effect was rapidly destroying the wild and weird internet that most of us grew up on. The place where each new click was as likely to bring you to Goatsy as it was to some dude's meticulously archived coin collection or a website full of Roy Orbison cling wrap fetish fiction. And nothing in between. No, I mean literally everything in between. It was a fun place.
Starting point is 00:03:46 2010 was the year that Facebook hit 500 million users. People were spending more time there than they'd ever spent on a website. The worry was that Facebook and a few other giant consolidated social media sites would swallow up the weird little websites that had given the internet its character. Much in the same way that Walmart and Target had swallowed up the tiny shops in family-owned businesses that once dominated Main Street America. Fortunately that didn't happen. Oh yeah. I mean I don't need to say that that's exactly what went down. I mean and I didn't make that call somebody else that I was citing in the article did, but it was prescient.
Starting point is 00:04:17 That's exactly what happened to the fucking internet. The term they used at the time was SplinterNet, which is not a term we wound up using. I kind of like it. I kind of like it. I think it defines what Facebook did pretty well. Now, something else happened though that almost no one predicted. I certainly didn't. Rather than being an engine for the spread of knowledge as many techno-utopians had hoped at the turn of the millennium, the internet became the greatest engine for the spread of bullshit ever conceived.
Starting point is 00:04:38 Rather than bringing people together, it facilitated division and hatred on an unprecedented scale. Facebook is not the only culprit behind this, but it's probably the largest one. I don't think this was inevitable. I think most of the negative impacts we've seen Facebook have can be tied directly to the things we already know about Mark Zuckerberg based on the first two parts of this podcast. There are a couple of clear facts I've established about the man that I'm going to list. Number one, he believes Facebook fundamentally is good, and so keeping people on the site longer is also fundamentally good. Number two, because he guessed one thing about the future correctly, once he thinks he is always right about where the future is headed, and three, he has no problem with lying, cheating, and stealing to expand Facebook and furthers the things he believes to be inevitable.
Starting point is 00:05:19 Yeah. Uh-huh. Now, in April of 2016, Mark Zuckerberg announced to the world that within five years, Facebook would be almost entirely video. Video, he assured us, was how most people now preferred to consume their content. This is what the kids wanted. He didn't know that, as Jamie Loftus knows, the kids wanted Doritos. The kids went, Robert, do you see these two Doritos hooking up? They're making love. When it's Doritos, they are always making love and never hooking up.
Starting point is 00:05:48 They're scissoring, Robert. Lovingly scissoring. It is beautiful. Yeah, don't eat those. Let them have their beautiful romance. This is my Christ in a Chip. This is my The Notebook. Yeah, no, that whole video content worked out great.
Starting point is 00:06:05 Speaking of some of these... Yeah, aren't we still all employed by your video content creator? That was all worked out, yeah. I remember when my friends and I all had health care. Yeah. And then this happened. Honestly, congrats on getting that far. Yeah, actually, I'm a little older than you.
Starting point is 00:06:21 I had a couple of extra years. I hit the internet at Sweet Spot. I came in too late. They were like, butchug something and we'll give you $75 for you. Okay, and I did, and it's there forever. I never run for office. But the $75 is... No, I will say this.
Starting point is 00:06:39 The one good thing about President Donald Trump, there is no way in which you're disqualified from office for butchugging. That's true. A guy got to the Supreme Court and we talked about boofing in Congress. Yeah, that is true. Boofing in Congress will be the name of my memoir. Because we're just going to cut the middleman in boofing. That's boof it. I was so weirded out to hear a term that my friends and I used when I was damaging public property as a 19-year-old.
Starting point is 00:07:05 Especially from a man with gigantic pores just being like... Yeah. Boof it. Always my favorite term. And again, I don't want the creepiness of Brett Kavanaugh to make boofing look like a bad thing. Because boofing is an inherently noble action. It's gorgeous. It's beautiful.
Starting point is 00:07:22 Much like two Doritos making love. I'm still looking at it. I haven't turned away. Maybe you guys noticed they've started moving. We're not looking at them. They're moving in together. They're always moving in your heart. So, Mark Zuckerberg tells everybody that within five years, Facebook is going to be almost all video.
Starting point is 00:07:40 Video is how most people consume their content. He tells everyone this is what people want. By this point, Facebook had more than 1.6 billion monthly users. The site was, increasingly, the place where ad dollars were being spent. In my first quarter of 2016, 85 cents out of every dollar spent in online advertising went to either Facebook or Google. So if Facebook says, make video, you make video. Yeah. Now, Maggie, you and I were both working for a website named Crack right around this time.
Starting point is 00:08:06 We're not working there anymore. Why not? Do you remember that fun six months when we all got extra money to make videos? You know what? That was really fun, wasn't it? That was a fun six months. That was fun. That was wild.
Starting point is 00:08:17 I didn't get to make video, but you guys did. Yeah, you guys did. I liked those videos. Aw, thanks, Jamie. Hey, fun videos, guys. Quality videos, great videos directed by great people. Yeah. Yeah.
Starting point is 00:08:26 And then... Anyway, video companies that have just shut down suddenly. Yeah. They're like, I just contacted them to get like tax information and they're just like, we just destroyed everything. We burnt our files. Yeah. The one, I was working at a different company that did like little like news videos and
Starting point is 00:08:46 they too, like it was like one minute we were working in the office, then my friend and I complained about one of our sexist bosses and we got told to work from home and then the company went under. Jesus Christ. Well, at least you get your stuff out. Yeah. Yeah. I got paid.
Starting point is 00:09:03 So, Jamie, you have an out. I have to leave. You're going to flee now. I have to leave. You have to go back. Guys, I didn't want to like make a big deal of this, but there is someone like waiting for me outside and I am going to go back and have my name kind of like bleeped out every time it's mentioned because I'm...
Starting point is 00:09:23 Are you on a date with Marzak? No. No. No. No. No. No. Congratulations.
Starting point is 00:09:32 Thank you so much. Oh, my God. Thank you so much. Now, when I got to ask before you go, when one of them climaxes. Of course. Don't even need to. Does the other make this sound? But like pitch it up an octave.
Starting point is 00:09:49 Pitch it up. That's it. And they come bitcoins. Oh, my God. Which is pretty new. Do they go right into your wallet? Sopping wet bitcoins. Absolutely disgusting.
Starting point is 00:10:00 Sorry, guys. They're waiting in their Bitcoin mobile. You go find love like these Doritos. Goodbye, friend. Well, we've lost a loftus. Yeah, I feel it. I feel the absence. We are unalofted.
Starting point is 00:10:14 Unalofted. We are de-lofted. De-lofted. But we still have a Maggie May fish, which is pretty great. I'm all right. No, I'm no Winkleboss. Well, you're one and a half Winklevi. No.
Starting point is 00:10:29 Oh, thank you. I would say so. Thank you. I'm so happy those weird guys are stuck in pop culture forever now. They're like a permanent fixture. And the only images we'll ever have of them is them rowing in a boat and being angry about Facebook. And that's them forever.
Starting point is 00:10:48 They'll never do anything to surpass that. And we don't care. We won't ever care again. I will never care. Beautiful. Now, Facebook offered a lot of incentives to websites that pivoted to video. For a short time, they even offered a partnership where they would pay you to post videos on their site.
Starting point is 00:11:05 The entire digital media world very quickly swerved to oblige what they thought were just the new realities of the world. In 2016 and 17, MTV News, Vice, Valkative, Mike, and Mashable all fired writers and editors and put more resources into hiring new video teams. Now Maggie and I both worked for a, I would say, medium-sized digital media company at the time. We crack did not lay off its editors and writers to hire video people, but we started pumping a ton of company resources into making videos.
Starting point is 00:11:31 And the people who'd been writing articles started spending more and more of their time pivoting for them sweet, sweet Facebook dollars. Which honestly, as someone who didn't know any of the tech stuff of it, made sense. It seemed great. It seemed great. It seemed to make sense with YouTube and, you know, so. This is what the kids wanted. Yeah.
Starting point is 00:11:50 Zuckerberg knew. Zuckerberg knew. That's what we're about to get to. Now, this was not happening in a vacuum. From about 2013 to 2015, I think Facebook was very good to most of us because people liked sharing our articles and Facebook's algorithm ensured the articles people liked to share got shared to huge audiences. That really started to shift in 2016.
Starting point is 00:12:07 We started seeing the same kind of traffic from Facebook. Every few months, they tweak their algorithm again and traffic would fall. This was all part of a strategy Mark Zuckerberg outlined in an internal email back in 2012. The answer I came to is that we're trying to enable people to share everything they want and to do it on Facebook. Sometimes the best way to enable people to share something is to have a developer build a special purpose app or network for that kind of content and to make that app social by having Facebook plug into it.
Starting point is 00:12:31 However, that may be good for the world, but it's not good for us unless people also share back to Facebook and that content increases the value of our network. So ultimately, I think the purpose of platform, even the read side is to increase sharing back to Facebook. So Facebook was doing well for us, but Facebook did not think they were doing well enough by us because they didn't want people ever off of Facebook. The Splinternet. Because it is a good, it is...
Starting point is 00:12:56 It's inherently good if people spend more time on Facebook. It's inherently good if people do it. And it's inherently good if like Pistoff Boomer News X-19 looks just as credible in a headline form as the New York Times because it's all on your Facebook timeline. Right. How could that end badly? Yeah, and we're not going to check and we will check, but we're not going to pull any cards here.
Starting point is 00:13:16 Yeah. So here or so there, digital media companies worshiped Facebook's algorithm as if it was some sort of mysterious elder god, inscrutable but capable of delivering vast rewards if properly appeased. We published Facebook Instant Articles because Facebook paid us for those and tacitly promised that they would increase traffic for our other content. And of course, we filled the internet with videos. Now unbeknownst to most of us, in August of 2016, Facebook quietly published a blog post
Starting point is 00:13:40 on its Advertising Help Center admitting that it had wildly overestimated the amount of time people spent watching videos on Facebook. It turned out they'd been counting any video views longer than three seconds for their average duration of video viewed metric and discarding any views of less than this amount of time. If you understand basic mathematics, you may recognize this as providing wildly inaccurate information about how value video ads on Facebook were. So they were ignoring any time someone scrolled past the video, like you do 90% of the time
Starting point is 00:14:06 that didn't count, only if you watched it for more than three seconds. Right. Well, yeah. That makes sense. Yeah. Yeah. Doesn't it? Yeah.
Starting point is 00:14:16 Yeah. Now, even that blog post had the reality of the situation. Here's Vanity Fair. Quote, according to a new lawsuit that allowed a group of small advertisers in California to review some 80,000 pages of internal Facebook records, it appears that Facebook was actually aware of the issue long before it claimed. At the time, Facebook told advertisers that it had overestimated views by about 80% at most.
Starting point is 00:14:36 But in Tuesday's complaint, the plaintiffs alleged that average viewership metrics had been exaggerated by 150% to 900%, 900%. Oh, my God. It's so infuriating. That's mine, my buddies, and I don't have health care no more. It's just, it's so, it's wild that- Shout out to Tom Reiman and David Bell and the Gamefully Unemployed Network, back them on Patreon, because they don't have health care anymore.
Starting point is 00:15:03 Yeah, take it like, and they're great people, and it's so, it's infuriating because it's just, how can someone think that they are good at business and then do things like this? Yeah, that's insane. That is insane. 900%. That, okay, that's like if Marie Kondo came to your apartment and as she was cleaning up, just stole most of your clothing and walked away. It's not on the ground anymore, isn't it?
Starting point is 00:15:27 What's your problem? It's clean, isn't it? Anyway, you'll make a profit after a- I'm gonna go sell this shit to Buffalo Exchange. The result of all this was the digital media crash we are all still dealing with today, which is not to say that companies did not make mistakes, like all those companies that fired their writers in order to hire them. Errors were made, but they were made based on the fact that the company that was responsible for all of the ad money lied to us.
Starting point is 00:15:53 Blatantly, and so blatantly, I don't think anyone could have predicted that they were inflating their numbers by 900%. 900%. Because who does that? Sorry, 150 to 900%. Right, of course. The advertisers who also got screwed over, although I'm not super- Actually, I love advertisers, they're great.
Starting point is 00:16:09 Well, when they are ethical- When they're ethical, like by advertising on our podcast. Here you go. Doritos. Doritos. Now, you're wonderful, Maggie. The digital media crash was exacerbated by a number of things, including the fact that after the 2016 election, Facebook de-emphasized and limited the spread of content from brands,
Starting point is 00:16:31 largely as a reaction to complaints about the spread of fake news on their platform. Thousands upon thousands of journalists, writers, and other creatives lost their jobs. The results of this were earth-shattering to many of us. To Mark Zuckerberg, it all came down to a couple of speeches he probably only half are members. Now, I started with this example of a thing Facebook broke because it's very personal to me, but the consequences of Mark Zuckerberg's bad decision-making have amounted to a lot more than a few thousand lost jobs.
Starting point is 00:16:54 Let's talk about Myanmar. Oh, here we go. The ethnic cleansing of the Muslim Rohingya in Myanmar by the Buddhist majority has to date forced more than 650,000 people out of their homes. Tens of thousands have been massacred. 43,000 dead seems to be the low end of the body count estimates. Last year, the United Nations announced that Facebook had played a determining role in the massacre.
Starting point is 00:17:14 United Nations. Determining role. Other social media was also blamed, but Facebook was by far the big kahuna, quote from the United Nations. It has substantively contributed to the level of acrimony and dissension and conflict, if you will, within the public. Hate speech is certainly, of course, a part of that. As far as the Myanmar situation is concerned, social media is Facebook, and Facebook is
Starting point is 00:17:35 social media. The how here has a lot in common with the whole issue of fake news and the spread of violent divisive content that's turned American politics upside down. It also has connections to Russia, because of course it does. Remember how, in 2009, Facebook introduced that newsfeed thing? Oh yeah! I remember. In addition to turning the internet to a walled garden, sucking in ever more ad dollars, it
Starting point is 00:17:56 also ensured that divisive content would spread further and faster than it ever had before. This is because, per Mark Zuckerberg's stated desires, Facebook's algorithm praised time spent on Facebook more than anything else. What kind of content drives that sort of engagement? Why the kind of content people argue over, and get angry over? To Facebook, pissed off people are the people who aren't going to leave Facebook. They'll keep commenting, fighting, and sharing. It took a little while, but oppressive regimes around the world realized this.
Starting point is 00:18:22 One of those regimes was Myanmar's military junta, partly pushed out of power in 2011, but still very powerful and very shitty. Here's the New York Times. They began by setting up what appeared to be news pages and pages on Facebook that were devoted to Burmese pop stars, models, and other celebrities, like a beauty queen with a pin shot for parroting military propaganda. They didn't tend the pages to attract large numbers of followers, said the people. They took over one Facebook page devoted to a military sniper, Oen Mong, who had won national
Starting point is 00:18:47 acclaim after being wounded in battle. They also ran a popular blog called Opposite Eyes that had no outward ties to the military. Those then became distribution channels for lurid photos, false news, and inflammatory posts, often aimed at Myanmar's Muslims. Troll accounts run by the military helped spread the content, shout down critics, and fuel arguments between commenters to rile people up. Often they posted sham photos of corpses that they said were evidence of Rohingya-patrated massacres.
Starting point is 00:19:12 This is interesting. So we watched the front-line documentary where we talked a little bit about this. To bring it back to how Zuckerberg never learned and never grows up, the way he cheated on his final exam at Harvard was to make a fake account on Facebook, post a divisive article about the art that he was supposed to appraise, made another fake Facebook account to stoke arguments on his page so that he could write an essay made off of other people's arguments, and he went on the site to make sure people kept talking about it, kept fighting over it.
Starting point is 00:19:51 Wow. And that's included in the social network movie. And it's just wild to see him do the exact same thing that led to- That the military of Myanmar did in order to engage in an ethnic cleansing. It is the exact same thing and how he can keep claiming that he has no idea. Go ahead. Let's see. Myanmar received training from the Russian government because they were already pretty
Starting point is 00:20:12 good at doing this sort of shit. They're pretty good. Yeah. Funny. For all they do. They're pretty good at it. Funny that shitty people all think the same. In the meme.
Starting point is 00:20:21 In the meme. Yeah. Good times. Oh, we are having fun. I know. Every once in a while, I just glance at the two chips having sex just to remind myself they're still beauty in the world. That's what Doritos is there for, to remind you of the world's wonder, the splendor of
Starting point is 00:20:36 the world. Doritos. Now, for most people in Myanmar, Facebook is the internet. This is because of a plan launched by our man, Mark Zuckityzuck, himself. The Earzucker. The Earzucker. Now, we're going to get into that plan, but first, we're going to get into both products and occasionally, if we have time, services, products.
Starting point is 00:21:02 During the summer of 2020, some Americans suspected that the FBI had secretly infiltrated the racial justice demonstrations, and you know what? They were right. I'm Trevor Aronson, and I'm hosting a new podcast series, Alphabet Boys. As the FBI sometimes, you got to grab the little guy to go after the big guy. Each season will take you inside an undercover investigation. In the first season of Alphabet Boys, we're revealing how the FBI spied on protesters in Denver.
Starting point is 00:21:37 At the center of this story is a raspy-voiced, cigar-smoking man who drives a silver hearse. And inside his hearse was like a lot of guns. He's a shark. And not in the good and bad ass way, he's a nasty shark. He was just waiting for me to set the date, the time, and then, for sure, he was trying to get it to happen. Listen to Alphabet Boys on the iHeart Radio app, Apple Podcast, or wherever you get your podcasts.
Starting point is 00:22:01 I'm Lance Bass, and you may know me from a little band called NSYNC. What you may not know is that when I was 23, I traveled to Moscow to train to become the youngest person to go to space. And when I was there, as you can imagine, I heard some pretty wild stories. But there was this one that really stuck with me about a Soviet astronaut who found himself stuck in space with no country to bring him down. It's 1991, and that man, Sergei Krekalev, is floating in orbit when he gets a message that down on Earth, his beloved country, the Soviet Union, is falling apart.
Starting point is 00:22:44 And now he's left defending the Union's last outpost. This is the crazy story of the 313 days he spent in space, 313 days that changed the world. Listen to the last Soviet on the iHeart Radio app, Apple Podcast, or wherever you get your podcasts. What if I told you that much of the forensic science you see on shows like CSI isn't based on actual science? The problem with forensic science in the criminal legal system today is that it's an awful
Starting point is 00:23:20 lot of forensic and not an awful lot of science. And the wrongly convicted pay a horrific price. Two death sentences and a life without parole. My youngest, I was incarcerated two days after her first birthday. I'm Molly Herman. Join me as we put forensic science on trial to discover what happens when a match isn't a match and when there's no science in CSI. How many people have to be wrongly convicted before they realize that this stuff's all
Starting point is 00:23:54 bogus? It's all made up. Listen to CSI on trial on the iHeart Radio app, Apple Podcast, or wherever you get your podcasts. We're back. We've been produced and serviced. That's not the way to frame that. And the ads were-
Starting point is 00:24:13 We have been- Oh, man. Products. Yeah. Woo. I'm going to have me a Dorito. Cover that up. Mm.
Starting point is 00:24:23 Mm. That's good. That crunchy taste. Yeah. I like licking the cheese off of the chip. Fantastic. I'm not proud of myself. This is going to be either the best episode or the worst episode for the ASMR crowd.
Starting point is 00:24:36 It's really hard to predict. The dog is laying on my coat. Aw, she likes the way your coat smells. Aw. Dogs. It's my sweat. I'm- I'm-
Starting point is 00:24:48 Mark Zuckerberg in that interview. Yeah. I'm just sweating out of all of my holes. Aw. And dogs love that. Yeah. Because they don't judge. Aw.
Starting point is 00:24:56 Aw. A dog would love Mark Zuckerberg if he were capable of human affection. Man. That's sad. Wow. Ah. You're like, ah. He has a dog named Animal.
Starting point is 00:25:04 Oh, my God. No, no. That's just a lie. Oh, okay. That's just a lie. He has a dog or not. If he did, he would name an animal. Yeah.
Starting point is 00:25:12 Or farm animal. Or farm animal. Mm-hmm. And then rate whether or not girls he met were hotter than it. Yeah. I don't think I'm hotter than a cow. I've been thinking about this entire time since part one. He has a dog.
Starting point is 00:25:22 It looks like a mop. He has an expensive looking dog that looks like a mop. Oh, wow. It does look like a mop. He would choose a dog that looks like an object. Yeah. Because objects are valuable to him. And dogs are objects to some people.
Starting point is 00:25:34 Yeah. Now, for most people in Myanmar, Facebook is the internet. And this is because of a plan launched by Mark Zuckerberg, as I stated in the last one. The plan had its roots in 2012 when Facebook first went public. As part of an IPO, investors get research on both a business's potential and its potential pitfalls. One problem that was noted for Facebook in the future is that by the time it went public, it had already connected virtually every human being in the parts of the world with widespread internet access.
Starting point is 00:25:59 There just wasn't a lot of room for the company to grow. Everybody in Europe and America is everybody in the countries that have a lot of internet. Oh, well, we make money. Yeah. Because it gets sucked into the other. But yeah, yeah, yeah, yeah. So Zuckerberg realized that in order to expand Facebook, he would need to connect the world. So he started partnering with makers of cheap mobile phones and service providers initially in the Philippines,
Starting point is 00:26:21 providing their customers with free data when they used Facebook and just Facebook. These first steps seem to work well. So Mark announced a formal plan in 2014 at the Mobile World Congress in Barcelona, which is a fun week if you're in tech journalism. Oh. For another drunken Matt vomiting story, I had to go to like deal with a bunch of like one of these product showcases at a hotel. And I got really sick either because of the drinking or because I ate some bad paella. And I vomited in front of the hotel and then got into a cab.
Starting point is 00:26:48 And the cab driver asked me, did you see the king? And I was like, what do you mean did I see the king? And he pointed like a flag flying on top of the hotel. And he was like, whenever that flag is there, the king's at the hotel. And I was like, I think I might have just puked in front of the king's limousine. Wow. That's my Matt hasn't changed or learned anything or Robert hasn't changed or learned anything in 30 years story. I'm kind of proud of that.
Starting point is 00:27:11 Thank you. I'm picturing it and I'm proud. I still have problematic substance abuse issues. And I'm fine with that. You know why? I guess you didn't exacerbate an ethnic cleansing in Myanmar. You know what? This guy.
Starting point is 00:27:24 Yeah, I'm gonna say you're fine. Exactly. You know, you read about these people like Dick Cheney or like George Bush, who had horrible substance abuse problems and then sobered up and then killed millions. Right. What if they'd kept drinking and doing coke and died at 50? Better world. I absolutely agree.
Starting point is 00:27:37 Absolutely. In that case. If your only other option is drugs or the presidency, choose drugs. Please choose drugs. Please choose drugs. Please choose drugs. Please don't go into politics after sobering up. No.
Starting point is 00:27:49 No. Yeah. Oh boy. We are on dangerous ground with this podcast. Oh my God. Well, it is funny that Zuckerberg did try to incite the idea of presidency. I have been on record as saying that I think a great TV show idea would be about a time-traveling drug dealer who finds horrible people in history, like Saddam Hussein,
Starting point is 00:28:06 and gets them hooked on pills before they can kill people. Like if Hitler had just had oxy, no holocaust. No holocaust. Hitler's just sitting in a room listening to fucking Wagner and taking a shitload of pills until he dies. What a beautiful world that would have been. What a beautiful world. Time-traveling drug dealer.
Starting point is 00:28:22 If anyone listening is with a network. Or a time-traveling drug dealer, go for it. And cast Maggie May Fish as your lead. Oh great. I saw those pictures you did when you were like a 20s detective. You could rock the look for the episode in the 20s about Hitler. Oh my God. Yeah.
Starting point is 00:28:37 I'll do it. Yeah. I'll do it. Not gonna be great. Boy, howdy. That was quite the digression. Oh my God. It's like a Facebook status.
Starting point is 00:28:45 We keep pausing to discuss and then come back to the feed. Yeah. Well, back to the feed. So Mark announced his plan to connect the world at the Mobile World Congress in 2014. Here's a quote from the Guardian covering a speech he gave that day. This is Mark talking. There was this Deloitte study that came out the other day, he told his audience, that said if you could connect everyone in emerging markets you could create more than 100 million
Starting point is 00:29:05 jobs and bring a lot of people out of poverty. The Deloitte study, which did indeed say this, was commissioned by Facebook based on data provided by Facebook and was about Facebook. Now, the crux of Mark's plan involved giving people in poor countries free internet access to a limited selection of websites. Mark started with Zambia, but India was the real prize, with 600 or 700 million potential new users. Now, there were some signs that just rolling Facebook out for free in these places might
Starting point is 00:29:31 be bad. In 2012, a series of fake images began circulating on Facebook, purporting to show the massacre of Muslims by Buddhists. This sparked a riot that left several dead. But Mark did not pay this much heed. He rolled right ahead with internet.org and began connecting the world. Zambia, India, the Philippines, Sri Lanka, and a little country called Myanmar. Now, last year, in an interview with Vox, Mark directly responded to the claims made
Starting point is 00:29:56 by the United Nations about the genocide his social network was enabling and making much worse. He brought up a recent success story, two fake news chain letters that had been circulating on Facebook before they were caught and deleted. Quote from Mark. So that's the kind of thing where I think it is clear that people were trying to use our tools in order to incite real harm. Now, in that case, our systems detect that that's going on.
Starting point is 00:30:17 We stop those messages from going through. Now, as soon as the interview was published, it provoked fury from activists and social media researchers in Myanmar who were actually working to stop the spread of fake news and save lives. Their response to Mark is pretty damning. I'm going to read a healthy excerpt from it. As representatives of Myanmar civil society organizations and the people who raised the Facebook Messenger threat to your team's attention, we were surprised to hear you use this case
Starting point is 00:30:41 to praise the effectiveness of your systems in the context of Myanmar. From where we stand, this case exemplifies the very opposite of effective moderation. It reveals an over-reliance on third parties, a lack of proper mechanism for emergency escalation, a reticence to engage local stakeholders around systemic solutions and a lack of transparency. Far from being an isolated incident, this case further epitomizes the kind of issues that have been rife on Facebook in Myanmar for more than four years now and the inadequate response of the Facebook team.
Starting point is 00:31:06 It is therefore instructive to examine this Facebook Messenger incident in more detail, particularly given your personal engagement with the case. The pictures were clear examples of your tools being used to incite real harm. Far from being stopped, they spread in an unprecedented way, reaching country-wide and causing widespread fear and at least three violent incidents in the process. The fact that there was no bloodshed is a testament to our community's resilience and to the wonderful work of peace-building and interfaith organizations. This resilience, however, is eroding daily as our community continues to be exposed to
Starting point is 00:31:32 violent hate speech and vicious rumors which Facebook is still not adequately addressing. Damn, that's eviscerating. They reported those posts to Facebook, which eventually, a couple of days later, I think, removed them and then Mark Zuckerberg lied in an interview and said that Facebook caught them and removed them. Yeah. The motherfucker. Yeah.
Starting point is 00:31:54 One of the things they hit Facebook on most in that letter was an overreliance on third parties. In this case, the third parties, of course, were the people writing this open letter. Quote, we identified the messages and escalated them to your legal team via email on Saturday the 9th of September, Myanmar time. At that point, the messages had been circulating for three days and they continued to circulate for several days after they were reported. So let me be clear exactly about what happened.
Starting point is 00:32:13 Number one, after years of bloodshed and racism spread by Facebook, local activists managed to warn Facebook in a timely manner about a new threat. Number two. In a miracle. Facebook listened to them and removed the threat. Days after it had first been posted. Number three, when Mark Zuckerberg took flak for enabling an ethnic cleansing, he touted this as a success, erasing the existence of local activists in Myanmar and pretending
Starting point is 00:32:32 Facebook itself had done the job. It's the way that Zuckerberg speaks, because he does view himself as such a genius. Yeah. He often says no one could have seen it coming. Yeah. No one could have. It's been happening for years, dude. People have been telling you for years.
Starting point is 00:32:48 Several people could have stepped in at various moments. Now, Facebook, not Mark Zuckerberg, did issue a response and apologize for erasing the local activists in Mark's first response. Now, Myanmar is the most shocking example of Facebook enabling unspeakable evil, but it is not the only one. While internet.org seems to be something of a failure in India, oddly enough, in part because of a massive grassroots net neutrality campaign. They got like a bunch of Indian peasants to like understand net neutrality and realize
Starting point is 00:33:15 they wanted it. And like, it's a really cool story that we will not cover in enough detail because this is a sad podcast about bad people. Yeah, yeah, yeah. Yeah. But it is a cool story. Check it out. Fake news spread through Facebook has exacerbated ethnic tensions between Muslims and Buddhists
Starting point is 00:33:28 as well as Muslims and Hindus, leading to numerous angry mobs and several deaths. There has been quite a lot of bloodshed as a result of Mark's relentless desire to connect the world. Here's a quote from the fantastic, utterly indispensable New York Times article "Where Countries Are Tinderboxes and Facebook Is a Match." Solid titling. Wow. Solid titling.
Starting point is 00:33:45 That's sexy. That is a sexy title. That's a hot title. That's a hot title. Almost makes me forget that they platformed the dictator of Turkey in an op-ed recently. But that's on the op-ed section. Okay. You know, that's different from these guys.
Starting point is 00:33:57 Yeah. They're good, hardworking journalists. Quote, last year in rural Indonesia, rumors spread on Facebook and WhatsApp, a Facebook-owned messaging tool, that gangs were kidnapping local children and selling their organs. Some messages included photos of dismembered bodies or fake police flyers. Almost immediately, locals in nine villages lynched outsiders they suspected of coming for their children. Near identical social media rumors have also led to attacks in India and Mexico.
Starting point is 00:34:20 Lynchings are increasingly filmed and posted back to Facebook where they go viral as grisly tutorials. That kind of content does really well. You don't get to 500 million friends without making a few enemies. Oh my God. Oh my God. One of these spokespeople that Facebook puts up on the front line, if she was confronted with that, she would say, yes, actually, we are aware.
Starting point is 00:34:44 We are aware. We are aware. Of the lynching gangs. We just, I mean, we're a company that came from a dorm room. I don't know if you saw the movie. A dorm room. A dorm room. Isn't that quirky?
Starting point is 00:34:55 Isn't that quirky? We're just like, you know. We were founded in a dorm room and now college students are being dragged out of their dorm rooms and beaten to death in several countries for being gay. Isn't that great? We hear you. Yeah. We hear you.
Starting point is 00:35:09 It's not our fault that fake news about them assaulting people spread on Facebook and then they got murdered. Right. It is not our fault. We like dorm rooms. And if you try to put in any law to stop us, we'll just get slower and worse at doing this. So don't you fucking dare.
Starting point is 00:35:24 Don't you fucking dare. Don't you dare. We're Facebook. We're Facebook. Now, this shit has happened in Sri Lanka too. Last year in the capital city of Colombo, an anti-Muslim video went viral. Activists and government officials watched in horror as prominent racists posted things like, kill all Muslims.
Starting point is 00:35:39 Don't even save an infant. And let Facebook's algorithm carry it off to millions of angry armed people. Now, social media analysts in Sri Lanka flagged the video and that baby killing post and then sort of sat back to see if anything would happen. Despite repeatedly complaining about the horrific violence unleashed by Facebook, the company had not provided these activists with any kind of direct line. Facebook had assured them the tool would work well enough. The anti-Muslim video and the kill even babies post were found not in violation of Facebook
Starting point is 00:36:06 standards. One of these researchers who helped flag the videos told the New York Times, quote, you report to Facebook, they do nothing. There's incitements to violence against entire communities and Facebook says it doesn't violate community standards. Now, Facebook standards can be hard to parse out. Or understand. Or understand.
Starting point is 00:36:24 On, like, a human and emotional level. Yeah, since they are a private company, we have no right to that information. The best that we can do is look to some of the comments Mark Zuckerberg himself has made on similar matters. Shall we? Yeah, let us shall. In an interview with The Guardian, he was asked about the proliferation of Holocaust denial talking points on his site.
Starting point is 00:36:44 He called such content deeply offensive, but said, quote, I don't believe that our platform should take that down because I think there are things that different people get wrong. I don't think that they're intentionally getting it wrong. It's hard to impugn intent and to understand the intent. I just think as abhorrent as some of those examples are, I think the reality is that I also get things wrong when I speak publicly. Oh, so, oh, well, I mean, it makes sense. He is a liar and a thief, so he should run a site that allows other liars and thieves.
Starting point is 00:37:18 Because that's who he is. He's a billionaire. I get things wrong and I accidentally claim credit for the work of diligent activists who are trying to stop the damages of my platform. And so also, Holocaust deniers get things wrong too, and I can't, I can't be angry at them. Yeah, I can't because who am I to get angry at them? Now, Facebook is a private company. They are publicly traded, but they are a private company and they can set a policy of censorship
Starting point is 00:37:44 for any of their, like any of this internal stuff. They can claim that this is all, like our standards and stuff is like a business like thing that we need to keep secret, otherwise other social media companies could copy it or whatever. Yeah. Exactly. So, I just love that quote. Yeah, it's a good quote.
Starting point is 00:38:00 Well, if they're honest Holocaust deniers, why would we censor them? If they honestly think Muslim babies should be killed, why would we censor them? It's their opinion that Muslim babies should be murdered. And that's okay on Facebook. In my online country, that's okay with me. That's okay. Now, in that interview, Zucky Zuck Zuckaroo stated his opinion and presumably Facebook's stance that offensive speech only crossed a line when it endangered people.
Starting point is 00:38:25 We are moving towards a policy where misinformation that is aimed at or going to induce violence, we are going to take down. If it's going to result in real harm, real physical harm, or if you're attacking individuals, then that content shouldn't be on the platform. So it's sweet to know that kill these Muslims and their babies did not cross that line. Right. For him personally. For him personally.
Starting point is 00:38:43 Cross that line. Maybe he doesn't understand what danger, or a line being crossed, even means? No, his life would change irrevocably if someone just punched him in the face once, which is why I am in favor of punching rich young men in the face. I agree. I think he would therefore maybe understand. Oh, what these people in Sri Lanka and Myanmar are going through might be like that one time I got hit in the face and I hated that.
Starting point is 00:39:08 Oh, no. Oh, no. Me. Maybe that would help. Oh boy. In that same interview, Marzok also addressed the hate speech ethnic cleansing problem in Myanmar. Well, people use tools for good and bad, but I think that we have a clear responsibility
Starting point is 00:39:26 to make sure that the good is amplified and do everything we can to mitigate the bad. Oh, one of their, I don't know, spokespeople verbatim said about Myanmar: they had a bad experience. A bad experience. They had a bad experience. Both mobs and an ethnic cleansing. And so we, as a company, we want to make sure that we want to reduce the bad and increase the good.
Starting point is 00:39:52 And increase the good. Yeah. Now, don't ask me any more questions. I just imagine her with, like, a microphone, talking to some lady who's been beaten to death in the street for being a Muslim in Myanmar: so, could you tell us, how could Facebook improve your experience? Right. How could we make this better?
Starting point is 00:40:05 How can we fix this? Just a little bit. No, not that. We're not going to take it down. We're not going to take it down. No. No. No.
Starting point is 00:40:15 When he said your baby should die, that was not a violation of our terms. Right. Because maybe your baby should die. What if we improved the timeline? Would that help you? Yeah. How's that? We could make it easier for you to tell people that your baby got murdered.
Starting point is 00:40:27 Are you dead? Okay. Let's get someone else. Okay. Okay. So maybe that does not all clear up exactly what Facebook's line is, but thankfully an internal guide they handed out to their content moderators did leak out, and it included the clearest statement from the company yet on when violent hateful speech crosses a line.
Starting point is 00:40:45 It's like a PowerPoint slide. Introduction is up at the top, and then it says, why do we IP block content? And then there's some bullet points. The content does not violate our policies. We face the risk of getting blocked in a country or a legal risk. We respect local laws when the government has made clear its intention to pursue its enforcement. Holocaust denial, illegal in 14 countries.
Starting point is 00:41:04 We only consider it for the four countries that actively pursue the issue with us. So Facebook will only IP block the spreaders of dangerous content when the governments of those countries actively go after them. We don't care that it's illegal in your country to deny the Holocaust. Only if you threaten Facebook's bottom line will we block Holocaust denial content. Yeah. Yes. Then and only then.
Starting point is 00:41:31 Then and only then. Freedom of speech. If you're a sex worker, you can't use Facebook and post stuff. Oh, absolutely. No, that's not okay. Still the wildest disconnect. Deny the Holocaust? Absolutely fine.
Starting point is 00:41:40 Absolutely fine. Do it all day long. Advertise your business as a sex worker? No. No. No. No, sir or madam. No.
Starting point is 00:41:48 Violence, yes. Sex, no. Violence, yes. Sex, no. No. Very American. Yeah. Very American.
Starting point is 00:41:56 Very Mark Zuckerberg. Mark Zuckerberg. Violence, yes. Sex, no. Actually, that should be our T-shirt. Oh, violence, yes. No. Mark Zuckerberg's face in the middle.
Starting point is 00:42:04 Yeah. I think we got us a T-shirt. Oh my God, I'd wear it. I'll buy that. I'd wear it twice. I'll buy it and then I'll donate mine to my local sex workers. There you go. Yeah.
Starting point is 00:42:12 Yeah. As you all should. They will appreciate that. Yeah. Yeah. Yeah. All right. We got some ads.
Starting point is 00:42:20 All right. Some products. Maybe a service or three. And while we wait for that, I'm gonna, are you just licking that Dorito? Yeah. This is, I actually licked it earlier and this is my second lick. Maximizing the flavor potential. Yeah.
Starting point is 00:42:33 This is a rule. The F.P. Oh my God. Yeah. That's what we call it in the biz. Yeah. Products. During the summer of 2020, some Americans suspected that the FBI had secretly infiltrated
Starting point is 00:42:46 the racial justice demonstrations. And you know what? They were right. I'm Trevor Aronson and I'm hosting a new podcast series, Alphabet Boys. Because with the FBI, sometimes you gotta grab the little guy to go after the big guy. Each season will take you inside an undercover investigation. In the first season of Alphabet Boys, we're revealing how the FBI spied on protesters in Denver.
Starting point is 00:43:15 At the center of this story is a raspy-voiced, cigar-smoking man who drives a silver hearse. And inside his hearse were like a lot of guns. He's a shark. And not in the good and bad ass way. He's a nasty shark. He was just waiting for me to set the date, the time, and then for sure he was trying to get it to happen. Listen to Alphabet Boys on the iHeart Radio App, Apple Podcast, or wherever you get your
Starting point is 00:43:39 podcasts. I'm Lance Bass and you may know me from a little band called NSYNC. What you may not know is that when I was 23, I traveled to Moscow to train to become the youngest person to go to space. And when I was there, as you can imagine, I heard some pretty wild stories. But there was this one that really stuck with me about a Soviet astronaut who found himself stuck in space with no country to bring him down. It's 1991 and that man, Sergei Krekalev, is floating in orbit when he gets a message
Starting point is 00:44:16 that down on Earth, his beloved country, the Soviet Union, is falling apart. And now he's left defending the Union's last outpost. This is the crazy story of the 313 days he spent in space, 313 days that changed the world. Listen to The Last Soviet on the iHeart Radio App, Apple Podcast, or wherever you get your podcasts. What if I told you that much of the forensic science you see on shows like CSI isn't based on actual science?
Starting point is 00:44:54 The problem with forensic science in the criminal legal system today is that it's an awful lot of forensic and not an awful lot of science. And the wrongly convicted pay a horrific price. Two death sentences and a life without parole. My youngest, I was incarcerated two days after her first birthday. I'm Molly Herman. Join me as we put forensic science on trial to discover what happens when a match isn't a match and when there's no science in CSI.
Starting point is 00:45:26 How many people have to be wrongly convicted before they realize that this stuff's all bogus, it's all made up? Listen to CSI on trial on the iHeart Radio App, Apple Podcast, or wherever you get your podcasts. We're back. We just finished talking about when Facebook's cool with Holocaust denial, and the answer is: usually, unless you're Germany and you threaten to go after them. Which, good on Germany. Good on Germany and also I realize there must have been a day where someone wrote down the
Starting point is 00:46:02 four countries that will come after them. Where we have to stop Holocaust denial, only these four places. And everywhere else where it is illegal. We don't care. But it won't affect us. But it won't affect us so we're fine with it. Yeah. Now it would be unfair of me to acknowledge all of this without acknowledging that Mark
Starting point is 00:46:21 Zuckerberg himself does not want any of this shit to spread. In fact, there are anecdotal stories that when Donald Trump first announced his run for president with that uber racist wall speech, Mark wanted to ban Trump from Facebook and his campaign from Facebook. He was reportedly talked down from this. I have no doubt that if Mark Zuckerberg had been on the other end of that flagged murder-the-babies post when it came through, he himself would have deleted the comment and IP-banned that person, any human would.
Starting point is 00:46:47 But Mark didn't put a human in charge of that job. He chose a robot. The biggest problem with internet.org, the reason it has been responsible for so much bloodshed, it was also, by the way, absolutely critical to the rise of Rodrigo Duterte, who has now killed 20,000 people. He has a social media army that harasses and sends death threats via Facebook to his detractors. Anyway, Facebook also sent people out to help train his team in how to use Facebook. Oh, that's fun.
Starting point is 00:47:11 They do do that. Don't they? They send them people, yeah. But the big reason that all of this violence has been possible is that Mark Zuckerberg launched his groundbreaking society-altering technology into countries that neither he nor anyone else at his company understood. This is the Silicon Valley equivalent of the Iraq War, which, by the way, was planned without the input of anybody who spoke Arabic, let alone understood Iraqi culture.
Starting point is 00:47:34 They did have one guy who wanted to take Saddam down so that he could make a bunch of money, because he was a corrupt motherfucker and wanted to be in power himself, but they didn't have any experts on the culture involved in that. This is what Mark Zuckerberg did, too, when he brought the internet via Facebook into these countries. The tool. There's a very short distance for a tool to become a weapon. You just have to hold it above your head.
Starting point is 00:47:56 And there were no people from Myanmar, no people from Zambia, no people who were working for Facebook and on the ground in the country. And letting them know the whole... They're letting algorithms deal with it and figuring that'll be fine. Now, Victoire Rio, a social media analyst in Myanmar, told The New York Times a major issue in the spread of violent rhetoric via Facebook in that country was the fact that Facebook had almost no Burmese speakers that local watchdogs could communicate with. The few people who speak that language and work for Facebook are based in Dublin, which
Starting point is 00:48:23 you may note is not Myanmar. Yeah, not at all. Not at all. Not even close. No, not. And this is where we see, in my opinion, the clearest downside of the move fast and break things ethos. Sometimes the things you break are people.
Starting point is 00:48:38 Maybe if you're going to introduce a service to a new country and that service has the potential to absolutely revolutionize the way they communicate, you should not do that until you have a sizable team of people who speak the language and understand the country actually working for you. Maybe doing anything else is unspeakably irresponsible and perhaps even evil. This is why Mark Zuckerberg is my pick for the worst monster of the 21st century so far. He is not a murderous, violent man like Vladimir Putin or Rodrigo Duterte. In fact, many, if not most of the people who spend a lot of time around him describe him
Starting point is 00:49:10 as warm, decent, a good listener, and an empathetic person. But the 21st century so far is a period defined by arrogant people, mostly men, making rash decisions based on little evidence that have a shattering, violent impact on the lives of millions of people who live far away from them. Mark Zuckerberg is the equivalent of a little kid who asked for a BB gun for Christmas and was given a nuclear warhead, and that is the positive way to spin this, the conclusion that gives him the most credit as a human being. There is a lot of evidence that this is exactly what Mark wanted, that his dream all along
Starting point is 00:49:37 was to become basically the dictator of a digital nation, and that connecting people has been less important to him this entire time than building an empire. In October of 2010 Vanity Fair declared Mark Zuckerberg our new Caesar in an article lauding him as the greatest of the Silicon Valley Titans. I'm going to guess that was a comparison Mark really enjoyed. Here's another quote from that fabulous New Yorker article. He first read the Aeneid while he was studying Latin in high school, and he recounted the story of Aeneas' quest and his desire to build a city that, he said, quoting the text
Starting point is 00:50:09 in English, knows no boundaries in time and greatness. Zuckerberg has always had a classical streak, his friends and family told me. Sean Parker, a close friend of Zuckerberg, who served as Facebook's president when the company was incorporated, said, there's a part of him, it was present even when he was 20, 21, this kind of imperial tendency. He was really into Greek odysseys and all that stuff. At a product meeting a couple of years ago, Zuckerberg quoted some lines from the Aeneid. On the phone, Zuckerberg tried to remember the Latin of the particular phrases.
Starting point is 00:50:35 Later that night, he IM'd me to tell me two phrases he remembered, giving me the Latin and then the English: "fortune favors the bold" and "a nation/empire without bound." We are all duly aware of how fake news spreads on Facebook and the impact it may have had on the 2016 election in the United States and for some of us, our relationships with our family members. Mark denied this at first, but he has gradually copped to a tiny amount of responsibility for the hundreds of thousands of fake news pieces Russia's Internet Research Agency managed
Starting point is 00:51:02 to promote, although he thinks it's silly to blame the results of the election on that. There have been numerous calls since 2016 for Mark to step down from his creation. He has so far refused them all. At present, despite a falling stock price and all the corpses, Mark Zuckerberg has no plans to give up his empire. And I don't think he ever will. No, I would. No, I would.
Starting point is 00:51:24 At this point. He is my pick for worst person of the 21st century because he's like Hitler, who pretty much everybody's going to call the worst person of the 20th century. Mao killed more people, but Mao killed more people mostly by accident. Hitler killed them because he wanted to kill them and kill them faster. Concentration camps were a huge deal in that century. Hitler is the guy who really figured out how to make them the most efficient, most vile, most terrible and effective.
Starting point is 00:51:47 Mark Zuckerberg is the greatest, George Bush is another one, just a dumbass guy who got in power. He messed with a place he didn't understand and caused unspeakable harm. And Mark Zuckerberg is that guy, but for the world. It's just fascinating. And the reason I kept saying that Mark Zuckerberg is dumb is because I think one, it would have hurt his feelings the most. Yeah, that's the thing he clearly values.
Starting point is 00:52:11 He clearly values it. And two, I don't understand how he just blatantly didn't want to consider anything outside of his... No. I'm going to connect this country, but I'm not going to have an office in this country where there are people who can deal with, say, the spread of violent and racist propaganda. That would reduce my stock value because it's more expensive. It's baffling.
Starting point is 00:52:35 And I don't know if "cautionary tale" is correct; it's kind of just, like, a diagnosis of a problem. It's a problem and it needs to be fixed. Facebook is here, you're not going to stop people from connecting on social media. He needs to be removed. Right. He needs to be gone.
Starting point is 00:53:00 There needs to be, like, a different set of values for what our online space is going to look like and how the tool is going to be used. If you have to put a guy in charge, pick someone like Hamdi Ulukaya, the Chobani CEO, who was a refugee as a kid and understands the dangers of hateful rhetoric spreading like wildfire. It's just, it's sad. It's sad and also speaks to the problem when power is hand in hand with money because this will always be a problem when that's the case. And Mark will always have power. Because he will always be one of the richest people in the world and he will do God knows what with it next.
Starting point is 00:53:31 God knows what. God knows what. But I'll bet you he doesn't think it through. I bet he doesn't either. I bet he's not going to do like what LeBron James did and just start good free schools for poor kids. Hell no. Even though he could.
Starting point is 00:53:45 He could do hundreds of them. There could be Facebook schools all over, but instead there is a Facebook hospital that does not accept anybody's insurance. But does it accept Bitcoin? You goddamn bet it does. You're goddamn right it does. It sure does. Yeah.
Starting point is 00:54:00 You know, also a good time to point out that he doesn't have a charity. He has an LLC. Everything is for profit. But he's giving away 99% of his wealth. Yes, sure. Yeah. No. Are you telling me that when he said he was making a charity, he actually built a perpetual
Starting point is 00:54:14 money machine for himself and his family? It is, like, what I'm saying. That sounds like Mark. Yeah. That's what I'm saying. Oh, Zuckie Zuck. Oh, man. You crazy fuck.
Starting point is 00:54:24 It's. Yeah. Wow. It also, it's fascinating, you know, watching all these interviews of Zuckerberg, how many times he cites his origin story as portrayed by. Yeah. Yeah. Even though he says he doesn't like it.
Starting point is 00:54:39 Yeah. Yeah. He has really glommed onto that idea of himself. And there's. Again, just a snake eating its own tail. Yeah. He believes the mythos of his own genius. The Mark, the Zuck, Zuck man with a samurai sword in one hand and a dead Burmese baby
Starting point is 00:54:55 in the other. Yeah. That's how we should start picturing Zuckerberg. That's the shirt. That's the shirt. That's the shirt. That's the shirt. Sophie looks like she thinks this is a terrible idea and it may be, it may be.
Starting point is 00:55:10 But let's do it anyway. Let's do it anyway. My income is no longer tied to Facebook. Fuck it. Oh, boy. Maggie, do you have any research that you didn't get to in this podcast? Because I know you did a bunch. You know, normally our guests come in cold.
Starting point is 00:55:24 Yeah, I did. Which allows me to have the illusion of being smart. Right. Well, again, I think we did basically cover everything. Yeah. The idea that he buys into his mythos. When there was talk of him running for president, I... That motherfucker.
Starting point is 00:55:38 That motherfucker. He was 100% running off of the reputation that the film and the book gave him. Yeah. None of his own work or the facts. And yes, this is a personal opinion. Yeah. I have no problem with that picture of Trump in a truck. Uh-huh.
Starting point is 00:55:55 I have a huge problem with that thing of Mark on a tractor. Because with Trump in the truck, he's clearly having a great time. He got a chance to be behind the wheel of a big truck. Of course. Why not? That's the most human thing he's ever done. He's like, yeah, I want to be that one fuck with this big truck. It is the most human thing he's ever done.
Starting point is 00:56:07 Fine. Yeah. Mark looks like a fucking, ah, this is how you humans make money. Yeah. Yes. Yeah. Um, and it's also crazy, he wanted to run for president. And yet when you hear him talk.
Starting point is 00:56:20 Oh, he's a void of charisma. Whoa. It is just a whole. Yeah. Look at him on that tractor. You should look back and fall out. Back as straight as a rod. That should be on the T-shirt.
Starting point is 00:56:30 Yeah. Holding up a pole. Yeah. Oh. Oh. It's just, it's wild. The way that he was able to, um, adopt his own, I don't know, he became a Greek legend in his own mind.
Starting point is 00:56:46 Um, and so literally nothing will stop him. He will keep stealing and keep breaking until there are laws in place, which again, their company, every way they word their responses is basically a warning. Like, look, these are problems that we like may have, could have fixed, but one, first of all, like who could have seen it coming? Who could have seen it coming? Yeah. Like who?
Starting point is 00:57:11 And then secondly, um, basically we just like don't have a lot of resources. So like you better not try to like regulate us or like we, more people will die. We don't need to be regulated. Yeah. Why would that be necessary with the utility, which we're not. Right. Even though people use us in their daily life and we're indispensable now. Right.
Starting point is 00:57:32 So we can't be regulated or controlled in any way, shape or form. Yeah. It's a problematic mindset that comes from his background and who he is and his early success. And, you know, the way that he talked about women being compared to farm animals, what are people signing up for Facebook, but more farm animals, more dumb in his mind, people giving us their data to hand off to others for profit. For profit.
Starting point is 00:58:00 Um, and. And for the sake of completion, which I do think is a big factor. He just wants, he wants, he wants six billion. He wants six billion. Yeah. He wants all of it. He wants, you know, when we tried to take over America, what is that called, the expansion, or, you know.
Starting point is 00:58:14 Manifest destiny. Manifest destiny. Yeah. You're right. You're right. This is that exact attitude. Yeah. It's manifest destiny.
Starting point is 00:58:22 Uh-huh. For the internet and people's data. Yeah. That's a really good comparison to draw. Yeah. Yeah. I mean, we were joking a little bit earlier about, you know, people who have drug problems and then sober up and do terrible things.
Starting point is 00:58:35 Right. I think the answer is. Uh-huh. Really, really rich kids. Yeah. Should be encouraged. Just no, don't go, don't start an app. Don't start a company.
Starting point is 00:58:43 Just do it. Your dad's got all the money. Just do a shitload of drugs. Mark, it's fine. It's fine. It's fine. Listen, be lazy and be by yourself and. Let some kid who's been at the other end of a hate mob and survived start the social
Starting point is 00:58:54 network and be like, yeah, but we got to make sure it can't be used to do this. Yeah. Because that's terrible. I know how bad that is. Yeah. Um, I mean, again, it's just like another one in a long line of why like diversity will make your company better. Not just make it better.
Starting point is 00:59:12 It'll stop it from committing horrific crimes against humanity, or ruining itself and its brand by contributing to horrible crimes against humanity. Great. Well, Maggie, as we close this out, I'm going to propose something. We have these, these two Doritos, which are merged in the Dorito equivalent of coitus. I'm going to grab one, you grab the other, and we'll pull them apart and we'll have these last Doritos. Ready?
Starting point is 00:59:39 Here we go. Okay. All right. Mmm. Mmm. Mmm. I feel like smoking a cigarette. That was three episodes of a lot of stuff and two Doritos
Starting point is 00:59:54 with a lot of love in them. Because Doritos are love, my friends. Maggie, you're going to plug your plugables before we close this episode out. I do. Well, first I will say, since the Loftus left, the Loftus left for her new life with the Winklevosses. With the Winklevosses. So do make sure to follow her on Twitter, Jamie Loftus Help.
Starting point is 01:00:15 Jamie Loftus Help. You got it. Uh-huh. And Jamie Christ Superstar. Jamie Christ Superstar. Insta. Please. And also check out her podcast, The Bechdel Cast.
Starting point is 01:00:26 If you ever see her in public, don't say anything to her, but hand her a single orange and then wordlessly walk away. Oh my God. She'll love it. That's so Godfather. She's going to be worried she's going to be assassinated for the rest of the day. Well, she'll listen to the episode probably. And that ruins the joke.
Starting point is 01:00:42 That's true. It wasn't a nice joke. Do that or break a lightbulb. Or break a lightbulb. She'll understand. See, this is why you need a diverse crew around you. Otherwise, I'd just be a ruinous lightbulb-chucking, orange-throwing wreck. There you go.
Starting point is 01:00:56 You're a villain. Yeah. I would also be terrible without the people around me. I'm sure of it. We all are. That's why society exists. That's right. Because we're gross monsters on our own.
Starting point is 01:01:06 Yeah. We need to help each other. Be better. Yeah. Yeah. So just follow her. Follow me on Twitter at Maggie Mae Fish and Instagram. You can find my video essays on film and cultural phenomena on YouTube at Maggie
Starting point is 01:01:23 Mae Fish. I'm including a really fantastic one on David Fincher's Opie fucking Fight Club, which is great. It's great. Also, when I was rewatching The Social Network, I noticed a lot of heavy lifting off of Fight Club. Yeah. I will say, much as Zuckerberg stole a lot of his data and ideas, Fincher stole a lot of his ideas from lesser known French and Russian filmmakers.
Starting point is 01:01:49 Well, who among us hasn't stolen from a Russian? Hey, that's true. That's why they're so angry. I mean, that's why they're so mad. Sophie's admitting to stealing from many Russians. I don't know where your dog came from. Genetically, I don't know. That's true.
Starting point is 01:02:02 It's entirely possible. It came from some wolf on the Siberian steppes. Ooh, could be. No way to know. I mean, probably a way to know, but I don't know. I don't know. And I'm Robert Evans, the host of Behind the Bastards, which you can find on the internet at BehindTheBastards.com, where there will be the copious source list for this episode,
Starting point is 01:02:17 all these episodes, and you can find us on the Gram, courtesy of Mark Zuckerberg, at @BastardsPod. You can find us on Twitter, courtesy of another creepy guy, at @BastardsPod, another creepy guy who seems to have a real preference for Nazis. And you can find my book, A Brief History of Vice, on Amazon, and there's a Bezos episode coming. Oh my God, is there? I mean, how could there not be? How could there not be?
Starting point is 01:02:45 I'm very excited. I'm very excited about his divorce. I have whatever. I'm glad she's getting a bunch of money. I mean, that's why I'm excited. She might become the richest woman in the world from the divorce. I hope he recognizes, you got 130 billion.
Starting point is 01:02:58 Why fight? 65 billion and 130 billion are the same amount of money. Maybe. Maybe. Maybe. I don't know the man. Anyway. I know the man.
Starting point is 01:03:11 Yeah, we all kind of do, don't we? We know him. Let's not. We know him. We know him. I do love, one of my favorite things is to look at pictures of Silicon Valley billionaires back before they were billionaires, like Elon Musk before he got money. That's awesome.
Starting point is 01:03:26 It's so cute. Oh, Sophie's pulling it up. He's a totally different man. He looks like Dana Carvey in that movie where Dana Carvey played a turtle. Oh my God. Oh my God.
Starting point is 01:03:39 Oh my God. Oh. Oh, that's a good note to end on. A little bit of joy. Look up Elon Musk before he got rich and just have you a good time. Until next week, I'm Robert Evans and I love roughly 40 percent of you.
Starting point is 01:04:37 What if I told you that much of the forensic science you see on shows like CSI isn't based on actual science, and the wrongly convicted pay a horrific price? Two death sentences and a life without parole. My youngest, I was incarcerated two days after her first birthday. Listen to CSI on Trial on the iHeart Radio app, Apple Podcasts, or wherever you get your podcasts.
