Offline with Jon Favreau - The Inside Story of How Silicon Valley Rewired Our Brains

Episode Date: September 4, 2022

For a transcript of this episode, please email transcripts@crooked.com and include the name of the podcast. ...

Transcript
Starting point is 00:00:00 social media's drug-like effect is on your social instincts and your social behavior, which is not something that we are used to recognizing as a drug-like effect and changing how we think about right or wrong and changing how we think about our identity. That's not something that we normally associate with that. So we don't spot it, but it is a drug and it's one that we're taking. I think it's like 80% of Americans are taking it an average of about a dozen times a day. I think that's the median number. And if you're politically engaged like us, it's sometimes several dozen. And if you're a young person and your social needs are a lot higher, it's probably
Starting point is 00:00:33 also several dozen. So we are effectively living in a world, and it sounds crazy when you put it this way, but it is really true, where 80% of the population is taking a mood-altering, behavior-altering drug at least a dozen times a day, sometimes several dozen times a day. And when you know that, you're like, no wonder. Right. No wonder we're all crazy. No wonder the world is like this. Right. I'm Jon Favreau.
Starting point is 00:00:58 Welcome to Offline. Hey, everyone. My guest today is Max Fisher, reporter and columnist at The New York Times and author of the new book, The Chaos Machine. If you've been listening to the show, Max's book was written for you. He interviews many of our same guests, Rene D'Aresta, Ezra Klein, Dr. Vivek Murthy, and he covers just about everything we have, but with the depth and scrutiny that can only come from years of slow, methodical reporting. Thank you. and executives inside these companies to paint an honest and incredibly damning picture of the current state of social media. From the creation of the Facebook news feed, to Gamergate, to the election of Donald Trump, he traces the origins of our current political shitshow to many of the internet's most consequential moments. He argues, quite persuasively, that it's not just social
Starting point is 00:02:01 media algorithms that are the problem, but the fundamental design of the platforms themselves, which have literally rewired our brains. Extremism isn't just amplified, but actually created by social media, which Max concludes may be the most destructive force in society today. I realize this is all pretty grim, but the conversation Max and I had here in our studio is one of my all-time favorites. He answered so many of the questions I had
Starting point is 00:02:29 when I first created Offline. And while his answers don't make me much more hopeful that we can actually regulate social media, they did help me understand some of the ways that all of us can reclaim some agency back from these platforms and restore a little sanity to our lives. As always, if you have any questions, comments, or concerns, please email us at offline at
Starting point is 00:02:48 crooked.com. And please take the time to rate, review, and share the show. Here's Max Fisher. Max Fisher, welcome to Offline. Thank you so much for having me. So it's hard to know where to begin because I've been doing the show for a year and you cover just about everything we've talked about in this fantastic book of yours. A lot of the characters have been guests on this show. But I think what was really interesting is how much time you spent with people who have worked at or in some cases helped run companies like Facebook, Twitter, YouTube, Reddit.
Starting point is 00:03:26 And you basically conclude that social media platforms are one of the most destructive forces in society today, if not the most destructive force. And this is key. That's not primarily because of their users, but because of their design. Can you talk a little bit about what led you to that conclusion? So the book started for me as I think the same question that you're kind of circling around with the podcast, which is how is social media changing us? What is it doing to our behavior, to our cognition? How does that change our politics? What are the consequences of that? The fact that it's changing so many people. And it was something that had started for me, I think like a lot of people, like in the back of my head after the 2016 election, when I had this kind of fuzzy sense that social media had done something to play a role in the Trump phenomenon, but I like could
Starting point is 00:04:18 not tell you what it's like, it's exacerbating polarization somehow, or there's like weird extremism on it, or there's this hatred of minorities on the platform that seem to align with Trumpism and these troll cultures. But I would have told you at this point, and most people in Silicon Valley, I think, would have told you, even the people who later, like you said, the kind of dissidents who came around and talked to me for the book, would have said that the platforms are just a neutral vessel, like at most a neutral amplifier for the things that are out there in the world anyway. And that started to change for me and then become something that feels like, okay, maybe I should actually start taking this a little more seriously a year after Trump's
Starting point is 00:04:54 election with the genocide in Myanmar, which of course is this horrible explosion, this very sudden explosion of violence in this country that was partly state led, but was also partly communal and bottom up against this minority in the country. And I was there reporting on it, and I was not thinking about social media because I thought it's just a website, it's just an app, like what effect could it possibly have? But if you were there when this was all happening, it was really obvious that social media was playing some kind of an enormous role, not just in what it was putting in front of people and not just the extremists that it was platforming and what it was spreading, but something about the way it was pulling people in and making them active participants and really making them feel like
Starting point is 00:05:34 they wanted to engage in all of this hate that was happening online up to and including the point of acting on that out in the real world. And you would hear it from everybody. And eventually, even the United Nations came out and said, this still blows my mind that it reached this point. They said that Facebook played a, quote, determining role in the genocide in Myanmar. Even then, I still did not really take it fully seriously enough until a couple of months after that. I started noticing that basically everywhere I went to report on other things, I would hear really similar stories, similar stories to the Trump phenomenon or to the Myanmar genocide that all seemed to trace back to Facebook. And I was hearing it from people I was talking to, activists
Starting point is 00:06:15 in all sorts of countries, rights groups. And it was this really eerily similar pattern that seemed to be playing out everywhere. And that was when I first started to think, and this is like early 2018. So I think there's been a lot of people who have come on your show and talk to you started to think there is a clear and consistent pattern in what social media brings out in us as individuals and as societies. There's a clear and consistent pattern in what it does and how it does that. And probably things like the Trump phenomenon, the Myanmar genocide, if it can reach that point are just the tip of the iceberg. And it probably has all these other effects that we're not even aware of because they're not as obvious. And so that was when I started to think, nobody really knows the answer to this, to what
Starting point is 00:07:00 social media is doing. And it feels really important. So that was when I started trying to figure it out over the next four or five years, partly by doing the kind of, you know, traditional New York Times on the ground reporting, where you'd find some blow up and then try to retrace step by step, post by post, how it happened, what role had social media played for people for the process there, but also working a lot with folks in Silicon Valley who were starting to sound the alarm and also a lot of experts outside of it who were reaching the same conclusion that I was, that I'm sure you were, that a lot of people were,
Starting point is 00:07:32 that social media was playing some kind of role in trying to understand it as neuroscientists who were trying to examine its effect on our brain chemistry, social psychologists who were trying to understand, okay, these social platforms are interrupting the process by which we figure out right and wrong. And what effect does that have on our sense of morality, on our sense of what is right and wrong? Political scientists who were studying, system analysts who were trying to figure out what the algorithms actually do, and tried to, as best I could, pull all of that together in a narrative that would try to answer that kind of question of what it's doing to us.
Starting point is 00:08:08 Man, if a genocide and Trumpism are just the tip of the iceberg, we're pretty fucked. Right. I mean, so you write about how most of the people who work at these companies are very smart, well-meaning, aware of the problems that are out there but you write that walking into facebook was like walking into a cigarette factory and having executives tell you that they couldn't understand why people kept complaining about the health impacts of the little cardboard boxes that they sold why do you think these very smart people yeah couldn't understand that the real problem with these platforms,
Starting point is 00:08:49 which was the design of the platform themselves, the fundamental design, not like we can make a little tweak here, we're going to do this, we're going to moderate this content, but the fundamental design was the problem. Why was that invisible to them, do you think? So it blew my mind to encounter that because it was, as you say, it was a lot of really smart people. And it was a lot of people who I knew from my time in DC and probably people who you knew who were like really smart, really rigorous minded experts in their field, policy professionals who seem to have this like cognitive wall where they would give you like, I don't think that they
Starting point is 00:09:24 were lying. And I don't think that they were trying to spin me because I would ask other kinds of tough questions. And they would say like, you know, we don't know, or this is a really tough policy area to figure out. But when it would be that specific question of anything premised in the idea that social media is actively manipulating us and changing us, that wall that would go up, they couldn't acknowledge it. I think partly it's just like basic human emotional self-protection. You don't want to believe that you're participating in the new cigarettes of our time. Partly it's a lot of Silicon Valley ideology that I think has faded in the years since among the rank and file that says that it's actually good for us to rewire humanity because we are like advancing
Starting point is 00:10:11 the species into the next stage of human evolution, which is a thing that a lot of people there really sincerely believe. And a lot of it is just is financial interest. These companies pay really, really well. And if you want to get through the day at your job there, you don't want to believe that you are, even if your role at the company is something that is positive, even if your role at the company is mitigating harms, there's a lot of the people I talked to, that was their job. It wasn't juicing the algorithm. It was trying to like reduce terrorist recruitment on it. If you're participating on that, you don't want to believe that there's something about your work that is kind of fundamentally damaging in, which is like, okay, anytime you're going to connect this many people, this is now the new public square.
Starting point is 00:11:15 This is how we all interact. It's going to bring a bunch of people together. And some of those people are going to be extremists. They're going to be bad people, just like they are in life. We all have good and bad parts to us. And so some of that's going to come out on social media. Some of it's not. And I remember, I think at one point in the book, I think you said it was a London-based executive at Facebook who was like, look, all the bad things in the world now happen on a mobile phone should we get rid of mobile phones yeah and i think that's bullshit now because i've been doing this series get it but if you're just listening to that you think well that's that's a point should we just get rid of technology and not communicate with each other and not connect with each other just because some people are being bad actors here right it makes a certain intuitive sense and that was like i said that was how i thought
Starting point is 00:12:03 initially it's like look this is just a forum It's just a place where they're gathering people. So the people bring to it, whatever's there, but the really important thing that we know now, and frankly, the people who work at these companies, they have the evidence to know it now too, is that the platforms are not actually showing you what your community and your peers and your friends and your family think. That's what it looks like. It is not real life. Right. It's not, it's not, this is the back to this. Right. It's not, it's not real life. And not only that, but I mean, it's, it's very tricky because you look at it and you see posts from real people and maybe they're
Starting point is 00:12:38 people, you know, or maybe they're just people who you were aware of. They're in your broader, you know, they're part of democratic politics. they're part of journalism, they're just part of the like online community. And you think that's the world being reflected through the platform. So what you are actually seeing are the choices made by the systems on those platforms, they're showing you a tiny percentage of what's on there, that these very powerful algorithms have selected and ordered in a specific way to, and this is something they do very deliberately, to manipulate your emotions and to manipulate your cognition to get you to spend more time on the platforms. And they are presented in a way with likes and retweets and these little
Starting point is 00:13:15 counters and these little buttons that are meant to set off certain like monkey brain instincts in you that will keep you clicking. So what you are seeing is in fact the choices, the preferences, the desires of these very sophisticated systems that want just to keep you on the platforms as long as possible. Well, I think that the most persuasive arguments in your book are about how these platforms actually change human behavior. And you trace the origins of so many social media problems to the creation of Facebook's newsfeed. Not just Facebook, but the newsfeed specifically in September of 2006, which you call Silicon Valley's monolith moment. What do you mean by that? My God, you really did read the book. It's really nice. I really appreciate that. I was like, this book is written for me.
Starting point is 00:14:05 It's written for you. And the listeners of this series, for sure. Maybe for other people, but you're definitely on a very short list who it was written specifically for. So that was, yes, that was a moment that really blew my mind. And I'll tell you why it's important. But it also was like, it was very important to me in the book to not get to the algorithms, which I know I'm talking a lot about and bringing up because they're so powerful, until like halfway through the book, because it's easy to think, oh, it's just the algorithm.
Starting point is 00:14:31 So if we fix the algorithms, the problem will go away. And a lot of it is hardcoded and much more basic elements of the platform, like the newsfeed, which basically originated modern social media, as we know. So before then, you go onto your Facebook page and it looked like a MySpace page. It was just your profile, your friend's profile, maybe some posts on it and you would kind of interact with it. The newsfeed was the first time that you would go onto the Facebook homepage and it would show you a ranked and sorted list of everything that had happened on the platform that you were in some way connected with or maybe not connected with and the platform just thought you might want to know about. And the idea was it would be like this party that would
Starting point is 00:15:08 never end that all of your friends were in. But some people hated it because it was also like any Twitch, any like post that you made, any like, any group that you joined would suddenly show up in everyone's feed. So it was like this all seeing eye that maybe people didn't want to be a part of. And as happens on the internet, anytime anything happens of any sort, some people got really mad. And they posted about it in these groups that if you were online at the time, I was in college, I'm sure you remember called like against Facebook or like fuck newsfeed or like get rid of newsfeed or fuck you, Mark Zuckerberg. And these groups were some of the very, very first things, ironically, because they're against the company that created this, that went like mega viral. Because what happened is anytime someone would join a group,
Starting point is 00:15:50 it would pop up on the feeds of all of their friends. And it wasn't just that they would see a lot of it. It was that it tapped into this very specific emotion that Mark Zuckerberg didn't know this, but we now know this, is basically the most powerful emotion on the internet to engage your attention, which is moral outrage. A lot of that. Yeah. You may have seen it occasionally. It will appear on Twitter. There's a lot of righteousness everywhere.
Starting point is 00:16:16 There's a lot of rights. in the book, and I promise I'll come back and finish the story, where someone tracked every different kind of word that you could put into a tweet by its emotional valence. Basically, if it was angry, sad, happy, left-leaning, right-leaning, and what that would do to engagement. And for the most part, most kinds of sentiment were neutral on engagement. But if you had one word in a tweet that was called a moral emotional word, which basically means moral outrage. And moral outrage is anything that is, it's not just anger, but it's outrage against a social transgression. It's like calling someone out for cutting line in the bus or cutting line to get onto the bus, I guess. I know how buses work.
Starting point is 00:17:03 That if there was one word, it would increase engagement by 20% for every single word that was in it. And that is something that, again, to take it back to the 2016 election, Trump tweets have a lot of moral outrage in them, which is like, how dare Latinos? How dare Muslims? The Democrats are awful liars. And Hillary Clinton did not use moral emotional words because she was about togetherness, lifting us up, whatever. And that was not something that would go viral. So against Facebook, Facebook is spying on you. These are moral emotional words.
Starting point is 00:17:35 So they would just shoot off and go crazy in terms of the engagement that they provoked on News Feed. If you saw it, you became much likelier to click the little like button on it. And if you click the little like button, maybe you're not really outraged. Maybe you're just like, yeah, sure. I guess I agree with that. But once that gets cycled through all your friends' news feeds and they see, wow, hundreds of people, everybody I know, they're all outraged about that. Not only do you feel a compulsion to go along and to click it too, but the way your brain processes these emotions is so powerful that even if you don't really care about Noose Feed, you will yourself internally feel outraged. Your brain basically tricks yourself into agreeing with
Starting point is 00:18:15 your friends if they feel morally outraged into feeling the same and to joining in. And these posts started completely overrunning Facebook, tens of thousands of shares at a time when that was a crazy number. You say like 2008, something like that? Yeah, 2006. 2006, yeah. You wrote the book, right? Yeah. I wrote right here, 2006. I was like, I don't, I was still not a Facebook person back then. Are you really? Yeah, well, it was, you made a wide choice.
Starting point is 00:18:44 I'm an elder millennial so i was i graduated 2003 from college you were you were on myspace just i don't even think i was on myspace really i was like no i was using instant messenger in college to communicate okay i i had a bit so you do know how to use a computer yeah yeah i did i had that basic okay okay so you're competent so this as this is like a pattern that is now very familiar. All of this online outrage that initially was actually a very small proportion of users and for a lot of users was like superficial, was not real, manifested into real world activity and real mobs of real people gathered outside the Facebook headquarters to say,
Starting point is 00:19:21 take down News Feed, which is this incredible demonstration. Deep irony. Right. Yeah. I mean, first of all, it's a deep irony and it's a demonstration of this power. And initially people at Facebook were like, oh my God, we have to turn this off. But Mark Zuckerberg, who is smart in a very narrow and specific set of ways, said, wait a second, this is actually a demonstration of the incredible power of our systems to create all of this engagement. And he was right. And they've really leaned into that development. And then all the other platforms copied it. Every platform that didn't have it completely died out.
Starting point is 00:19:55 And that was basically the genesis of social media as these mega companies that are now among, or if not the largest corporations on earth. I think it's a very important dynamic that we can all relate to, though, especially now. When you see that something's a thing online that everyone's outraged about, it's repetition, right? If you're seeing it enough, if you're seeing enough people angry about it, first of all, your perception is that there's a ton of people angry about it, even if that's not the truth, because social media just amplifies all the people who are angry about it first of all your perception is that there's a ton of people angry about it even if that's not the truth because social media just amplifies all the people who are angry about it even if they're a minority and you either agree with it or if you don't agree with it you still stop and think to yourself maybe there's some truth to that there may be because there's so many people who are mad and i'm seeing it everywhere so it must be a little that's that's
Starting point is 00:20:44 one of the big dynamics. Right. There's definitely, and there's two studies that come to mind for that. There's one that's not social media specific. It's this famous study where they were looking for or identifying this exact like cognitive loophole where you're talking about,
Starting point is 00:20:56 where you see something repeated a lot, especially if you see it repeated by your peers, it feels internally true to you where they would repeat this phrase to people over and over again, the internal body temperature of a chicken. And if they repeated it enough times, they could say any number at the end of it, they could say the internal body temperature of a chicken is 300 degrees. And people would say, yes, that's true. I think that that's accurate. Wild.
Starting point is 00:21:20 Yeah. We're not actually that good at processing, especially if it's in a social context. We are really bad at processing information. Especially if there's this much information coming at us all the time, like every single day from a million different sources, which is the problem now. Right. And all of it cycled through this feeling of this mob gathering to shout with this one unified know, rumor, this thing that we're all outraged about. And the other study that I wanted to mention that comes to mind for me a lot is the, this is an extension of that moral outrage study about, you know, tweets with moral outrage travel further. They ran a version of this where they would, they would pull people to see like how outraged are you at the other political party? And people who had low levels of emotional outrage
Starting point is 00:22:07 towards the other party, for whatever reason, maybe they're just not an outraged person, or maybe they're not politically engaged. If they would send a fake tweet on this fake Twitter platform that they set up for the experiment that expressed outrage towards the other party, and it received a lot of engagement, they would want to post more because they were internalizing that reward. And very quickly after doing that just a few times, their internal sense of outrage towards the other party would have shot way, way up. Their underlying nature would have changed. And the same is true. It's easy to hear that with party and think like, well, maybe that's not so bad because the stakes of our politics are really high right now. So, you know, I'm sympathetic to partisan outrage, but that also applies to any sort of in-grouping, out-grouping, for example, religion or race.
Starting point is 00:22:57 You make the argument that I've heard from so many offline guests, which is that the single most powerful force in social media is identity. Yeah. Which I think is so important because it gets mixed up now in our politics. Well, it's very connected now to our politics, to extremism, to everything else. Can you talk about why that is and why that's become so harmful? Because, you know, I think the perception is like why should identity be harmful right like it's it allows us to express ourselves to uh for to talk to people like us to connect with people who are like us like why has identity become such a divisive and destructive force in social media right and it's a good point that identity in itself is not only is it not a bad
Starting point is 00:23:40 thing but it's important basically for our emotional well-being. We need a sense of identity and a sense of community to just cope with the world day to day. The reason that the refraction of identity through social media has been so damaging is twofold. I'm going to try to remember the second half while I tell you about the first. The first is that just like with moral outrage, the form of identity engagement that really works on social media and therefore that the platforms, and this is all of the platforms, will push over and over again is something called either outgrouping or identity threat. And it's this idea that your in-group, and that might be your political party, that might be race, religion, it might be something smaller like your local community, or it might be moms who are concerned about the health of their kids, is under threat from some
Starting point is 00:24:30 external outgroup that is coming to get us. And that's really powerful because it taps into these deep evolutionarily ingrained instincts that we have as these like tribes that we evolved in to try to defend ourselves against external threats. It's like this alarm bell that goes off in our head. So when that gets activated, it makes us really, really engaged with the platforms. It makes us want to spend a lot of time on it, raising the alarm about the threat, organizing against it. So that is the form of identity time and time again that you see get amplified. And again, with political party, it's easy to kind of dismiss that because, you know, our politics are like that anyway, so it's trickling down. But you see other forms of that, like one case that I spent a lot of time in the book, because it was this
Starting point is 00:25:13 kind of like early case that I think showed us a lot of what was coming was anti-vaccine sentiment, where a form of identity that is really prevalent but is not particularly engaging online because it's not threatened it doesn't have that sense of outgrouping and identity threat is just um moms moms of young kids like that was a community that was online early and you talked to renee di resta so you know about this yeah a community that was really online really early because you're on message boards you're on facebook trying to find tips for raising your young child or your infant. And what the platforms learned, and by this, I mean the systems, not the people working there because they had no idea what their systems were doing and they still don't really know.
Starting point is 00:25:54 The systems learned is that if you're a mom and you search for parenting tips or child health, and it shows you some basic tips, you're not going to spend very much time online. But if it shows you a conspiracy theory that says that the vaccines that doctors are giving you for your kids are going to spread autism, are going to actually give them disease instead of preventing it, that that is very engaging,
Starting point is 00:26:20 not just because it's a scary conspiracy, but because it creates this sense of identity threat that we moms collectively are under siege by, you know, Soros, vaccines, doctors, Bill Gates, whatever it was. And it sounds crazy that that would spread very widely. land the platforms because they become very sophisticated at inching people into it, at kind of recommending in small incremental versions of it until you get this much larger community of not just moms, but anti-vaxxers. And of course- So it starts with just asking questions and- Exactly. You know, maybe vaccines are fine, but maybe did you see this one study where blah, blah, blah,
Starting point is 00:27:01 and then it gets you into the next and then it gets you into the next. Yeah, exactly. And it's questions and we're not sure. And that's really engaging, because if you're trying to figure out, well, are vaccines safe? That's a reason to spend a lot more time online, which is how the platforms learn to push it. So that's one reason that identity threat has been something that has been really damaging on the platforms, or that the platforms have used a really damaging effect because it galvanizes these kind of identity groups against these causes that might not be real or might be very damaging for us socially. Look, I think one challenge with sort of persuading most people that this is what's going on is a lot of people
Starting point is 00:27:36 think, okay, some of these extremists are just easily brainwashed people. They're idiots. I can't be brainwashed like this. I'm one of the smart ones i'm one of the smart ones so the anti-vaxxers something's got to be going on there right i would never do that but i think even like finding information about the pandemic over the last two years absolutely and i am someone who like covers this i want we did our pods of america and i'm also like a hypochondriac by nature so i was a little nervous about it i would like dug deep into covid and what i found over the last two years is like it's just so much easier to find extremes not just on the anti-vaxxer right side but on the
Starting point is 00:28:19 other side too um that like it's really tough to even if you're digging into these platforms even if you're like trying to follow people who have the best credentials anything that is nuanced or complicated you're not going to find if that's the answer if the answer is any kind of nuance or complication you're not going to be able to find it you only find the extremes you find the extremes and you find blame especially and sometimes that's appropriate. Sometimes that's right. And sometimes social media can be really valuable. Which makes it even more difficult. that we, even those of us who might be the kind of person who would pick up a book on social media and think of ourselves as kind of above it, are affected by it, but in degrees and in ways that might not be as obvious because we're not out there denying vaccines for our kids. There's one
Starting point is 00:29:15 study that I write about where they asked people in two groups, one group keep using social media and another group, they said, deactivate Facebook, not all social media, just the Facebook app on your phone for four weeks. And it was really hard to get them to do that because we don't want to turn it off. We have a hard time turning it off. But when they got that group, this really large group to turn it off, and they would kind of over those four weeks study what was happening with them. The one thing that is not going to be surprising is they reported much higher life satisfaction at the end. They were just happier because they weren't spending time on these platforms that do, in fact, make us miserable. And we all kind of know that it makes us miserable.
Starting point is 00:29:50 And, in fact, they found that the increase in happiness and life satisfaction was equivalent to about a third the effect of going to therapy. Wow. Which I know. I know. So if you're spending money on a therapist, it's a lot cheaper than going to a therapist. Yeah. But the other thing they found that I felt really important to me was that people who deactivated Facebook, whatever their political affiliation, completely regardless of it, they were significantly less polarized in terms of how they viewed, not necessarily overall, because just four weeks, but any issues that had come up in those four weeks weeks any issues that have become salient in the news or kind of salient in society they were less polarized than people stayed on facebook equivalent to 50 percent of the overall increase in polarization in american life over the past 25 years so it's funny i do um there's
Starting point is 00:30:43 another podcast i do called The Wilderness, which is about the Democratic Party. And so I just finished a bunch of focus groups. And I specifically wanted to talk to voters who turned out in 2020, but do not follow politics closely, which by the way, is most voters. So not just most Americans, but of the voting population it's like 20 of the people who are like us who are following politics super closely 80 or not and people like us who follow politics closely would look at these voters and be like oh why are they all so moderate seeming but part of it is that they don't pay attention as closely to stuff and so that they they have strong opinions on issues some of them like believe strongly and in progressive ideals and talk that way but they don't talk in a way that we all talk
Starting point is 00:31:31 and they're in that in that polarized way because they're not following it as closely but conversely their views of politics are those people are all fucking crazy in politics because they're all yelling at each other all the time and they're screaming on Facebook and there's less trust in institutions because what they're getting reflected from the media and politics is a bunch of people who are extremely polarized and yelling online all the time. That's a great point. It's the political class is also the very online class at this point. And when people outside of it look at that, they're partly they're seeing the nature of the political class, but they're they're seeing the nature of the political class, but they're also just seeing the nature
Starting point is 00:32:06 of what it is to be very online and very engaged. And it's a reminder, I think, that so much of our politics now are refracted through social media, even though obviously there are a lot of other venues for it, that there's so much political activity now that happens online. Even if it's just people like us,
Starting point is 00:32:23 the political class being online, it affects you so strongly that that filters through the entire system. Yeah. You talked about extremism and I thought it was fascinating how you mentioned Gamergate and you basically draw like a line from Gamergate to MAGA to January 6th. Can you sort of like trace that that like the rise of the alt right and why it started with Gamergate and just for people who might be listening who don't know what Gamergate is can you just talk about that a little first of all if you don't know Gamergate bless you turn off the podcast right now
Starting point is 00:32:56 I don't honestly like until I started the series I didn't really know Gamergate and it wasn't till I dug into your book that I'm like okay now I somehow in 2014 I missed Gamergate good for me it was it did actually it came at this kind of fascinating moment was right at the end of the era when there was like just the internet when like I dismissed at the time and thought like this is just it's just people online it's just like crazy nerds who were just going nuts And people who were affected by it were trying to warn us. And in some cases, trying to warn me specifically, this is about a much bigger shift in our politics. So what Gamergate was, was 2014, right?
Starting point is 00:33:36 Yeah. Okay. An explosion of this conspiracy theory whose origins are very convoluted and frankly not actually important to where it comes out that uh game developers were in league with the gaming media in a conspiracy it's ridiculous i know and it's completely as most conspiracies are when you explain them yeah yeah right right uh we're in league with uh the gaming industry and gaming media to basically marginalize young white men on behalf of feminism. And they were going to change gaming to take out all traditional masculine portrayals. They were going to make you play games with
Starting point is 00:34:15 women characters and LGBT characters. And this was a, and now it might start to sound a little more familiar. This was a conspiracy by the elite and a war on men to keep down and suppress men. And this is something that started on 4chan and Reddit. And if you remember it, you probably remember it because it culminated in this really horrifying harassment campaign against basically any woman involved at any point in the gaming industry or the gaming press. But what was really significant about it was that it started as this like not so conspiracy, but it got picked up by these platforms, especially Reddit, and especially YouTube, that identified it, their systems identified it as something that was going to be really, really powerful at hooking people in because of this idea of identity threat, and cultivating this sense of identity that hadn't really existed before of you're a gamer and that's not just someone who plays video games, but that is who you are. It's your community, your identity, and it is under attack by these nefarious forces. And that
Starting point is 00:35:14 was something that because that's a conspiracy theory that if you could at all relate to that identity was engaging to you, you would spend a lot of time on YouTube and Reddit clicking on it. And that meant that the platforms would push it up more and more. So if you were on these platforms at all, which a lot of people were in 2014, it was something that you would be exposed to over and over again until the point that it would feel real. And that was also something that gave rise to Milo Yiannopoulos and a lot of people who are now quite familiar to us as the alt-right. And in fact, a lot of alt-right and far-right and neo-Nazi websites identify this.
Starting point is 00:35:48 And they said, these are our people. This is recruiting pool for us, unlike we have ever had before. And sure enough, the platforms, their automated systems also made that connection. And they realized that someone who had spent a lot of time in Gamergate, they might also like a video
Starting point is 00:36:02 about how the white race is under threat. And they might also like a video about how the white race is under threat. And they might also like a video about how the Democrats and the feminists are part of a conspiracy to keep down white people and to keep down men that might engage them. Nobody at Silicon Valley wanted to promote the alt-right, but that was what their systems arrived at and cultivated this much larger community that became the kind of the Pepes, you know, the online alt-right that then identified Trump when he was very obscure, when he was not someone who had much of a base out in the world as kind of their guy, because he, like so many things that then dovetail with this movement, spoke to the incentives of the platforms and what rose on them. And so they all got pulled together
Starting point is 00:36:42 into this kind of bigger mishmash identity and community that we're now all living with. Yeah, I think it's obvious now, but I think it's underappreciated how much Trump's personality aligns perfectly with what these platforms incentivize to get people to engage with them. Right. Just perfect. Right. And it's something you see over and over where it's someone who might be kind of obscure, but they just, their personality where, I mean, with Trump, it's outrage, it's provocation, it's insults, and especially it's us versus them. And just the fact that he's just got poster brain where he's just posting constantly and it just, whatever will be like the issue of the day, how can it be a little bit more provocative,
Starting point is 00:37:23 a little more extreme on it? You see in place after place, the people who have this personality will just rise, will just rocket on the platforms. And I think when that first started to happen, it was easy for a lot of us to make the mistake to say, oh, he's a master manipulator of social media. He's got Steve Bannon. He's got the dark arts. Breitbart has the dark arts. They understand how all this stuff works. But it turns out that they were really just passive beneficiaries of the preferences of the platforms who identified that this would be really engaging to other people and would help create these communities that would keep people online and clicking and clicking. Well, I thought one of the most shocking examples of this in the book was about Charlottesville.
Starting point is 00:38:03 Yeah. Which is, again, I think some of the social media companies might say, yeah, sure charlottesville yeah which is again i think some of the social media companies might say yeah sure charlottesville was organized because there were these platforms where like-minded extremists came together and said all right let's all meet up and do this but um you talked to or you interviewed jonas kaiser who's a a Harvard researcher. And he basically comes to the conclusion that Charlottesville was created through the YouTube and Facebook algorithms. Right. Yes. I think it might be, it occurs in like the last third of the book. I think it might in some ways be like the most important thing in the book. And this is, Jonas is kind of the person
Starting point is 00:38:43 who'd like first really pushed this, but other people have since come to it, which is just important to say that there's other supporting evidence for it. And this was the second form of identity threat that I remember I'd wanted to talk about is that the platforms will figure out how to take these pre-existing communities that are not connected to each other. And this is what happened in Charlottesville. We had all of these little right-wing, far-right, neo-Nazi KKK groups that had never really come together. And it started with, in the case of Charlottesville, the Facebook groups recommendation system, where it realized that if you were part of one of these groups, you might be interested in these other groups. And not just that, but if the system routed you through all of these groups
Starting point is 00:39:25 in a specific sequence, and if it pushed you from one to the other frequently enough, it would create an entirely new super community. This like much larger kind of miasma or swirl. And this is also what happened with QAnon. This is basically how QAnon came about. That could then be much larger and then would take on the ideas of all of this. And in this first network analysis that found this, it all centered around this Unite the Right rally that became the Charlottesville rally that brought together all
Starting point is 00:39:53 of these groups. And of course, there are a lot of reasons that the far right was resurgent then. Donald Trump being president was not irrelevant to it. But it was the thing that, like you said, the answer was that, oh, the groups came together on the platforms. They were really brought together by the platforms. And this is something that Jonas found happening in, he looked at three different countries and he found the exact same pattern happening in every single country where it would pull in a lot of communities who were fine, basically, like center right people, people who just like to watch the news online, people who might have interests that were like vaguely aligned with right-wing cultural politics, but were not themselves extremists. And that it would determine
Starting point is 00:40:30 that it could link all of those together to create this much larger community that would get people to watch for seven hours a day instead of 40 minutes a day, because it's exposing them to all these ideas that are more extreme and therefore more engaging. And that it would create this sense of a new community around them. And especially, and it did this over and over again, at the kind of center, the center of gravity of these networks, the places where it would inevitably lead people. And then once there, it would keep showing them more and more content, more and more videos and more and more Facebook groups. It was always the most extreme iteration of it. It was always the extreme far right. If it was health misinformation, it was always extreme vaccine
Starting point is 00:41:05 deniers or people telling you to go out and kill doctors. And with QAnon, it was the thing that brought in all of these right-wing groups who previously never would have believed in this kind of thing or arrived at it around this crazy conspiracy theory that became the uniting force for this entirely new identity and community that was built on and by the platforms. So get to the point where it's like, so what do we do about all this, right? I mean, one of the great frustrations I have had doing this series is that the more I learn about social media uh especially from your book the less i am confident that we can ever effectively govern these platforms even if we had a functioning political system which we of course don't right which we don't and that's not just with regulation
Starting point is 00:42:00 or even if we had uh social media platform companies that wanted to do something about this in in a real systemic way um because it just seems like and you know like i talked to um alex stamos um who worked at facebook and was a security guy at facebook and you know he he sort of pushed back on me when i was like well it's the algorithms and can you tweak the algorithms and i don't think he now that I have read your book, I don't think it was pushing back on me like, it's not the algorithms, they're fine. I think what he was saying is like,
Starting point is 00:42:32 a tweak here and there to the algorithms is not actually going to fix all of these problems because the real answer is they're far more fundamental than just tweaking the algorithms, which these platforms in the last couple years have been, or at least some of them have been trying to do. Sure, yeah. There are people at the companies who are trying,
Starting point is 00:42:49 like you said, to push the algorithms in a better direction, but it's a little bit like saying, well, what if we change the filter on the cigarettes, but we kept trying to sell as many cigarettes as possible? Or like, what if we did a different kind of menthol, but we're also gonna like really ramp up our cigarette advertising, which is the entire idea of the product of maximizing engagement through technology. We just know it just taps into these really deep and really destructive impulses. So I do have an answer for, it's not a super satisfying answer,
Starting point is 00:43:20 but for like, what, what do we do question? Do you want the, do you want the, like what we do as a society or what we do as individuals? I was going to ask you both. You both. So yeah. So perfect. Okay. We'll start with the society one because it's a dream. So we might as well start with the, um, whenever I would ask people who studied this really seriously, what, you know, if you were in charge of Facebook for a day, if you had all the power in the world, what is the change that we make? And it was always some version of turning it off, which I, right, like, good luck with that. Not necessarily the entire platforms. A lot of these people still love social media and are still real believers in its potential for and actual evidence of positive changes brought about by social media. But the engagement maximizing aspects of it, whether it's the algorithms, even Jack Dorsey at Twitter,
Starting point is 00:44:11 he was before he left, he was like, maybe the idea of showing you retweets, a little number at the bottom of a tweet, that might be really destructive. And maybe we need to get rid of that. And that was something that I would hear people say to get rid of. But it was always just these features that are meant to maximize engagement and hook you in. If you just turn those off, and if we go back to that pre-2006 social media, it really is at that point more of a neutral amplifier. Now, how do you actually get that to happen? I don't know. Good luck to you. That'll have to be your next guest. We'll have to answer that. Yeah, no, think that uh it's interesting to imagine like a version of a social media platform that could connect people in a positive way because
Starting point is 00:44:53 like underlying all of this is like there's it's not our fault but there's something there's a weakness in our composition as humans that's allowing this to happen. And so it's a pretty dark view to be like, well, we just all can't be connected in this way or in any big way because we're all just gonna be at each other's throats. And so I always was trying to flip it and think like, is there a way to connect a billion something people around the world in a way that's healthy and positive?
Starting point is 00:45:23 Sure, I mean, that pre-newsfeed social media, people spent a lot less time on it. It was a lot less powerful, but it just, it did not pull out those same destructive forces. It did not hook into those kind of cognitive weak points, those instinctive weak points that you were talking about, and exploit them in the same way because it was something that really was a much more neutral vessel in terms of what it showed you. Maximizing for engagement seems like the original sin. That's right.
Starting point is 00:45:51 So what can we do individually? So this is, I think, the harder question. And it's partly because the answer I know that you hear a lot on your podcast is like, well, just turn off your phone for 18 hours a day. And it's like, okay, it's not really an option for me. And that's both because like for work, I need a lot, but it's also just like, we live in a world where these social media companies have really captured and control information consumption and news consumption, how we relate to other people. So if you want to do those things, which you should want to do because they're basic human needs, then I'm sorry, but you need to do those things, which you should want to do because they're basic human needs, then I'm sorry, but you need to be online occasionally. You need to have your phone on.
Starting point is 00:46:29 And phone diet is a great idea, but it's not a solution. And even if it is something you can do, it's not really a solution because so many of social media's effects are atmospheric. Even if you did throw your smartphone in the trash, the world around you is still profoundly influenced by social media and you have to live in that world. That filters back to you through your friends, you're affected by it, even if you are not on Facebook or on Twitter or on Instagram yourself.
Starting point is 00:46:55 So what can you actually do as an individual when you were living in that world? And the answer that I arrived at, and it is going to sound both pat and convenient, but I promise I have a good explanation for why it is helpful, is to understand as deeply I talk to, it operates like a drug. It operates not just metaphorically, but it creates a specific chemical reaction in your brain that is addictive. And not only is it addictive, but like any drug, it changes your behavior and it changes the way that your brain works. But it's especially pernicious as a drug because its effect is invisible because you don't realize you're taking a drug when you're on Facebook. You think that you're just chatting with your friends or reading the news or whatever.
Starting point is 00:47:48 And it's also pernicious because whereas, say, alcohol's effect is hormonal, so it might affect your balance, your temper, things like that, social media's drug-like effect is on your social instincts and your social behavior, which is not something that we are used to recognizing as a drug-like effect is on your social instincts and your social behavior, which is not something that we are used to recognizing as a drug-like effect and changing how we think about right or wrong and changing how we think about our identity. about a dozen times a day. I think that's the median number. And if you're politically engaged like us, it's sometimes several dozen.
Starting point is 00:48:29 And if you're a young person and your social needs are a lot higher, it's probably also several dozen. So we are effectively living in a world, and it sounds crazy when you put it this way, but it is really true, where 80% of the population is taking a mood-altering, behavior-altering drug at least a dozen times a day, sometimes several dozen times a day. And when you know that, you're like, no wonder. Right. No wonder we're all crazy. No wonder the world is like this.
Starting point is 00:48:56 Right. It does, and of course it does explain everything, but it does explain a lot. And it really is as if, and I'm going to sound like that guy in Dr. Strangelove, but it really is as if there is something in the water that is changing all of our behavior that we're not aware of, that we don't see. And so that's why in the way that you can take drugs, you can drink safely, right? I had two cups of coffee this morning. I'm probably going to have two glasses of wine tonight. And I can take those safely because I understand their effect on me, on my behavior. I understand what I can do and can't do safely with them. And I know how to actually the effect of the drug and the effect of this training that is being induced upon me by these very powerful companies. It doesn't remove the effect, but it becomes much easier to manage it and to live with it and to
Starting point is 00:49:55 know how, and this will depend on each of us and our individual personality and what we want, how to kind of cope with having that be a part of our life because it's going to be a part of your life in a way that is a little more responsible and a little safer for us. Yeah. I mean, I did not think that answer was pat at all because that is the conclusion I've come to personally through doing this series. And, you know, a lot of my friends and people who know me were like, oh, so are you not using social media anymore? Are you throwing away your phone? Cause you did offline and they'll make fun of me when I'm still on my phone a lot, which is all warranted.
Starting point is 00:50:29 But I have changed a little bit of how much time I'm on my phone. But I've changed how I sort of view the political world and everything else and my behavior and my socialization with other people because I know what's going on. And now it is tough. And you've mentioned this a couple of times, like being in politics, I still very much believe, and you've written about it in this book, that the threat from right-wing extremism is quite real, right? And when I go on Pod Save America, we talk about this, we talk about the threat to democracy posed by Trump and a lot of the Republican Party right now, I genuinely believe it. And I don't think it's just something that social media is making me think. But I am also much more wary now when I'm looking at something online or I'm scrolling through Twitter and people are getting outraged about something or everyone's taking it up to an 11 or everyone's freaking out. Is it real?
Starting point is 00:51:21 How much is based in reality? How much is just the algorithm and everyone, you know, feeling the effects of social media? It is something that I'm a little more attentive to now. And the real struggle is trying to figure out, I mean, this is what social media does. What is real? What are the real threats? What should we genuinely be worried about? And what is the social media bullshit? Yeah, I think that's a great way to put it. It's about learning how to differentiate, and not just what you see out in the world. It's about learning how to differentiate the emotions that you feel internally, because social media is
Starting point is 00:51:54 so effective at drawing this out in a way that it doesn't feel like it's coming from Facebook; it feels like it's coming from within. And when you can see, okay, this is something that Twitter has trained me to feel really outraged about, it becomes, I think, easier to live with and to not feel crazy about. And also just to know where I should actually dedicate my energies or not. Yeah. And to, like, make sure that you're not participating in it too, which, again, I struggle with all the time. Like, we are a progressive media company. I know when we put out clips where one of us goes on a rant about something, those clips do better, right?
Starting point is 00:52:30 And it's like, oh, great. We had a clip that performed really well. And it's like, you know, is the clip performing really well? Is that the measure of success? Because the whole purpose here is to try to persuade people of a view.
Starting point is 00:52:45 And persuasion sometimes takes a different set of strategies than just righteousness and outrage. Right, and getting the people who already agree with you, too. Yeah. And it's not always necessarily a force for bad. I try to spend a lot of time in the book on Me Too, on Black Lives Matter, on the incident in Central Park in summer 2020, because I feel like these show that sometimes outrage amplification can be a force for good. And social sanction, which is what we do to punish social transgressions that are not illegal, but that are bad for the community, is a really important tool that has been democratized to a large extent by social media.
Starting point is 00:53:39 Though at some point in the book, you talked about the civil rights movement. And I do think there's a type of organizing that happens online, even when it's for media, you had all of these different civil rights groups. They knew each other. They met with each other. They understood each other. They worked together. They argued with each other. And they did it in person. And that kind of strategizing when you're in person with people, when you're organizing for a greater cause, like you can't just do that online.
Starting point is 00:54:21 Right. Yeah, this is research by Erica Chenoweth that I found to be so mind-blowing. And she studies public protests, basically, and protest movements. And something that she has found is that from, like, the end of World War II, basically, up through, like, 2010, both the rate and the success rate of protest movements was consistently going up. And then all of a sudden, around 2010, and of course we know what happened then, which is the widespread adoption of social movements was consistently going up. And then all of a sudden, around 2010, and of course, we know what happened then, which is the widespread adoption of social media, two things changed. The first were that protest movements became much more common, and they became exponentially larger. I mean, the masses that you could get out on the street
Starting point is 00:54:57 overnight, because you didn't have to do this traditional SNCC organizing on the ground that would take years; things could just go viral for free on these platforms that might be outside of the control of authoritarian governments. But the other thing that happened was that their success rate, which had been climbing for so long, suddenly plummeted, from, and I'm going to get these numbers wrong, it was like 70% to like 15%. All of a sudden they started failing. And this is something that, if you think back to the Arab Spring and basically every major protest movement since then, might sound familiar. Suddenly, there's a million people on the street, but it never seems to go anywhere. And partly that's because, like you're saying, the protest movements look much more impressive because there are so many people, but they're very flimsy, because all you have to do is read the post and go outside. You don't have to have that traditional organizing structure that has been displaced by social media. And that work doesn't happen anymore
Starting point is 00:55:51 because it happens so quickly online, and because the commitment is so low that it's very easy for things to recede. And also because authoritarian governments turn out to be really good at manipulating social media, and they have resources that even the best and the most skilled activist doesn't. And you hear this from a lot of, like, Arab Spring activists who 11 years ago, I had to do some math there, 11 years ago were, you know, naming their kids Facebook or Mark Zuckerberg, and were like, Facebook has brought liberty and democracy to our country. And then three or four years after that, were saying,
Starting point is 00:56:24 oh no, actually, now that Facebook has really come in, or Twitter or YouTube have come in, and have started to play this really important role as intermediaries in our society, they have spread this division and this fear and this conspiracy theorizing that has completely undone all the positive work that we were able to do through the platforms. Yeah. And I think on the political side, we're seeing now, over the last couple of campaign cycles, that the most successful organizing and the most successful political persuasion is happening within people's own social networks and with traditional door-to-door organizing, gathering with people in person. So it's coming back around. Right. Last question. What's your favorite way to unplug, particularly after spending however many years you have writing a book about the ills of social media?
Starting point is 00:57:11 So it's actually, and I'm not the only one to come to this, the version of unplugging that I found most helpful is, like, a half unplug, where I spend a lot more time on just old-style group chats and have, like, a couple of friend Slacks. And I've tried to use that to displace, you know, when I open up my phone and I want to look at Twitter, instead I pull up the group chat that has 30 friends on it, because there's always activity. It's a good thing to kill 30 seconds at a stoplight. It replaces a lot of those feelings, but it's real connections, so it feels much more meaningful instead of that just, like, fake, superficial dopamine rush. And you also don't have the algorithms and all these distorting features. I find it to be a much safer, healthier outlet that still allows me to be on my phone 362 hours a day.
Starting point is 00:58:01 So I would recommend that really strongly to people who are looking for a way to like unplug a little bit. Yeah. I do a lot less tweeting and a lot more talking about tweets in the news with a group of friends on a text chain. Exactly. Yes. That's right.
Starting point is 00:58:14 You copy-paste it instead of getting outraged on Twitter. Or we get outraged to each other and we can talk about it, and it's fine. Right. It's the Nicorette gum. Yeah. Max Fisher, thank you so much for coming on Offline. And everyone, go buy this book and read it, especially if you've been listening to the series. It is fantastic.
Starting point is 00:58:30 Oh, thank you so much. The book is The Chaos Machine. Thanks a lot, man. Offline is a Crooked Media production. It's written and hosted by me, John Favreau. It's produced by Austin Fisher. Andrew Chadwick is our audio editor. Kyle Seglin and Charlotte Landis, sound engineer of the show.
Starting point is 00:58:55 Jordan Katz and Kenny Siegel take care of our music. Thanks to Tanya Sominator, Michael Martinez, Andy Gardner-Bernstein, Ari Schwartz, Andy Taft, and Sandy Gerard for production support, and to our digital team, Elijah Cohn, Nar Melkonian, and Amelia Montooth, who film and share our episodes as videos every week.
Hey, Offline is finally getting its own merch collection. Hell yeah, let's go. It's a line of NFTs. That's surprising. No, I'm just kidding, Tommy. I would never do that. Got it. They're IRL products inspired by the conversations I've been having on Offline and in the cursed online ecosystem where I spend the rest of my time. Just printouts of your tweets.
Starting point is 00:59:38 The collection includes t-shirts and mugs with a field guide to Twitter's colorful archetypes, a not-until-I've-had-my-content mug, and a beautiful phone case to protect the device that's destroying your attention span and slowly sapping your life of joy. It's a dark, dark picture you're painting. It's a great, it's a phone case that just says,
Starting point is 00:59:58 fueled by negative energy on the back, which I love. It's accurate. You need it to listen to podcasts, and you've got to keep it safe. The best part? You can enjoy all this merch offline after you go to crooked.com slash store
Starting point is 01:00:08 first to check it out.
