Offline with Jon Favreau - The Neuroscience of Why We’re Susceptible to Lies, Outrage, and Fascism

Episode Date: February 18, 2024

Cass Sunstein, Harvard professor and coauthor of the forthcoming book, Look Again, joins Offline to discuss the dangers of habituation. When things become so commonplace that they blend into the background of our everyday lives, we stop appreciating the good and identifying the bad. Jon and Cass examine how authoritarian regimes are normalized, whether you can pay people to quit their social media addictions, and why repeating lies makes them more believable. But first! Max and Jon dive into Meta's decision to stop recommending political content on their platforms, President Biden's foray onto TikTok, and what a recent Selena Gomez deepfake means for the future of scamming. For a closed-captioned version of this episode, click here. For a transcript of this episode, please email transcripts@crooked.com and include the name of the podcast.

Transcript
Starting point is 00:00:00 You know, in the history books, it seems really fast: Hitler, and then war and Holocaust. That's not how it was experienced in real time. It was Hitler, and then kind of threats of various sorts, and celebrations that seemed a little kind of over the top and distracting. And books, things are happening to books, and Jews are kind of being monitored. And what happened was, as people described it in real time, they thought at first, this is like an increment that's not good, but it's not going to end up where it did. And each increment was bigger, and less horrifying than it would have been if it had happened all in a flash. So the people who were there said it was like being in a field with the corn growing and growing and growing, and then it's over your head.
Starting point is 00:00:55 But you didn't notice it going over your head until it was. I'm Jon Favreau. Welcome to Offline. I'm Max Fisher. And you just heard from this week's guest, Harvard Law Professor, world-renowned behavioral economist, and fellow Obama alum Cass Sunstein. Cass and I are old friends, having first met on the Obama campaign in 2008. I probably could have invited him on the show to talk about any one of his books.
Starting point is 00:01:21 He's written over a dozen, including the 2008 global bestseller, Nudge. But today I wanted to talk about his newest book, Look Again, which is out on February 27th. Look Again is a book about habituation: how, when things become part of our daily lives, they begin to blend into the background, preventing us from fully appreciating them or, in the worst case, understanding how they might harm us. It's a very Offline book. It sounds really good. Very Offline. Like, Cass emailed me about it, and he's like, would you want to do a quick interview on this on one of your pods? I was like, boy, do I have the pod for you.
Starting point is 00:01:56 Uh, he's kind of the third chair, you know. Even when he's not on, he's kind of here with us in spirit. Um, and uh, so it's very Offline, especially as it relates to what Cass calls in the book our technologically induced coma. And it has some important lessons about social media, misinformation, and democracy, which are all things that we love talking about here. Those are my three favorite things. That's right, yeah. But before we get to that, speaking of technologically induced comas,
Starting point is 00:02:23 Meta has announced that Instagram and Threads, the company's Twitter competitor, will stop recommending political content. The change will limit the content users see from accounts they don't follow, preventing accounts that talk about politics from appearing in the recommended feeds. Despite the potential consequences for the upcoming 2024 election, Meta has yet to clarify how they define, quote, political content. So, huge implications for how people consume news on those platforms, right? Yes. And I think Facebook itself is going to follow too. Yes. At some point they said. So just to like parse out what this means, because I think there's a
Starting point is 00:03:01 lot of confusion about what they're actually announcing. Facebook and Instagram and threads, threads, come on, are not going to be suppressing individual posts for being political and no one is getting censored. Rather, what's happening is that the algorithms will no longer show you accounts that it both considers to be politics focused, although no one knows how it determines that, and that you do not already follow. So this is going to change the platforms a lot though, I think in at least two ways. Number one, this is going to be bad for big explicitly political accounts that want to use Instagram or Facebook to reach new audiences. Like Pod Save America or Crooked Media. Exactly, right. Like this is going to affect us because this will limit our reach and our ability to reach new audiences. And that will be true, I think, for basically all media outlets, reporters, political activists, so on.
Starting point is 00:03:50 We don't know whether that will also be true for, like, influencers who sometimes post about politics, because probably Facebook doesn't even know exactly how their algorithm is going to filter this. And this is part of a message that, like, Meta has been sending to institutions like us for a while: that they kind of just don't want us anymore. Like, they've been breaking up with news for a while now. Uh, but the second, and I think probably bigger, more consequential change, even though it's not going to be as obvious initially, is how this will change the experience for users. Uh, because this does not actually mean that you will no longer ever see political posts or content, because that's not how the algorithm is filtering, right? It's filtering by accounts. So you might still see, like, say, algorithmically promoted posts from
Starting point is 00:04:38 influencers that mostly talk about, you know, sports or fashion or comedy or whatever, but that mention politics, or things that discuss politics indirectly. Um, and we have a sense, I think, for how this will play out, because back in 2018 Facebook actually tried an experiment very similar to this in a, like, handful of six or seven countries in the Global South. What they did is they put everything related to news, so they were filtering by news or by politics, on a separate feed away from the main news feed. So your primary news feed no longer had any news accounts on it. Um, but what happened, one of the, like, big important learnings from this, is that people did not actually stop caring about news. They didn't stop being interested in it. They didn't stop wanting to discuss it. They didn't stop clicking on it. So all of that desire among users in the audience to discuss and consume news content just got diverted away from credible news sources and instead toward non-authoritative accounts like random influencers or rumor mongers or like your aunt and uncle, like whatever they had to say on the topic, like whoever happened to fill this vacuum
Starting point is 00:05:46 left where the credible news source has been. And one of the countries where this experiment took place was Sri Lanka, which became actually kind of notorious for this because the result was the void where news sources had been got filled by conspiracy theories and race baiting rumors because that's what was available to fill this desire
Starting point is 00:06:04 for people in this country to talk about what's going on in their communities. And this ran so rampant that it snowballed into mass incitement to race riots that Facebook later admitted it had like more or less caused. So I think my worry is not that this new change will suppress political discussion or political influencers, even though it's going to be bad for institutions like us. But people will still want to engage with those topics. So it will degrade, I think, maybe substantially the quality of information and discussion that we get on them. I guess my question is, is it to that point, do we feel like the algorithm as it currently exists is showing people the kind of news and information that is making them more informed
Starting point is 00:06:47 and less polarized. That's a good point. This is like, it's another incremental step in a direction they've already been going. Like Adam Mosseri, who runs Instagram, said a while ago, like, we're not going to be promoting news anymore. And they're trying to pull people away from this. So I think some people look at this and they get concerned and they're like, oh my God, discussion of like, you know, racial justice won't exist anymore on these platforms. And what we've seen already from how these changes have played out is that it does.
Starting point is 00:07:15 We're just diverting that interest in those topics away from credible sources to less credible ones, which leads to a, like, much worse quality of information that we're getting and much worse quality of discussion. And it's like, Facebook's, they just want to maximize engagement. Like, that's what this is always all about. It makes me think that, and we've talked about this before, every time one of these social media companies, and especially Meta, tweaks the algorithm, there are, you've written about this obviously, there are unintended effects that are usually bad, right? So you try to solve one algorithmically generated
Starting point is 00:07:53 problem, you think you solved it, and then it creates a whole bunch of unintended consequences that you hadn't thought about. And the real problem is the existence of the algorithms themselves. That's right. And so if you go on Threads, which I do, because I'm a big Threads guy. Uh, you're threading. I'm threading left and right. So there's two columns that you can choose. There is For You, right, and there is Following, right? And I never go to the For You column, because I'm like, I don't want to just be recommended bullshit from Meta based on some algorithm about what I want. I chose to follow a certain group of people because I either trust the news sources or I like their opinions and their takes.
Starting point is 00:08:39 And so those are the people that I want to follow. That will not change at all from this and if you even if i wanted to make sure that my for you feed if i if i was into that sort of thing and i liked the for you feed you can there's a setting where you can say i do want to be recommended political content right so you can so you can do that so basically this is just saying to people like i mean we talk about like, you don't have a right to amplification. Like everyone has a right to have their voices heard. No one has a right to have their voices heard by strangers who didn't ask for it.
Starting point is 00:09:14 Right. Which is what algorithmic amplification is and is what is going to be taken away from certain accounts. Yeah. And I kind of, I mean, again, as the, as the founder of this company, I would, I would like us to be able to... We'd like to reach people. We'd like to reach people who might not necessarily have heard of Crooked Media via algorithm. But if it's not via algorithm, then like, it sounds like we're going back to the days where it's like you have to build your own following. Right. And you have to go by word of mouth and you advertise all this other kind of shit, right?
Starting point is 00:09:41 As opposed to just like spinning the wheel and hoping that the algorithm recommends you to like-minded people. But that means participating in something that we know is overall harmful to society, even if occasionally a good post or a good account like ours, which is impeccable, might benefit from it. And I think also like something really important, like you made the point, which is a good one, that anytime you try to solve one problem with an algorithmic change, you create a ton of unintended downstream consequences and like i think it's that's true and it's even worse than that because they have said before whenever they make an algorithmic change they always present it as like oh we're fixing a social problem we're fixing some harm created by the other that has never once been true every single time the reason for the change is that they have deduced that this is going to be a way to juice engagement. And they, after the fact, after they've decided we're going to push you more towards people in your community or we're going to push you towards this kind of content or that kind of content because it will juice engagement, they say, oh, by the way, it will also fix polarization. And that's never been the actual goal. It does seem like the reason, one reason they did this is they were tired of getting
Starting point is 00:11:06 shit about influencing politics, you know? And they're like, you know what then? We're out. We're out. You can, you can follow, uh, political content if you want. You can talk about politics with your own followers in your own feeds, but our algorithm, we're out of politics. So enjoy. I think that's true. But I think it's also like, since Instagram made this change that they are going to significantly downrank news, their traffic has gone way up, especially among younger people. And like the relationship between those things is probably more complicated than young people don't want to see things that are explicitly branded as news and probably has to do with the kinds of accounts that are getting promoted. But to me, I think it's absolutely correct that they're tired of getting yelled at. But I think if they thought there was the best way to get more money and more engagement was
Starting point is 00:11:58 to lean into news, I think they would do it. And I think what they have learned is that the strategy that they're going down now works for them financially i do um i have a concern about it just from a like democracy political perspective which is you know it is really hard right now for political campaigns to reach infrequent voters who do not consume a lot of news and this will make that harder right we there are and we're going to talk about TikTok in a second, but there are a shrinking number of levers that we can pull to reach voters. And fewer people are watching television, television ads.
Starting point is 00:12:38 Now Twitter's sort of broken. And the decentralization of media has now become like the disintegration of media and it's just really hard to get anyone to pay attention to anything we've talked about monoculture a few times here too and if you don't have the tool of algorithmic amplification on one of the biggest social media platforms in the world um that's going to be tough. That's going to be tougher to reach people. And not only will it be much harder for political campaigns to reach those infrequent voters,
Starting point is 00:13:10 but those infrequent voters are going to get, on their social media feeds, much more non-credible sources, rumors, misinformation, because that's what we know fills that void. Yeah. And are you saying they'll get that because the people they follow will put that in their feeds? It's because everybody has some level of desire to read about things that are happening in the world.
Starting point is 00:13:32 And they have some level of innate desire to read about news and information and social issues. And when they are denied access to news accounts or credible political accounts, as this algorithmic change is going to do. But it doesn't deny them access to it. No, you're right, it doesn't deny them access to it, but in their feed, it will not be there. What will be filling that need instead, the available supply to fill that need, that desire for political content, is going to come from one step down in terms of credibility from the sources. Hey, go to the New York Times, everyone. Go to the Washington Post.
Starting point is 00:14:06 I know. Listen to Pod Save America. They're pretty good. Listen to Offline. Great ways to get credible news. You just have to do it instead of having it shoved in your face with an algorithm. This is why I'm so,
Starting point is 00:14:16 it's a little like, and I also, I saw a lot of people say this is going to like silence. It's not going to silence people. It's just not. Right. Like you,
Starting point is 00:14:24 the people who choose to follow you and you will still be able to, anything you want to say, all those people will still be as explicitly political as it is. Right. Or less explicitly political. Everyone who follows you will be able to see it. And if they want to share it, they can retweet it or rethread it or whatever the fuck we're talking about now. So all that still exists. It's just that if you're betting on the For You feed or in the Facebook news feed or Instagram or whatever else, it's not going to show up. Which we've all been trained to rely on to some extent.
Starting point is 00:15:00 Yeah. Although I hate it. Sure. But that doesn't mean that it's not incredibly influential. Well, so the recommendations out of this is go find credible news sources to read, as opposed to just relying on your algorithm. And maybe don't rely on the algorithm and actually just think about your followers, who you are following, and then who follows you as a way to make sure that you are educated and informed about what's going on in the world. All right. Well, for TikTok users who are looking for political content, do we have good news for you? Joe Biden. Hot new influencer on the block.
Starting point is 00:15:38 Joe Biden has finally joined the platform. Last Sunday during the Super Bowl, the president's official campaign account posted its first ever TikTok in which Biden answers some either or questions about the Super Bowl and makes fun of a few right-wing conspiracy theories. Let's play it. Chiefs or Niners? Two great quarterbacks. Hard to decide.
Starting point is 00:15:56 But if I didn't say I was for the Eagles, I'd be sleeping alone. My wife's a Philly girl. Game or commercials? Game. Game or halftime show? Game. Jason Kelce or Travis Kelce? Mama Kelce. I understand she makes great chocolate chip cookies.
Starting point is 00:16:09 Are you deviously plotting to rig the season so the Chiefs would make the Super Bowl, or are the Chiefs just being a good football team? I'm getting in trouble if I talk about that. Trump or Biden? Are you kidding? Biden. I like that he had to say Biden at the end. Are you kidding?
Starting point is 00:16:22 It is Biden. Just to remind us who we're looking at, who we're voting for. Can we talk about the setting? Was he at, like, my grandmother's house? I like the little dip he did, the little dance. How many social media consultants got paid to say, Biden, do a little dip? Honestly?
Starting point is 00:16:38 I think that was, I think it was organic. All right, so youth vote secured? I mean, look, of the two of us, you are the kind of election knower here. Um, I will say that this reminds me of this legal term of art, necessary but not sufficient. To me, Biden 2024 joining TikTok, it is not sufficient to winning back the youth vote, but it is necessary. Like, we talk a lot about TikTok's dominance among young people, but you really can't overstate it. Like I was looking up the numbers this morning again. I had forgotten this. The average American aged 18 to 24 spends twice as much time on TikTok as any other platform
Starting point is 00:17:16 on average. Wow. Yeah. And nearly a third of Americans under 30 say they regularly get their news from TikTok, according to Pew. So that's a lot. Absolutely dominant. So you kind of have to be on it. I think the Biden campaign is very well aware of the limitations of the strategy, the necessary but not sufficient. They've said this. And they have said before that what's going to be more important is to have people, Biden supporters who are TikTok influencers, who have big TikTok followings, have them post content more so than whatever the campaign does.
Starting point is 00:17:54 Both of them. Exactly. Well, that's the challenge. That's the challenge. But, you know, Rob Flaherty, who is the deputy campaign manager there and was the director of digital in the White House before that, he talked to Dan for Pod Save America. He also talked to Charlie Warzel after the TikTok account happened.
Starting point is 00:18:13 And he said, in 2014's Internet, you could afford to swing big and hit home runs, big one-off campaigns for more centralized audiences. That's changed now. We're going to look for home runs, but we've got to collect singles and doubles. It's about being in more places and narrow casting and getting them to add up to a broadcast. So their strategy for TikTok is not necessarily like, we're going to post this TikTok of Biden and it's going to go viral. What they're trying to do is build an audience of supporters who, when there is a big moment and they post, then those supporters will share. So it's much more of a base play than it is a, we're going to go viral and everyone's going to go viral.
Starting point is 00:18:57 Which makes sense, because if you look at all the comments on all the Biden things, it's just like, Rafah, Rafah, Rafah. It's all about Gaza, right? And we've talked about that before, but I think they would say, like, that's not necessarily concerning to them, because the point of it is not to have everyone, it's not persuasion, right? It's like giving your supporters messaging and a tool to go out and evangelize the message. That feels like that makes sense for kind of the metabolism of TikTok and the way it works, where you've got these relatively siloed communities and relatively siloed topics with a lot of big voices within them who you want saying things that are favorable for you.
Starting point is 00:19:33 Yeah. But again, I mean, their challenge, and they would have this challenge if an 81-year-old man was not at the top of the ticket. Any campaign would have this challenge right now. Is he 81? I didn't know that. Yeah, I've heard. I don't know if you've heard. But it's just communicating in this media environment as a campaign is going to be excruciatingly difficult.
Starting point is 00:19:56 Something that I think is an incredibly hard structural disadvantage for Democratic candidates especially, and for incumbents, so he's facing both, is the growing role of negative polarization in social media. Because social media is built for negative polarization, it's built for it and it always has been, but it's like becoming much more severe, where the idea is that you don't go viral by saying, like, hey, the child tax credit was really great. You go viral by saying, this thing that happened, I'm really upset about, I'm really outraged about, which is not to say that that's not legitimate. But when that's where the overwhelming focus is, and you are both the party that is trying to do things rather than the Republican Party, which is more geared towards preventing things or tearing things down, then that's harder for you. And of course, when you're the incumbent, it's harder for you.
Starting point is 00:20:41 This is the whole challenge with the Biden people being very frustrated over no one knowing about his accomplishments, because every time you talk about the accomplishments, it doesn't get covered. And it certainly doesn't get covered on social media, because if you post about Biden's accomplishments, even if it's not Biden, if you post about anyone's accomplishments, any politician's accomplishments that you post about, not going anywhere. Because when you see it, you're like... and this is the sort of habituation I talked to Cass about, which is like, you see it, great, that's fine, whatever, but it doesn't get you going, right? Because the thing that gets you going is being outraged about something. And maybe once the, like, general starts in earnest and Trump starts to become more
Starting point is 00:21:24 of a thing, people will remember that they, however much they dislike a particular policy or decision by Biden, they dislike Trump a lot more. That's the whole, that's the bet. Yeah. That's the whole thing. Which is not, not super exciting about democracy. But where we are. Yeah. Yeah. It is worth noting that prior to this post, the White House had avoided using TikTok over national security concerns. They still are. The White House is not on TikTok, but the campaign is. Was this just, it seemed like this was just necessity here. Yeah. So like, does it concern you? I mean, so obviously whatever campaign staffer is uploading these videos is not going to be doing it from a phone that also has the nuclear launch codes on it. In fact, they don't
Starting point is 00:22:02 give campaign staffers the nuclear launch codes. They don't? Come on, just for like a day. Just for a day. I do get the kind of discomfort on national security grounds with American presidents and presidential candidates having to rest their political fortune to some degree on the algorithmic whims of a Chinese social media platform at a time of extremely high stakes geopolitical rivalry with China. And obviously there were some rounds. Put it in the not great column. Yeah, it's not ideal. And like this came up a few times when some of the Republican presidential candidates who were like really anti-China, of course, all ended up joining TikTok. And to be
Starting point is 00:22:40 honest, I was not super concerned about it then, nor am I with this. And I just think that if Beijing wanted to influence a big American presidential election, they have much more powerful ways to do that, regardless of whether or not the campaign was itself posting videos to TikTok. But on some level, this dilemma that you see Vivek Ramaswamy or Joe Biden face of, do we get on TikTok? I really kind of sympathize with it, because don't we all, every day, face the dilemma of whether to be on social platforms that we know are evil, but also rely on to, like, navigate our world because of how dominant they are. Yeah. Yeah. Our whole, our, this whole podcast is just one big, uh, what I'm saying is Joe Biden take the offline challenge
Starting point is 00:23:38 and we'll post about it on Tik TOK. Uh, all right. Finally, Selena Gomez is not giving away free Le Creuset. This week, a deepfake scam of the actress went viral. In it, an AI version of Selena's voice informs viewers that due to a packaging error, she has 3,000 Le Creusets to give away. Let's play it. Hey, everyone. It's Selena Gomez here. Due to a packaging error, we can't sell 3,000
Starting point is 00:24:06 Le Creuset cookware sets. So I'm giving them away to my loyal fans for free. If you're seeing this ad, you can get a free Le Creuset cookware set today. But just a heads up, there are a few rules. You must live in the United States and you can only get one free kitchen set per household. All you have to do is click the button below and answer a few questions these will only be given out until the end of the day today so don't hesitate supplies are running out so get yours while you can thank you guys for all your support and i hope you love your new cookware set what a deal what. What a barg. Just the dull monotone of the voice. I know.
Starting point is 00:24:48 It's so hilarious or terrifying? Or a little bit of both? I think it's both. So just to walk through this scam, what happens is that you fill out your information and it says it's $10 shipping and handling for the Le Creuset. Spoiler alert.
Starting point is 00:25:04 Spoiler alert. There's no Le Creuset. And then you start getting a $90 credit card charge every month that recurs monthly. And like most people will see that and dispute it. But the way these scams always work, like traditional scams, is that some number of people won't see it, or it will take them a while, or they won't go through the, like, hassle of disputing it. And so then these scammers will make a ton of money. And like, I have to say, the more I have thought about this video, the more it actually does kind of scare me, like, really, and like, makes me take AI seriously in a
Starting point is 00:25:36 way that i didn't before because we talked a lot about before with ais we talked about like political disinformation we talked about the threat to entertainment. And I think like part of my skepticism was always that it's a very high bar to develop some sort of AI deep fake that will convincingly persuade a large number of voters or will like write a summer blockbuster that will crowd out actual artists. But if your goal is scamming people with AI deep fakes, like that's a numbers game scam has always been a numbers game it's a much lower bar to clear and like an email scam only has to hook one in ten thousand people or one in a hundred thousand people for it to be incredibly
Starting point is 00:26:16 lucrative these ais are so cheap to make they require such a small front end investment and look i think this one is hilarious in many ways. There's a very funny tweet from someone named Alex Steed. He said, I love the reality this deepfake suggests. There's a factory mishap. Selena's manager gets a call. We need Selena to appear in a video immediately. She will not. They tell her they'll settle for a voice memo, but it's non-negotiable. She does so, but very begrudgingly. That was honestly the weirdness of her voice is what, so I watched this, this is kind of embarrassing.
Starting point is 00:26:53 I've been having trouble sleeping lately because I haven't been exercising because I'm packing for a move. I watched this on my phone at 3 a.m., like bleary, half awake. Like a good offline challenger. That's absolutely, yes. I'm trying to get my addiction levels up for the next challenge.
Starting point is 00:27:06 And for a second, I was like, wow, that's so weird that Selena Gomez is doing this for Le Creuset. And part of what fooled me, beyond being an idiot, is the weirdness of the voice. It's like she makes this announcement once a week. She's like, hey, guys, it's Selena again. Another factory error. Here we go. We got more. Just, you know the drill.
Starting point is 00:27:27 Sign up. But like this is also the current version of free AI for people, right? Like this is going to get better. So the fact that like when she's talking there, it's not like her mouth's moving, but it's not really lining up. The voice sounds too monotone.
Starting point is 00:27:44 Like it could get better. The context itself is crazy. But it doesn't even have to get better. Because it's so cheap. It's so easy for so many scammers to try it. And scams already cause, let me look it up, $9 billion a year in losses just for the Americans who get roped in. And that's just what we know about. And it's like, it's a huge impact on the economy. Like, the elderly and underprivileged people are disproportionately
Starting point is 00:28:10 affected by it a lot of people go bankrupt from it it's a really big problem and internet automated internet scamming itself is also a huge problem just because of the way that it can completely flood spaces. Like email almost crashed in like the early 2000s because email spam was becoming so lucrative and so cheap to automate that something like 99.99% of all emails were automated spam. And it's actually the- I feel like that's where we are at now.
Starting point is 00:28:43 It's gotten worse. There were real predictions in the late 90s that it was going to get so much worse that it would completely crash the internet just because there would be so much email. And actually, the machine learning algorithms that form the basis of social media were first invented to defeat email spam.
Starting point is 00:29:00 Huh. Wow. I thought you learned something new every day here. And if you want your Le Creuset. Folks, there was a factory error. We're so grateful to all of our fans. How much do you want to bet that, I don't know whether it's this year or years down the road, there's going to be a Donald Trump deepfake where he's going to be selling like Trump coins. Someone's going to be making a lot of
Starting point is 00:29:25 money from his supporters. He's already doing that. He's already going to say he's doing it with democracy now, but there'll be some money in it later. All right. Before we get to break, if you've run out of fresh crooked content for the day, no, you haven't. We have so much more to check out on YouTube. Hysteria has a series called This Fucking Guy, where they roast the men who deserve it most. Tommy has a show with Brian Tyler Cohen called Liberal Tears with rankings and drafts of everything political. There's also some
Starting point is 00:29:51 light hazing. And Lovett has a new segment called What a Weekday where he jokes about the early breaking news of the week. For all this YouTube exclusive content and more, you can head to crooked.com slash videos to watch now. Also, we got new merch in the store for kids.
Starting point is 00:30:07 Merch. Pick up brand new I Can't Vote But You Can onesies and toddler tees for all the kids in your life. Shop all Crooked Kids merch, including Read Me a Banned Book, by heading to
Starting point is 00:30:16 crooked.com slash store to shop. After the break, my conversation with Cass Sunstein about the power of noticing what's right in front of us.
Starting point is 00:30:37 Cass Sunstein, welcome to Offline. It is so great to be on Offline. It's been so long since the two of us got a chance to catch up that we actually had to schedule a podcast interview just to chat. Completely. So this is 100% social, but maybe we'll just have substance in the middle of this. Well, so in a typical year, I don't read as many books as you publish. But I'm excited to talk about your latest look again, which is about why we stop noticing things that are great and get used to things that are terrible. And a lot of what you cover in the book, we've actually talked about here on this show, social media, misinformation, the rise of authoritarianism. But before we get into all that, what made you want to write this one? Well, there's been an outpouring of books on
Starting point is 00:31:23 humanity and behavior. If you go to the airport, maybe two-thirds of them are going to be about something about behavior, and some of them aren't as terrible as others. There's one thing that's not covered at all in the books, which is the most fundamental of all, which is that if you are a dog or a cat or a horse or a person, you're very alert to change. So if it gets really cold today, my gosh, is it terrible. Whereas if you've been around cold for, let's say, a month, you're kind of used to it. And this is fundamental to living creatures, that what's fantastic and old is treated as background noise, and what's terrible and old is basically subject to the worst phrase in the English language. It is what it is. So I know you wrote this book with a neuroscientist. What does the science say about why our brains habituate, which is the term that you use in the book.
Starting point is 00:32:26 Okay, so the human brain is showing every moment decreasing sensitivity to stimuli. So if you go in cold water, your brain is going to react. It's going to show a big surprise signal, and it's going to be, I'm not the neuroscientist of the pair, so I'm going to use a very technical term, where your brain is on fire. But once you are in the cold water for, let's say, 15 minutes, your brain comes down and knows you're not in danger, and it's basically not noticing it anymore. You can demonstrate the neuroscience pretty easily if you show a colored object and put a little like a cross in the middle. This is not a religious idea. It's just a
Starting point is 00:33:12 crossed object. You stare at the cross, then the colors are going to disappear. You won't even see them anymore. That's because the brain is very sensitive to novelty and everything turns gray in the brain if it's staying steady state. Yeah, I saw that part in the book and it worked. I suddenly saw a bunch of gray. The part of the book about social media helped me understand something I've always wondered about, which is why don't we notice that too much screen time and especially too much social media can make us miserable. Can you talk about what you guys learned? Yeah.
Starting point is 00:33:47 If you put something in your mouth, let's say your dentist tells you to put something in your mouth that hurts a little bit. After a while, it's just this is how your mouth is and you eat and talk and your mouth hurts a little bit. Then when you take it out, you notice, my gosh, it's all comfortable and great now. And I don't have a little jolt to my system anymore. Social media is like that. It's like something in your mouth where I use social media, I like social media, but it's kind of in my mouth and it hurts a little bit. We habituate to it in the sense that maybe a really angry thing or a really ridiculous thing or a really false thing isn't going to trigger the surprise signal anymore. That's the habituation aspect.
Starting point is 00:34:37 But an angry thing or a false thing or a terrible thing is going to deliver a jolt, but it's a jolt which we take like a background noise in our system. And that's not great for our system. No, I also think that this is the major challenge with getting people to realize the harms of social media, just because a lot of times you'll see studies about the harms of social media, and then you'll get people who are social media users pushing back and saying, well, I like it, and it's good for me, and it helps me. And it's like, yeah, I know you like it, but guess what? We all, when you're addicted to something, you think it's fine. You don't do something and be like, oh, I know I'm addicted right now, and it's bad for me.
Starting point is 00:35:22 You keep doing it. There's recent work suggesting that Instagram and TikTok users, if they're asked how much money would you demand to be off it for a month, they'll say real money, maybe $100. That's consistent with your addiction story. If they're asked how much would you pay to be off Instagram or TikTok contingent on everyone being off Instagram or TikTok, then they say, oh, I'll pay you. I'll pay you for them. That's fascinating. That's fascinating. And what does the research
Starting point is 00:35:53 say about the benefits of taking a break from social media? So this is true for many things. Taking a break is a good thing from various because you can see them anew. For social media, if you take a break, it's going to be a good time. That is, the time off is highly likely to be a good time. If you're really intensely addicted, the first 10 minutes or first hour might be painful, but it'll be good. And then when you go back to it, probably you'll go back with less enthusiasm. And what's good about it, you'll benefit from more than you would have had you not taken the break. So breaks from social media are 90% positive.
Starting point is 00:36:33 Yeah, I was interested in some of the studies that you cited where people who deactivated their accounts then took a break but chose to reactivate them later. Is that addiction? This is a phenomenal finding. People were asked, how much would you have to be paid to be off social media? And a bunch of them said $100. And the experimenters said, okay, we'll give you $100 and you're off. And then they took the people who were off and compared the people to whom they didn't give the $100. And they found along every measure, the people who
Starting point is 00:37:05 were off had a better month. They were less depressed. They were less anxious. They were more satisfied with their lives. They were happier. Everything they threw at people, people were better off being off for that month. And then they asked those people, okay, now how much would we have to pay you to be off? And they said $100, the same, whereas they should have said, you don't have to pay us anything, we're going to stay off. Now, what we don't know is whether people are addicted. So the idea of being off another month is like, you know... uh, what's the best TV show there is? There was an old TV show called Younger. It was very good. I was mildly addicted to it. You were a Younger fan? I really liked Younger. So if that show had been taken off my screen, I would have suffered,
Starting point is 00:37:53 and maybe social media users, some of them were like that, so they're addicted. And it might be some of them, this is a happier story, that some of them are made a little depressed and a little anxious, but they learn about politics. They learn about their family. They learn about something that maybe agitates them. Hopefully that's not true in the case of learning about their family, but maybe sometimes is. And so they're a little sadder, but still it's worth it because they're just keeping up. One other point you guys make, you know, social media is a fire hose of negative information. Why are we drawn to consume and share information that makes us feel bad? Well,
Starting point is 00:38:33 it can be energizing. So if you see something, and I'm finding this myself, I confess, in politics these days, the things that are really not cheering in fact just the opposite they're energizing so they get your juices flowing and to be in a state of agitation um humanity kind of needs that even if it's not going to lengthen your life or improve your day and so outrage is uh an upper and to feel outrage or even fear can be something that people are drawn to, like a horror movie, where the person you care about is in trouble. Maybe that's what social media keeps telling us. Well, and then I think what's interesting about the science of habituation combined with this need for feeling outrage or agitation is sort of the more outrage and agitation you feel at each new piece of bad news or outrageous news, it dulls the senses of how outraged you are. Yeah, completely.
Starting point is 00:39:39 Yeah, completely. And this is worrisome for our country, I think, where there are terrible things. We can name names if you want. And those things which maybe five years ago or seven years ago would have seemed so horrifying as to be disqualifying or, you know, really universal outcry producing, they now are normalized. And the notion of normalization, that is kind of around us, but to get clarity that it's how the brain works, that normalization means we keep hearing something and then we aren't so alert to its terribleness anymore. Yeah. So I was going to cover this later, but I'll just jump into it now. You have a chapter towards the end titled The Devastatingly Incremental Nature of Descent into Fascism. Fun. Why does habituation make that possible? Well, the science of habituation tells us that there are amazing things around us, like friends and family and spouses, that we normalize and don't get as excited about as we should. And so that's a cheerful thing. But we're going to get to the distressing part.
Starting point is 00:41:01 If you look at the best contemporaneous works on Nazi Germany, that is, real-time stories, people are noticing slowly. You know, in the history books, it seems really fast: Hitler, and then war and Holocaust. That's not how it was experienced in real time. It was Hitler, and then kind of threats of various sorts, and celebrations that seemed a little kind of over the top and distracting. And books, things are happening to books, and Jews are kind of being monitored. And what happened was, as people described it in real time, they thought at first, this is like an increment that's not good, but it's not going to end up where it did. And each increment was bigger, and less horrifying than it would have been if it had happened all in a flash. So the people who were there said it was like being in a field with the corn growing and growing and growing, and then it's over your head. But you didn't notice it going over your head until it was. So the fall of democracy is often in stages, and that's what makes it possible. If it happened all at once, people would say, absolutely not.
Starting point is 00:42:33 Yeah, it made me, it was pretty frightening to read because of the obvious parallels with our current predicament with American politics right now. And it made me wonder if habituation would help explain why it can seem like we are sleepwalking into a second Trump term. I think it's completely fair to say that some of the, you know, almost surreal terribleness that we've observed is made possible through habituation. That the brain signals it as a part of normal American life. And that wouldn't have been possible in, say, 2011 when giants roamed the earth. So I guess my question is, I remember in 2016 after Trump won, and in 2017 and 2018, there was a lot of,
Starting point is 00:43:20 this is not normal, don't normalize this, don't let this become normal. And then that didn't really do that much. That kind of faded away. And what do we do in the context of saving democracy? What do we do about the fact that people in this country have maybe habituated to not only the sort of creeping threat of authoritarianism, but Trump himself and all of his antics. Well, let's talk a little bit about dishabituation in general, shall we?
Starting point is 00:43:52 So, there are some people in American history who have been, a long phrase, dishabituation entrepreneurs, where they take a practice that's normal and they, either literally or figuratively, hold a bright spotlight on it and make it seem anything but normal. Martin Luther King Jr. was a dishabituation entrepreneur, said if we're wrong, then the Constitution of the United States is wrong. Catharine MacKinnon, who named sexual harassment, she was the most important person behind that. She was a dishabituation entrepreneur. People in the arts are often dishabituation entrepreneurs. People who run podcasts are often dishabituation entrepreneurs. They hold a mirror up and get people to see something for truth in a way that you might not in your normal day. So if you measure reality against what happened, let's say, yesterday or the day before yesterday, and let's say something terrible in politics, then you might not see it. It might seem like a hill and you're a little further down
Starting point is 00:45:07 the hill than you were. But if you hold up a mirror compared to, let's say, founding ideals or let's say contemporary ideals, like the ideals of, I'll name someone, George Bush Sr. or I'll name someone else Barack Obama. And if you compare their understandings of what our country was about with certain other understandings, then the other understandings start to look really, really peculiar. And I think that people have to keep doing that i mean when you think of how things have gone right in countries that were at risk in europe let's say including over the last decade that's that's how it happened people hold up a mirror held up a mirror to uh an existing uh stephen king novel or philip roth novel in the making and then said, I don't want to live in that novel. Yeah, it made me think of the January 6th hearings,
Starting point is 00:46:10 which before they happened, I wondered if they would be successful because, you know, people, we went through January 6th and now we've moved on and voters and American people tend to not like to look in the past about things. They want to move forward. But it was really powerful. The hearings were powerful and well done, partly because of what you describe, right? Which is, it held a mirror up to both what happened on January 6th and what America at its best stands for. And I think even
Starting point is 00:46:43 just having, you know, a Republican Liz Cheney on the committee and Democrats together, right? Like it showed something bigger. And I do wonder if, as we move forward in 2024, it's going to be more important to sort of tell a story about how far we've drifted since Trump came to power in 2016 versus versus just yelling about how this isn't normal. I can say that Representative Cheney, the artist formerly known as Representative Cheney, she was my student at the University of Chicago. Really? I didn't know that.
Starting point is 00:47:17 She was a memorable student. She was terrific, and what was memorable about her was her unimpeachable integrity. And I didn't use the word impeachable as a pun, but that's just how she was and is, where she was definitely in quite far right of center in a way that was just principled. So that's just how she is. So she was kind of bound to be, whether it was against the left or the right, a person of principle who would be not going to habituate to terribleness because of her strong moral
Starting point is 00:47:54 grounding. And you're completely right that that's what that hearing did. And I'm thinking of McCarthyism as you talk. We don't talk maybe a whole lot about it, but McCarthyism was on the way towards being normalized. And in some big segments of our country, it was normalized, where McCarthy was like an icon of wisdom. And then the have you no shame, that line broke a spell, really. It disabituated people, too. In real time, the have you no shame, people felt a little like they were waking up from a dream, and they were themselves a little ashamed, many, for being McCarthy adjacent. Related to this is the propaganda that authoritarians use and the repetition of propaganda that they use. You write about that. You have a chapter on misinformation. How does habituation explain why we are
Starting point is 00:49:11 so willing to believe things that aren't true? Okay, this is cool and concerning. So, I don't know if you heard that Tiger Woods, it's very surprising this morning, Tiger Woods announced his retirement from golf and he's actually running for president. Okay, so you know it's false, and the people listening know what I just said was false, and I chose something that was, uh, palpably false. But everyone who heard that, including, I fear, both of us, is going to have in some part of the brain a question. Is Tiger Woods running for president? So this is called truth bias, where the idea is if people hear something, they tend to believe it in some measure just by virtue of the fact that it was said, and it's connected with something
Starting point is 00:49:58 even more dramatic called the illusory truth effect, which is if a falsehood is repeated multiple times, and we'll get to why this is so, people tend to believe it's true, even if they have no reason to think it's true. So if I told you 17 times that Tiger Woods is running for president, the likelihood that you'd think, well, maybe he's running for president would jump. And the illusory truth effect, that is, repetition breeds a belief in truth, has been observed in people who have a lot of education, people who don't have much, people who are young, people who are old, men and women, every demographic category. The only people who don't show the illusory truth effect is people with Alzheimer's, and that's understandable, because they don't remember, so
Starting point is 00:50:42 they don't have the phenomenon looking. Okay, why is there the truth bias and why is there the illusory truth effect? It's that if something is easy to process in the head, this is the neuroscience of it, we tend to think it's true. Easier processing leads to a belief in the truthfulness of something. And I confess there was something in my mind, I hope not in yours, at one point where the notion of Hillary Clinton's emails created a kind of sinking feeling associated not just with the propagandistic nature of it, but associated with, I think, the false view that she actually did something quite wrong, which I don't believe. But because I heard it so many times, it's easy to process, and some part of my head I kind of thought, well, maybe. Okay, so the brain processes easily something
Starting point is 00:51:33 that's said at least twice, and that means people will tend to think it's so. And if you think something about health, maybe something about vitamin C, or if you think something about politics, maybe about President Biden, it might have no resemblance to the truth. It might be just you've heard it a number of times, and it's hard to get out of the head the thought that that's just a frequently repeated lie. So with all this research in mind, are there lessons for people who are fighting misinformation and trying to develop messages that people believe? Completely. So here's an experiment that helps explain what you do and what you don't do.
Starting point is 00:52:15 If people see on a glass the following sign, no cockroach was ever in this glass, they don't want to drink from the glass that's because the word cockroach and this glass are in the same sentence and people associate the two so to repeat a lie isn't a very good thing even in the context of debunking it. If it's debunked in a way that doesn't include the content of the lie itself, that's smart. If the source of the lie is challenged rather than the content of the lie, that's smart. On social media, if they say, here's the thing, it's false,
Starting point is 00:53:06 that's better than not having the false label, but it's not better than not circulating the thing as much or than taking it down because it's false. So, I mean, you have, if I may say, a completely brilliant way of repeating lies in a way that makes them seem hilariously ridiculous. That's smart also, because then people are laughing and thinking, my gosh, rather than, oh. And if they laugh and think it's ridiculous rather than, oh, that's effective. As you say that, I'm thinking about our old boss,
Starting point is 00:53:41 Barack Obama, and I went through many speeches with him where he would say, there's all these conspiracies out there. There's these lies that Republicans are telling. And I know you're not supposed to repeat them, but I want to take them on. We did this most notably in the speech to Congress about the Affordable Care Act. And he wanted to go through death panels. You know, the healthcare is going to be for undocumented immigrants. He wanted to go through death panels. You know, the healthcare is going to be for undocumented immigrants. He wanted to go through each one and debunk it, which we ended up doing.
Starting point is 00:54:09 But the whole time I was trying to think, I get his concern about he has to address what's out there already because people are believing it. And so I think that impulse was good, but also I don't want to repeat it for people because of this very problem that you're talking about. Yeah, it's a risky strategy. Now, our old boss, in my view, can do no wrong. Same.
Starting point is 00:54:33 But it might have been wrong. Yeah. But what I think is what's interesting is that you point out there's ways to address people's concerns that develop because of misinformation directly without repeating the actual lie or falsehood itself and still be effective. Do a poison pill that's directed at the purveyor of the lie rather than a kind of serious engagement with the content of the lie. That can be more effective. Can you talk about the experiment that was done with the trust and distrust options on social media? I found that fascinating. So this was an experiment which created a little social media platform in which people
Starting point is 00:55:19 could press trust rather than like, or they could press distrust rather than dislike. And it turned out that the potential trust button both got more truthful things circulated and got people more reluctant to circulate things that weren't truthful. So people were incentivized to say trustworthy things, and trustworthy things spread, which would be much more effective than like and dislike if the goal is to get truth out there. Yeah, I have mostly criticisms for Twitter since Elon Musk has taken over, but I do think that the community notes feature is possibly beneficial for some of the reasons you're talking about? Yeah, might well be. So what we want to do is incentivize people to communicate, not whether they're excited about some statement,
Starting point is 00:56:34 but whether they think it's credible. So you end the book by arguing that seeing the world from new perspectives is one way to overcome habituation. And note that, quote, more than at any other point in history human beings can be placed in contact with people dissimilar to themselves and with modes of thought and action unlike those with which they are familiar which sounds right and yet like as the world has become more connected um it hasn't seemed to improve
Starting point is 00:57:03 mental health lessened our screen addiction, slowed the spread of misinformation, prevented the rise of authoritarianism. And it seems like in many cases, the opposite has happened. Why do you think that is? Okay, let's talk about two people, one of whom is famous. And the famous one is Julia Roberts, who is a hero of our book. Julia Roberts was interviewed not long ago and asked, what's a perfect day? And she said, a perfect day for me is I wake up, I make breakfast for my kids, I take them to school, I start to get ready to have lunch with my husband. I do that, then it's starting to be time to get ready to pick up. And then she stops herself and she says, it's boring.
Starting point is 00:57:46 She says, because I'm an actor, because of my job, I go away. And when I come back, it's surrounded by pixie dust. It re-sparkles. And so what she's saying is that her going away makes what she takes for granted, what she would otherwise take for granted, that's amazing. You know, she has a great life. She sees it as fantastic. Okay, the other story is not a famous person.
Starting point is 00:58:14 Someone in China. I taught in China a few decades ago, actually. And I taught a U.S. Supreme Court case about the right to travel. And as I taught it, it's a pretty, you know, ordinary Supreme Court case. I taught it because it was kind of ordinary and simple. And I felt something was happening in the room. They were getting very upset and sad. It was like I had told them something very tragic and terrible. And I asked them after, they were all in the Communist Party, by the way, I asked one of them afterwards, why were you all so sad?
Starting point is 00:58:52 They said, as we learn about America, China seems very dark. And that was the theme of my weeks there, where I wasn't trying to do anything. I was just trying to deal with American law. They thought a right to travel, freedom of speech, freedom of religion. This is like a dream of possibility. And a number of those things aren't perfect in China. Can we agree on that? But some of those people are now working on things that make things, let's say, a little further from terrible than they would otherwise be.
Starting point is 00:59:28 Some of those people now are in their 50s and 60s. And so there are people in Turkey and in, well, even Russia and in countries where things aren't going so well for whom exposure to other things have created possibility and practices. And in the U.S., you know, for all the serious what's-the-right-word challenges we're facing now, if you look at how things are now compared to how things were, let's say, in 1955, that's because of exposure to stuff so i clerked for thurgood marshall who went to howard law school and was exposed to a thousand and one things and thought you know segregation what's that about and the idea that that was normal and you know something to which we should habituate he was kind of a rebellious type anyhow but but he was exposed to stuff that made him think this could be a rebellion that would work. So the day is young. And it sounds like the lesson here is a very old one and a moral one, which is it is valuable to step in someone's shoes and walk around in them and try to learn about different perspectives, experience different things, get outside your own mind, try to figure out, have empathy, develop empathy for other people.
Starting point is 01:00:54 And that would make, so it's really sort of an individual call to action more than anything else to help solve some of the problems that we're facing right now. Completely. Both with respect to individual things that are awful that people are facing that maybe some of us aren't subjected to but can help with. And also, we've known each other a long time. For me to write what is in some ways a self-help book is really against type because you don't know how to write a self-help book or to engage in self-help nonetheless this book is it has a little bit of a flavor of that where uh for people to you know to be thrilled by something that they take for granted which might be someone who likes them who's willing to be married to them or a house where the heating works or the air conditioning works
Starting point is 01:01:46 or something outside like a tree that's pretty fantastic outside. Yeah, well, that's why I thought it would be, as soon as you told me about it, I thought it would be perfect for this podcast because we get a little self-helpy on Offline. And so this was, look again, fantastic book. It's out February 27th. Cass, thanks for stopping by Offline. This was fun. Thank Again, fantastic book. It's out February 27th.
Starting point is 01:02:05 Cass, thanks for stopping by Offline. This was fun. Thank you. Great pleasure. Offline is a Crooked Media production. It's written and hosted by me, Jon Favreau, along with Max Fisher. It's produced by Austin Fisher. Emma Illick-Frank is our associate producer. Andrew Chadwick is our sound editor. Kyle Seglin, Charlotte Landis, and Vasilis Fotopoulos provide audio support to the show. Jordan Katz and Kenny Siegel take care of our music. Thanks to Michael Martinez, Ari Schwartz,
Starting point is 01:02:40 Madeline Herringer, Reid Cherlin, and Andy Taft for production support. And to our digital team, Elijah Cohn and Dilan Villanueva, who film and share our episodes as videos every week. The Assignment with Audie Cornish is all about smart journalism and fascinating topics. Each week, Audie Cornish pulls listeners out of their digital echo chambers to hear from the people whose lives intersect with the news cycle. New episodes come out every Monday and Thursday. Monday focuses on politics and takes a closer look at how the 2024 election is shaping up. Thursday's episodes dive deep on a particular story in the headlines and how that story may impact your life. Listen to The Assignment with Audie Cornish from CNN Audio wherever you get your podcasts.
