Offline with Jon Favreau - Trump’s Stock Crash, AI Gets Junkier, and Paying to Delete Social Media

Episode Date: April 14, 2024

Is FOMO the only thing keeping you on social media? Have we already reached peak artificial intelligence? And are Max and Jon too old to enjoy Glorb, a SpongeBob SquarePants AI that’s become the hottest rapper on the internet? The guys cheer on the nosedive of Trump’s media company stocks, break down the latest research in why your friends want you to quit social media, and answer mailbag questions like “will Jon ever stop getting in Twitter fights?” For a closed-captioned version of this episode, click here. For a transcript of this episode, please email transcripts@crooked.com and include the name of the podcast.

Transcript
Starting point is 00:00:00 I think we're going to go one phone on the trip. Wow! Are you going to do it so that if we use the phone, we both have to be there to use the phone? Because otherwise, Julie's just going to be using the phone when she's by herself. She's going to be like, I'm going to go take a walk with the phone. Good luck!
Starting point is 00:00:14 Well, you know what? If she does, then I am off the phone. So it would be just the idea. I mean, we'll see what happens. But relationships are all about keeping score. That's true. And I'm going to fucking win. I'm Jon Favreau. I'm Max Fisher. So we skipped talking about the news last week
Starting point is 00:00:36 because my conversation with Eitan Hirsch about political hobbyism went long. By the way, go check it out if you haven't yet. Really interesting. Really cool. So this week, we're going to cover a few stories that caught our eye and take some of your questions. We're going to talk about whether the current iteration of AI is a bunch of junk or awesome because of an AI rapper named Glorb. It's both, it turns out. We're also going to take a look at a new study that suggests the only thing that's keeping us on social media is FOMO.
Starting point is 00:01:05 Boy, do I have some thoughts on this one. I do too. I figured you might. But first, the stock price of the Trump Media and Technology Group, the new holding company of Truth Social, is in free fall. Yeah, man, I took a bath on this one, John. I went long on Trump Media and I am dying. Boy, boy, are you regretting. I'm going to have to step out and call my broker again. So after hitting a high of $79 a share on March 26th. Which is nuts. That's too much. It's too much for not a real company, right?
Starting point is 00:01:38 The company's stock has plunged by more than 50%, slashing the worth of Trump's stake by more than $3 billion in less than two weeks and knocking the former president out of Bloomberg's list of 500 wealthiest people. What's he going to do now? Start another media company, probably. The stock's initial inflated value, $6.7 billion for a company that reported revenue of just $4.1 million. Against $52 million in losses. That has prompted what I'd say are apt comparisons to GameStop, AMC, and other recent meme stocks. They're way too generous to Trump media.
Starting point is 00:02:15 GameStop is a company that produces goods and services. They have revenue. You're not a consumer of Trump's truths? You don't think that's something you look forward to? Even if I were, that would not give them anything on their balance sheets, which it turns out do not exist. Because those truths are free. Post-the-posts are still free. Information wants to be free, John.
Starting point is 00:02:34 Max, you covered this on How We Got Here this week. Is this just a new frontier in the MAGA grifting economy? That's what it seems like. So I think it is an attempted new frontier in the grifting economy but it is following importantly the tech grifting economy which is the greatest grifting economy many of us have ever seen so like my big takeaway from digging into like how did this company come to be valued at seven billion dollars is that the stock market is fake and money is fake and all of these valuations are just like it's whatever people say they are worth which honestly makes it the most perfect place for donald it's true yeah
Starting point is 00:03:10 that's right the king of invented imaginary value yeah um because this is just like it's not just that it's a bad company like it's the literally the value of the stock is not worth the paper that it is printed on this is a company with no revenue, with no prospect of having any revenue. Multiple investigations, audits. Right. When they first went public, their auditor had to put out what's called a going concern notice, which means we, the auditor of this company, say that it's not a viable company. So it was like, how did this company everybody knew was dog shit? How did the stock market value it at $7 million? Because a bunch of speculators were
Starting point is 00:03:49 buying it. And that's like my initial concern. And like, you heard me kind of freaking out about this when this happened. This is like, all of the people who are forever have been buying whatever gold off of Fox News, daytime ads are like, I need to go buy the Alex Jones hair pills. Instead, they're gonna buy trump media stock and i was like this is scary because the right-wing grift economy has figured out how to like leverage the stock market and leverage all these wall street guys in the wall street economy to make even more money off of stealing from their supporters basically but the fact that like which is their business model and their political model. That's right.
Starting point is 00:04:28 It just works no matter what. Yeah. But the fact that they lost half of that value within like two weeks has made me calm down a little bit because it turns out if you get a bunch of those people to buy in at first right away, but then they don't keep buying, then that value drops immediately. So you think this is sort of it? Like that's not going to go back up? So I was wondering if it's going to rise and fall based on Trump's political standing and like if he's doing well and he jumps ahead five points in the polls, then it goes up. And then if God forbid he wins in November, then then it becomes a slush fund for influence.
Starting point is 00:04:58 Right. Then foreign governments are pouring money in their corporations that are looking for something out of the White House. Well, that was what I listened to all these interviews with, like, Wall Street insiders to be like, who the hell is buying this stock thinking it was just going to be, like, Fox News people who want to be like, I support the president. So I'm going to, like, buy stock in his dog shit company. Right. But they were saying that, like. Also the Saudis. Yeah. Well, that was the thing is that, like, if he becomes president again, God forbid, then, like, obviously the Saudis and the Kuwaitis will just give a bunch of money to this company or do a bunch of business with it. Like buy out a $100 billion truth social contract to build Nome, their like giant castle in the desert.
Starting point is 00:05:35 So if you are a like ruthless investor with no morals, I'm sorry, if you are an investor is what I meant to say. And if it looks like he's going to become president, then like why would you not get in on this because the value of the stock is going to go up if it's just going to become a grift machine, which is horrible and disturbing. But it seems like that's not happening yet at least. And it does seem like this is all, aside from all the investigations
Starting point is 00:05:59 into a whole bunch of other chicanery around this company, the idea that you can have this sort of grifting set up with a fake company is legal, right? I mean, sure. There are all of these like right wing companies. There's like Public Square, which is like right wing Amazon clone. There's like Rumble, the right wing YouTube club. And they go public. And the idea is like own the libs by buying a share in our company that will drive up the value and the shareholders can sell and make all this money. But it has never worked because that is actually, you need sustained business on the stock in order for someone like Donald Trump to get his money out. You need the Ponzi scheme to keep going.
Starting point is 00:06:33 Exactly. Right. And this is what Silicon Valley has always been so good at, is they figure out ways to keep selling it to the next person. I read, I think it was a Bloomberg story about this, a comparison that it's like a Trump coin. Isn't it more like a Bitcoin? It's a lot like a Trump coin. Right. And it's in the fact that the Trump stock, much like a Bitcoin, has no inherent value. Right. Because it is not tied to an asset that is worth anything.
Starting point is 00:06:54 Right. And it's just pure speculation. It's pure speculation. It's pure like projection of identity. Right. Like people make this comparison to GameStop, which was the first big meme stock in 2021. But GameStop like which was the first big meme stock in 2021. But GameStop is a real company. And people had a legitimate investor-based reason to want to drive up that stock, which was
Starting point is 00:07:13 that they thought that it was undervalued, that all these Wall Street firms were heavily shorting it, and that actually it was going to turn its business around. And they were right, and it did. And this is really, truly just like you're saying, it's Bitcoin. It's just purely speculative. But because there are all of these other market forces, you can short a stock. You can't short Bitcoin. There are a lot of pressures pushing it down, which is why the stock keeps dropping because people keep buying shorts on it. How many shares do you own?
Starting point is 00:07:41 Well, that's going to depend on how low it drops and then how high we think we're going to go up again. Okay, well, we'll check back in on that. Speaking of junk on the internet, it's been a minute since we checked in on the state of artificial intelligence. And after some initial hot takes from everyone on the internet about the future of AI... Not from us, though.
Starting point is 00:07:57 No, lukewarm at best. Which is where the current era of AI seems to be right now. I does feel like they've caught up with us. It really does feel like that've caught up with us. Yeah? It really does feel like that. We were the pioneers. Just this week, the Financial Times published a piece about the rise of AI skepticism.
Starting point is 00:08:13 Ian Bogost, who's been a guest on Offline, wrote a great piece in The Atlantic titled AI Has Lost Its Magic. And neuroscientist Eric Hull published an op-ed at the New York Times titled AI-Generated Garbage Is Polluting Our Culture, Is It Ever? We've certainly talked about some of that junk here, like when you tried to buy all that cookware from Selena Gomez and it did not work out for you.
Starting point is 00:08:35 No, they sent me a shipping update. It's coming. It's coming. Okay, good. That's good to know. With the Snoop Dogg whatever thing that I bought. I don't even remember what the Snoop Dogg scam was. I don't either. We've also talked about some of the ways that AI could be transformative, frighteningly so. Sure.
Starting point is 00:08:50 So what do you think here? Give me a read on the state of AI right now. So I do not think that it is like all going to be garbage, but I think that we, and certainly I, like really underestimated how much we were going to be just completely oversaturated with AI produced junk. Just like this fire hose of shitty, cheap, obviously fake content. Like every time you scroll a social media feed, especially on Facebook, which is really disturbing because it's I think it's like, you know, it's a lot of older people who are on there.
Starting point is 00:09:22 Maybe can't differentiate. You see a ton of shitty AI-generated images. Google results now, you just get pages and pages of AI-generated pages. I haven't noticed that. Oh, yeah. It's full of, because, you know, people have figured out that you put some prompts into a chat GPT, whatever. You generate a trillion web pages for zero dollars. And then you try to, to like juice Google search results with
Starting point is 00:09:45 it so that people click on it. You sell ads against that. Teachers talk about being flooded with AI papers. I feel like every week you hear another story about like some high school teachers like I told my students they could have a chat GPT amnesty where they would not get in trouble if they admitted if they plagiarized their paper with jet gpt and like every hand goes up in the classroom and there are all of these studies that like even academic like peer reviews it turns out like they're a lot leaning on ai um and i just like i keep thinking about do you know the great pacific garbage patch have you heard about this what is it refresh my memory so garbage ends up in the ocean yeah right they're like like microplastics, whatever junk. And because of the ocean currents,
Starting point is 00:10:28 it all ends up in the same place in the middle of the Pacific. So there's this giant, it's this like continent made of garbage, three times the size of France in the middle of the Pacific Ocean. And I feel like that's what I'm swimming in. I feel like I'm in the great Pacific garbage patch
Starting point is 00:10:43 of AI content. I haven't encountered it as much. Like, I see the images pop up here and there, and you can tell it's an AI-generated image, and you can see the ChatGPT stuff a mile away. But I have not been exposed to it as much as I thought. Maybe I'm just overly sensitive to it because it offends me on some basic level.
Starting point is 00:11:00 Oh, it's definitely offensive. I mean, my view on all of this is, like, of course there's going to be glitches. The FT article was all about how these, like, large language models keep having these glitches and problems. And they, like, freeze up. Right.
Starting point is 00:11:15 And then there's some people who think that, like, this is going to, like, poison them because then as they scrape the internet, they scrape more. It gets fed back into it. Yeah, it gets fed back in and it's just horrible. So, like, that's obviously going to happen, especially at the beginning. It gets fed back into it. Yeah, it gets fed back in and it's just horrible. So like that's obviously going to happen
Starting point is 00:11:25 especially at the beginning and there's going to be failures. Sure. I think it's, I also think it's like way too early. We thought this from the beginning that it's like way too early
Starting point is 00:11:33 to conclude that the robots are taken over. Sure, right. That the apocalypse is nigh. Right. But like I don't think you have to buy into the more apocalyptic
Starting point is 00:11:41 grandiose versions of like what AI could be to think that it could like significantly reshape more apocalyptic, grandiose versions of what AI could be to think that it could significantly reshape society. Sure. Our friends at Hard Fork just did an episode about all the companies that are
Starting point is 00:11:56 already using AI to do all or at least some of the jobs that humans do. So there's already AI-based layoffs that are happening. And of course, the companies aren't saying that it's because of AI. So there's already AI-based layoffs that are happening. And of course, the companies aren't saying that it's because of AI.
Starting point is 00:12:08 So I think that we are in the very early stages of this. And it's way too early to draw conclusions. But one of the big themes in this show has been the original promise of the internet
Starting point is 00:12:21 and social media connecting humanity together has not come to pass. I wouldn't say if I feel more connected to alt-right trolls. Right. Yeah. Instead it's like mostly fueled our, our, our, uh, current hellscape, but it has, it's been somewhere in the middle. Like there have been good parts, there have been bad parts and then there's been plenty of junk. Right. Right. That's been the story of the internet and social media. Yeah. It's possible that that could be the story of AI as well.
Starting point is 00:12:50 Some really bad parts. Yeah. Some really helpful parts. And then just a lot of shit in the middle. I think that's something. I think that's exactly right. And something that I keep trying to go back to as a kind of like starting point heuristic is that these companies are all just trying to make money. And they are all. I don't say that as it like therefore they're evil and we hate the companies um although it doesn't create
Starting point is 00:13:09 great incentives but i think what that tells us is they are going to trade to try to make money in the established ways that you can make money like this is why we saw these social media companies that created this technology that was so powerful that can like really peer inside our minds can deduce what our like impulses and instincts are and what will most please and outrage and satisfy us and what they geared all that towards was selling ads because that's where the market was and where the market is for ai currently is um consulting and outsourcing to businesses and advertising and that's a good point that's that doesn't necessarily mean that it's going to be a terrible thing or going to be a great thing, but I think that I always try to remind myself
Starting point is 00:13:49 there's a baseline assumption that they're selling like fancy Microsoft Word tools, Microsoft Office tools for office places, which is fine, but it's going to be a lot of stuff like what you talked about was on the hard fork episode of like replacing people who are doing kind of menial tasks. I will say that I have tried to use a lot of these AI chatbots to do blunt force research. Like this week we did a big episode on how we got here about student loans. And I was trying to, I was like, oh, this is perfect for AI. I want to learn about student loans.
Starting point is 00:14:22 Well, I'm looking at all this stuff,, you know, I wanted to know, like, what was the average cost of college in the U.S. in 1958 in today's dollars? Or like, what was the rate of change in student loans? Seems like a perfect question for. Right. Yeah. It's just like digging into that. And it was garbage. Like, I mean, not even giving me like, oh, it turns out this data was off a little bit, but it like didn't understand the questions. It was pointing me in the wrong places. It was like all this work to try to get the right thing out of it. Now I might just be a fucking moron who was using it incorrectly. Like seriously, I would not rule out the possibility that you just have to be smarter about using it. But the like amount of knowledge work that it was going to do has not borne out, but we know that it is good at spamming and we know that spamming
Starting point is 00:15:02 has a lot of commercial potential to it. So I think that no matter where AI goes, for better or worse, bigger or smaller, I think the spamming is going to be with us forever because it's successful at that, it's cheap to do, and it makes money. And I think because it's still relatively early, that it could very well improve by leaps and bounds. For sure. The LLMs, right? The large language models. We just don't know yet. I also think it's very possible that both the best and worst consequences of AI
Starting point is 00:15:33 have not even been conceived of yet. What do you think? What are the kinds of things? I don't even know. That's what I'm saying. When technology advances like this, and especially advances this quickly, you start off thinking oh it's going to help this problem or it's going to cause this problem right and then just something that no one
Starting point is 00:15:49 has thought of yeah just could come to pass it is always a big like even when social media was first becoming really big in the like early 2010s and it was like oh it's going to bring revolution or is it going to bring chaos i did not at foresee, and the people I was talking to did not foresee, that actually its big effect was going to be magnifying certain innate tendencies that we had in us as individuals and as societies. It was going to be kind of pulling us in certain directions. And that was when...
Starting point is 00:16:15 Didn't think we were going to break our brains. Right, right. And in the very specific, consistent way. And looking back, the evidence was there. You can go back and be like, oh, this is what the news feed did. And you can like trace ahead. But you're right.
Starting point is 00:16:28 It is very hard to see where it's going to go when the technology is this powerful and is this kind of everywhere in our day-to-day lives. of social media that we've already talked about by continuing to deepen the relationship between humans and the illusion of connection with someone else that is now not going to be there. So a lot more people just like lonely, using their AI bots to help them work and not like communicating and having relationships with other people. But it's all like that part worries me.
Starting point is 00:17:03 This information problem, obviously we've talked about, like every problem that exists with other people, but it's all like that. That part worries me. Yes. Information problem. Obviously we've talked about like every problem that exists with social media, I feel like could be magnified by the advance of AI. That's a good point. And I often, it's making me think of when I'm often talking to people about who were skeptical of the idea that social media could be bad for you. Something I try to emphasize is like you log onto your platform, Twitter, Facebook, facebook whatever and you think that you are using this app to experience a relationship with people on the other side of it but in fact what you are having is a relationship with a social media algorithm that is using real people as like sock puppets to say what it wants to say or to show you the kinds of content that it wants
Starting point is 00:17:38 to show you so to some extent we already live in a world where we all have relationships with these powerful ais that are manipulating us to try to get us to do certain things. So yeah, the idea of what is the LLM driven social network? I don't know. I mean, when Sora style models start mass producing videos for a TikTok like platform, what is that going to be like? Last week, NBC News published a piece about Glorb. Oh my God, I'm excited about this. An artist that has been remixing popular rap songs with AI versions of Spongebob Squarepants characters. Who cares, you may ask?
Starting point is 00:18:25 Well, Glorb has millions of streams on Spotify, TikTok, and YouTube. Half a million followers on YouTube alone. To sum it up in a tweet from Amular93, quote, AI is trash, but I can't stop listening to that Glorb, question mark, Spongebob song on TikTok. Austin, can we play the Glorb, question mark, Spongebob song on TikTok. Austin, can we play the Glorb? Wow. Heads are bopping in the studio. Kind of a banger.
Starting point is 00:19:09 Do you like it? I don't like it, but you have to understand. I drive my child to school twice a week now, and I'm forced to listen to the Hot Wheels Let's Race theme song, Peppa Pig, and Octonauts on repeat, right? That's what I listen to. I listen to it, and I was like, oh, I'm old. That's what's happening here. It on repeat, right? That's what I listen to. Yeah, I listen to it and I was like, oh, I'm old. That's what's happening here. It's an earworm. That's what I'm hearing. This is what you listen to at a club with bottle service.
Starting point is 00:19:32 We were talking about bottle service before the show started. That's what we talk about usually before the show starts. That's our warm-up. Bottle service clubs. Yeah, that's right. This is wild. Now, you can see the iteration of this where, again, mass-producing crap and mass-produced crap. Sure.
Starting point is 00:19:49 Traditionally, historically, very popular in America. That's all I'm saying. So, okay. First off, can I read a line from you from this article that I really enjoyed? Quote, Glorb, who declined to be interviewed, isn't publicly affiliated with Nickelodeon. Nickelodeon's like... Yeah, they were happy. They were happy that Glorb, who declined to be interviewed...
Starting point is 00:20:10 I wonder if Nickelodeon's going to send a cease and desist to Glorb. I think Nickelodeon probably loves it. Yeah, that's probably right. So, okay. Glorb was not to my taste. This is not my music. That's not what you listen to now. But I think it's cool.
Starting point is 00:20:22 I think it's really cool that this is happening, that someone is finding a creative way to use AI to do something that you wouldn't anticipate. They're using it as like a tool. Like it really reminds me of the first days of synths and drum machines. When like everyone
Starting point is 00:20:37 was really afraid that like, oh, this sounds like a bad piano or a bad drum. People are going to use this to replace drummers, replace piano players. And music is just going to suck now. It's going to use this to replace drummers, replace piano players, and music is just going to suck now. It's going to sound like this shitty drum machine. But in fact, what happened was that like, creative people figured out how to use it to do things
Starting point is 00:20:53 and create kinds of music that you could not do before. And that's how we got like craft work. Yeah. Now this is going to be, I think it's, the tool is the right world, right? Like, and in the right hands, it can enhance creativity. And you can have silly things like Glorb, but I would bet that this use of AI
Starting point is 00:21:12 is probably going to produce some good stuff. I think we're going to see a lot of really cool stuff with AI in music, partly because we've talked about this before, the music industry is really embracing it. Like they're really trying to work out the rights question with how to use people's voices. That's the big challenge, right?
Starting point is 00:21:28 Sure. And again, in all this stuff, there's pros and cons. And the con that they're going to have to overcome here is the copyright, the voices, all that kind of stuff. Right. But I think part of what makes me optimistic about it is that the uncanny valley-ness of AI. If you try to make an AI movie and the script sounds like it was written by chat gpt it would suck if the visuals
Starting point is 00:21:49 look like they were made by sore it would suck but with music you kind of you're okay with the fact that it sounds weird or artificial and that can actually be cool and can add to it so i think this is where a lot of the like really exciting work with ai is going to take place. Right, listening to Glorb on the way home. Banging that Glorb. In other news, this is some surprising news. A recent study published in the National Bureau of Economic Research by Leonardo Burstyn, Benjamin Handel, Rafael Jimenez, and Christopher Roth has confirmed that nobody actually likes being on social media. I sure don't.
Starting point is 00:22:23 I know of one podcast. You don't need a study. That's a fancy study. You're a highfalutin study. Give us the grant. This one has some interesting findings. 53% of Instagram users would prefer to live in a world where the platform does not exist. Over 90% of iPhone users would prefer to live in a world where a new iPhone was released every other year.
Starting point is 00:22:42 That was wild. And that 79% of Instagram users cite FOMO as their main motive for social media consumption. What did you think of the study? Is FOMO the only reason any of us are still on social media? So I think we, let's get into the, like the FOMO thing. And because I think they a little bit misstate when they use FOMO, I think they're a little bit misusing that word. I thought it was really valuable about this study. We have had studies before that show that people want to be off of social media but have a hard time making themselves do it. And whenever some social science experiment comes along, gives them excuse to do it, they're much happier.
Starting point is 00:23:16 Their life satisfaction goes up. They love it. They want to stay off forever. A lot of times they will stay off for a really long time. I think the contribution of this study is that it shows us that people don't just want to be off social media themselves. They want everybody else to be off of social media because we all increasingly understand social media as something that we are collectively obligated to. We feel it as a collective social obligation. So we kind of know that social media
Starting point is 00:23:42 addiction is not just an individual problem, but it is a problem that as long as society itself was on it, like I'm going to feel compelled, pressured, obligated to be on there. I thought it was interesting how they did the survey. So they asked all these students, like, would you prefer that we keep everyone's social media usage as they are? Or do you prefer that we actually went ahead and deactivated the accounts of your classmates and yourself for four weeks? And if so, how much are you willing to pay for us to implement this collective deactivation? So the same students that you would have to pay for them to quit social media alone, they asked that question and they said, oh, you'd have to pay us.
Starting point is 00:24:17 They were willing to pay them to remove social media from their network. And it switched that fast. It's amazing that not only are these services free, which is what they're worth to us, but we would actually pay money to be off of them. That is how much we hate social media. But not if it was just us. That's right, right.
Starting point is 00:24:37 Then you have to pay up. Isn't that funny? It's not if it's just you as an individual. I mean, it's not the traditional use of FOMO, but it's this obligation, like you said. Right, yeah. I think it's not the traditional use of FOMO, but it is a, it's this obligation, like you said. Yeah. I think that's,
Starting point is 00:24:47 I think when they use, they use it in kind of a tongue in cheek way. And I think what they really mean is this sense that like, that's where society is. Like that's where our friends are. That's where our family is. So we feel like we have to be there as well. Or we feel like it's normative,
Starting point is 00:25:00 like it's expected of us. Like it really made me think about how, I think important it is to set norms with kids now who are in school about not being on social media, not being on your phone. And again, like you read all of these anecdotes. I like cannot get enough of these stories about like some school in Texas made everyone give their phone up and the kids are all like, hallelujah, this is amazing. They love it because it has to be that collective shift. You just tell one kid, put away your phone, then they're just missing out they feel the sense of loss first thing that popped into my head when i read this was um the tiktok ban the potential tiktok ban yes and obviously
Starting point is 00:25:35 there's a lot of people like do not take my you know that we've talked about this right they're calling in congress but if it was done if it just I wonder, like, it's tough to even look at polls for this right now. Because I think like, it's, I think, I think the asking the question now versus, okay, you wake up tomorrow and there's no TikTok. How does everyone feel? Or there's no Twitter, there's no Facebook. Well, that was, I thought a really, like pretty funny finding from this study because they conducted it while all this TikTok ban stuff was happening. If you ask people on Instagram, would we be better off if everyone was off of Instagram? Everybody said, yes, absolutely. And with TikTok, suddenly there's a big drop among people who use TikTok.
Starting point is 00:26:13 And I think that's because it's been made very real for them. It's very salient. It's not just a hypothetical. And also that means that you picture it now. It's like Uncle Joe is taking my TikTok away rather than like we as a society are agreeing to give it up, which, you know, I think is relevant for whether a ban happens or how it happens. But it did make me think, it made me feel slightly differently about the TikTok ban,
Starting point is 00:26:34 even though I still don't think it's a good idea. It did make me think like, probably if everyone felt bought into that instead of it being forced on them, I think we would like it a lot more. Right, and if everyone took the leap, right? And if it's like a- Yeah.
Starting point is 00:26:45 Right. It also shows like, this is what governments are for actually, because sometimes... Not to be all nanny state, but it's like, yeah, sometimes you don't know what is good for yourself. Well, that is the other thing with the example of schools, is like the schools that have tried this, is like, just take away everybody's phones. Or just say, don't have a phone for eight hours during
Starting point is 00:27:01 the day. Like, does the nanny state, does the like effectively TikTok ban on a micro scale, not only does it work, but everybody is happier. Right. We all, we kind of want that collective action. Did everyone want speed limits? Did everyone want to get pulled over when you're 10 miles over the field?
Starting point is 00:27:17 Probably not, but like, yeah, we're pretty happy they're there. Right, yeah. All right, in a second, we're going to answer some questions submitted by offline listeners and the Friends of the Pod Discord community. But first, we got some quick housekeeping.
Starting point is 00:27:29 Love to keep the house. All right, if you're here in LA, Pod Save America will be live at the Festival of Books on April 21st. Hysteria's very own Aaron Ryan, Tommy, Dan, and me, we're going to have a great show. Grab your tickets at cricket.com slash events. Also, it's all hands on deck this election year.
Starting point is 00:27:44 We're even putting kids to work. Jesus. In a fun way cricket.com slash events. Also, it's all hands on deck this election year. We're even putting kids to work. Jesus. In a fun way. In a fun way. With merch. I love it when they trick you into committing a felony on Mike. Pick up a brand new I Can't Vote But You Can onesie and toddler tees for all the kids
Starting point is 00:27:59 in your life. Both my kids have one. Okay. Yeah. We have the t-shirt for Charlie. We have the onesie for Teddy. I'm going to try to squeeze into it for swimsuit season. You should. Get my beach bod.
Starting point is 00:28:12 There's no better reminder of what's at stake than a baby who might not be able to vote when they turn 18 because our country elected a lunatic obsessed with ketchup and fascism. That's...
Starting point is 00:28:20 Wow. That's dark. Again. Didn't know where that sentence was ending. Ketchup. But there it is Shop all Cricket Kids merch by heading to
Starting point is 00:28:27 cricket.com slash store to shop And finally Max Right So we wanted to remind people to check out my new series with Aaron Ryan
Starting point is 00:28:36 How We Got Here where every week we explore a big question behind the week's headlines tell a story that answers that question Our question this week How did student loans
Starting point is 00:28:44 go from something that most people agreed as we once did that they were a big net positive for students in the country into a crisis dragging down the whole u.s economy check out new episodes of how we got here every saturday john on the what a day podcast feed and as you all heard uh this one is also brought to you by ChachiPT. That's right. All right, we'll be back with questions. So, uh, this week, uh, our producer, Emma, asked the friend of the pod Discord for some questions. We got a lot of great responses, but of course, we only only have so much time so we're going to just answer a couple um sudoni asked will you ever consider doing another offline challenge or are there any personal goals that y'all are trying to keep in order to not get to that we're always at that
Starting point is 00:29:40 i'm going to live the rest of my life at that point of needing an offline challenge yes that is me too not Not even joking. No, I'm back at four hours a day. I was just checking before we did this. Yeah. I'm at like two and a half. But I don't have kids. It's election year too.
Starting point is 00:29:54 It's just there's too much news. What app are you on? I'm not on Twitter. It's just my drug of choice. I know. I've been spending more time on Instagram, which I don't feel good about. It's not healthy time, but I feel like that's where I'm going to de-stress rather than going to Twitter to stress scroll.
Starting point is 00:30:08 Can I just say, Instagram is crap. It's good. Well, this is why I'm seeing so much AI garbage. It was just, well, I liked Instagram when it was like- Your friends. My friends and family. And now it's just like, there's just feeding me so much shit I don't want.
Starting point is 00:30:22 This is the problem with every social network is it becomes successful enough that everybody wants to use it because it's actually doing something useful in helping you connect with people, then they have to get that fucking infinite growth, so they start shoving you in front of content, and they start shoving content in front of you that you don't want, don't care about, but to keep you scrolling. Social media,
Starting point is 00:30:38 it turns out it's bad. It's bad, and I do think we should do another offline challenge, just to like, refresh. Are there any particular challenges? No, it's just that I, I mean, the good thing about the first one that we did is even though I'm back to four hours a day, like in my mind, when I'm feeling frustrated at home and I'm like trying to parent and trying to do this, I'm like, wait a minute, am I just looking at my phone?
Starting point is 00:31:02 Am I trying, am I distracted by a couple different things? Maybe I should put it down. And sometimes it works. I stop myself now all the time when I'm pulling my phone out of my pocket to look at it. Now I'm aware like, oh, I'm looking at my phone because I just had a thought that stressed me out. Not because I want to look at my phone. And that is really helpful. I really try when I'm home to like put it in one place. Where do you put it? In my office at home. Okay. The challenge now is we've implemented with Charlie, like, okay,
Starting point is 00:31:28 we're going to go to bed now. Well, I want to watch another show. I want to play five more minutes. And then we do a timer, five more minutes. So I have a timer on the phone. So I need the timer. Oh, yeah.
Starting point is 00:31:37 I carry around the phone now as a timer to help parent my child. Unfortunately, there's no other way to measure time. Someone get me a stopwatch. Did they really? No, I said, someone Someone get me a stopwatch. Did they really?
Starting point is 00:31:45 No, I said, someone please get me a stopwatch. You heard them, listeners. I just realized, man, that would probably work. Yeah, I'll get a stopwatch. It turns out
Starting point is 00:31:54 the problem of telling time had actually been solved before the Apple Corporation came along. It is. I just need, I do need to go back to the, and I got one of those phones,
Starting point is 00:32:03 the light phones. Yeah, D.V. used it. My brother and sister-in-law got me one for Christmas and I just have not opened the box yeah Julia got one for Christmas and it has been sitting like in the box
Starting point is 00:32:15 in her desk ever since I should just do that I should open it the amount of time that she and I have spent saying you should open the light phone
Starting point is 00:32:22 you should try using yeah you should use the light phone we should switch over to the light phone it's so hard that's what addiction is we're all addicted well it's a big step and like i so i feel like one of the things we learned last time is that permanent changes are really really hard like i did stay on black and white for like a few months and that did it a ton for me but i'm fucking back i'm addicted to the color i'm a junkie for color um and I like Emma was showing me this new thing iPhone assistive access that some people are using as it's like
Starting point is 00:32:51 kind of turns your iPhone into a light phone but it sucks it's not really designed for that it's not very good at it I think what I'm actually gonna do is I'm taking it like a week off with Julia in a couple of weeks I think I'm gonna leave my phone and just use hers I think we're gonna go one phone on the trip wow yeah I think because you don't need are you gonna do it so that like if we use the phone we both have to be there to use the phone because otherwise like julie's just gonna be like using the phone like when she's by herself she'll be like i'm gonna go take a walk with the phone good luck well you know what if she does then i am off the phone that's good so that's so it'd be just the idea like i mean we'll see what happens you know but relationships are all about keeping score that's true that's true and i'm gonna
Starting point is 00:33:27 i'm gonna cream her um but anyway yeah we should do it we should do an offline challenge again i'm i'm for that i'm ready uh alphonse asked what are your personal online rabbit holes i mean you all know mine it's twitter Twitter. Specifically, these days, it's election nerd Twitter. Oh. So, like, I have a list of, like,
Starting point is 00:33:49 election, you know, because my, like, of course, I'm also obsessed with the polls. Everyone knows that. But I'm not necessarily obsessed with,
Starting point is 00:33:58 like, the horse race polls anymore. Right. But I'm big on, like, my nerdiness is, like, demographic change
Starting point is 00:34:03 and the electorate. I've been like that since college. You know, that's what my thesis was about. So, big on like, my nerdiness is like demographic change and the electorate. I've been like that since college. That's what my thesis was about. So I go deep on that if I'm in a rabbit hole, which is very lame. YouTube videos. Who's your preferred Nate? My preferred Nate is Nate Cohn. Yeah.
Starting point is 00:34:17 For sure. All the heads now. Nate Silver is an example of someone who has become the caricature that people depicted of him. He needs to log off. At first I was like, maybe it's unfair. Now I'm like, no, no, now the caricature that people depicted of him. He needs to log off. At first I was like, maybe it's unfair. Now I'm like, no, no,
Starting point is 00:34:27 now you've become that person. All right. YouTube videos of people assembling children's toys. What's the best one? What makes for a really excellent toy?
Starting point is 00:34:38 We just watch Hot Wheels all the time. Charlie's so into Hot Wheels. Although he's moving from Hot Wheels, thankfully, to Legos because Legos are a little bit more creative. It's probably more fun for Hot Wheels. Although he's moving from Hot Wheels, thankfully, to Legos, because Legos are a little bit
Starting point is 00:34:45 more creative. Sure. It's probably more fun for you. I spent a lot of my last weekend just building Legos. Oh, yeah. I saw the one that you posted on Instagram. It was very practical. They're Octonauts Legos. Octonauts is a show on Netflix. It's a bunch of sea creatures. They talk about conservation. It's like some educational
Starting point is 00:35:01 benefit to it. But he's really into octonauts and the toys so we're building that and then my other rabbit hole is um piano tutorials for pop songs that's what i do on uh tiktok if the extent that i go on oh okay that's not as much yeah i would when i was addicted to tiktok i did have some it was like cooking tiktoks and stuff that was i enjoyed it and i i told myself that it useful. I was never doing any of these fucking recipes. I would bookmark them. I never made any one of them, but I was like,
Starting point is 00:35:28 oh yeah, I'm learning a lot by watching the guy do another thing with cool Taiwanese noodles, whatever. Yeah, that's cool. That's a good way to pass the time. My worst rabbit hole that I don't do so much anymore, but I wanted to bring it up because I spent so much time on it, is somebody did a bad tweet. I better look at their profile
Starting point is 00:35:45 and read all of their bad tweets and get mad about a tweet they did three months ago that was bad. Don't we all do that? That's what I mean. I think that's a like, you know, it's the main character rabbit holing. That's a good one.
Starting point is 00:35:59 I mean, that made me think of, we have this, as hopefully you all know, we have this show Terminally Online for Friends of the Pod subscribers. and you're supposed to find something that makes you that proves that you're it's the opposite of this show i do both shows uh it's important to keep balance in the force and i had like last night because i did it today we did one today and i was up for like an hour and a half just like just deep dive scrolling through Twitter. And I was like, it's not a political thing that I'm looking for.
Starting point is 00:36:29 So now I'm like in all different parts of Twitter. And you get into that where you're like, you look for bad tweets and then crazy people. And it just takes you down a rabbit hole. You're like, oh, it's also I feel like the algorithm is getting so effective at identifying those people. Now, when I do this, I get like two scrolls deep and I'm like, oh, this person is actually like struggling with something that is coming out
Starting point is 00:36:48 through bad tweets and it doesn't feel good that they're getting dunked on. Yes, I have come to that conclusion. I have started looking for like good rabbit holes too because I've just accepted I'm going to do
Starting point is 00:36:57 social media rabbit holes. Yeah. I like started doing a lot of like letterbox dive rabbit holes. Like watch a movie over the weekend that I really like. So let me like read everything my friend said about it. And then like click on best director and all of his other movies. And what did my friend say about all those movies.
Starting point is 00:37:13 And it's great because there's no outrage and there's a bottom. I like doing that too actually. It's nice. And I don't get to watch good movies, television shows as much as I'd like to. But when I do and I really like one, I start doing the like, read the reviews, read what people are saying. Like that is fun. It is very, it feels low stakes
Starting point is 00:37:30 compared to everything else we do. It feels low stakes, but you'd get that feeling of social because the people you know, whether it's people you follow or your friends reviews. Yeah. It's nice.
Starting point is 00:37:37 That's nice. Okay. David Trowbridge asks, I live in a blood red part of Michigan, but because of work and life, I've built up a small amount of trust slash social capital, even though everyone around me is conservative. Good for you.
Starting point is 00:37:50 How can I effectively use social media? Don't. David. Should I share political articles with my thoughts? Should I respond in the comments? What can I do that will actually help move votes to Biden in my critical home state? David, David, David.
Starting point is 00:38:06 You know the answer to this, David. And we're not saying this is like high and mighty here because I'm on there posting too. I'm on there posting. I do think, yeah, I think the best thing to do here is go have some real conversations with people. Get off social media. You don't even have to do it in person. You can use technology, text conversations, email conversations again, this is what Eitan Hirsch's book is all about
Starting point is 00:38:29 we talked about last week politics is for power, go check it out but I would say if you're going to post if you're going to post about politics post every piece and comment as if you are talking to someone or at least trying to reach someone
Starting point is 00:38:44 who might vote for Donald Trump or be okay if Trump wins and they didn't vote. And like really try seeing things from their perspective and showing some empathy because if what you're trying to do is move votes, posting the piece that simply reaffirms your biases and priors or to show how right you are
Starting point is 00:39:05 or the pieces that are likely to get the most RTs and likes and whatever the fuck they're called these days, that's not going to move any votes in a red state. That's not going to move any voters. That's going to get a lot of liberals who are already voting for Biden to be like, yeah, you're fucking right. But if you're going to post, do it in a way where, and I still don't think it's going to work that much.
Starting point is 00:39:24 I do think the in-person conversations are going to work or the conversations one-on-one. But that's how I would do it. polarized subject like Israel-Palestine and I want to post it in a way that will reach people who I think are polarized in the opposite direction of like I want to persuade people who I think will disagree with the premise of this article basically yeah and where I came around is that after spending like years trying to like craft the right tweet that would pull them in and like sound like enticing but not too scary is that it just can't be done that I just like you can write that tweet you can do it and there are like ways to too scary is that it just can't be done. That I just like, you can write that tweet, you can do it. And there are like ways to frame things, but that the just the structure of social media is so polarizing. And there's like this, there's all of these studies that show that if you read the same sentence, literally the same sentence, however, it's written on a social media
Starting point is 00:40:19 platform, or in some other context, let's say an email, like say news article, you see it on the social media platform, you immediately read it through an ultra polarized lens. And you either say this affirms my identity, I agree with it, or this, you know, cuts against my identity and I disagree with it. And that does not apply to nearly the same extent on any other platform. So just, if you're going to use social media, use it to organize a bowling league and that's it. And then go meet people, seriously, and then go meet people in person. Well, this, this, uh, is a great segue into the final question, uh, from Polly Seinert who asked, will John ever learn to not get in Twitter fights? The answer is absolutely not. Absolutely not. I love this came from Polly Seinert because I like
Starting point is 00:40:59 to imagine that this is part of like a political science paper. Like this is being rigorously studied. Is it possible for Jon Favreau to not get into Twitter fights? Okay. So Jon, let's hear your most cancelable take on the eclipse. Let's hear it. The eclipse needs to learn the value of organizing.
Starting point is 00:41:14 I know I've gotten in a few Twitter fights that have gotten a little out of hand recently. No. No. But here's what happened. I really don't... I have improved a lot since... It's true.
Starting point is 00:41:26 2013, 2014, fresh out of the White House, finally got my own Twitter account, can say what I want. I went nuts, right? I went nuts. I have matured since then. But what gets me now, from the last question that we were just talking about,
Starting point is 00:41:39 when I did that long thread about Ezra's piece or thought that Joe Biden should step down. Like my thinking was, I don't necessarily agree with this. A lot of people are freaked out and might agree with Ezra. So I'm going to write the thread in such a way that empathizes
Starting point is 00:41:56 with Ezra's argument and takes it at face value, but then tries to tell people what the downsides are. That did not work. People misunderstood what I was trying to say and they didn't get it. And then that got me mad
Starting point is 00:42:08 because it was social media. And so then I jumped in and was like, well, you know, I'm like, that's why it's bad. But it did get you sending in like a voice memo for Azure podcast, right? Well, so like to like bring it back to your point about like social media for all of the bad,
Starting point is 00:42:22 like all forms of technology also has good and bad. i just read this time story like a week ago that i'm sure you saw this that it was talking about biden and gaza and i quoted you and ben and like from tweets that you had written that i thought were like i totally agreed with and were really good and i was like i'm so glad this is in the fucking newspaper of record and that would not have happened without twitter so there is some value to like throwing your sharp elbow takes out there if you were a former presidential speechwriter that was a good lesson and because this is also like when you don't use twitter as much anymore and you don't get in the twitter fights because i'm like we have podcasts right right ben and tommy have made those criticisms of joe biden on almost every episode of pod save the world not only just on the
Starting point is 00:43:04 episode but if you don't listen to the episode we clip deep parts of the episode and then we put those on Twitter. But then like, I mean, I'm not the foreign policy expert, but I'm like, I feel very passionately about it now. I've educated myself on it. Listen to Pod Save the World, read a lot about it. And I'm like, you know what? I feel strongly that I'm going to say something. Suddenly it's on like Playbook and the New York Times. And I'm like, oh, okay, well, there we go. Now everyone's like, Obamabook and the New York Times. And I'm like, oh, okay, well, there we go. Now everyone's like, Obama people are just jealous of Joe Biden. I was like, what the fuck?
Starting point is 00:43:29 I mean, sometimes there is value to having a platform that for all of the 99% of the ways that it's terrible, and I think 99% is an accurate representation of how horrible Twitter is, it is still useful as a text-based platform where for better or worse, everyone can issue press releases at a moment's notice, and sometimes that's good. Well, and we've talked about this before, but it is still like an assignment desk for reporters. That's true. For a lot of reporters, especially political reporters. And I've sort of forgotten that over the years a little bit because, you know, it's sort of fallen apart and there's a million other means of communication and stuff like that.
Starting point is 00:44:04 But still, you put something on Twitter, that's what reporters are looking for. There's such an irony that politics reporters who are as phone-addicted as ever, like, yes, you can see the one-to-one pipeline from Twitter to the articles, but tech reporters, the early adopters, they're all off their phones.
Starting point is 00:44:22 They all have grayscale on their phones. You don't see tweets in their stories so much anymore, do you? That's very interesting. Anyway, everyone unplug. That's right. That's our lesson for today. That's what I'm doing. All right, Max, that's our show for today.
Starting point is 00:44:34 And we'll be back next week. See you then. Offline is a Crooked Media production. It's written and hosted by me, Jon Favreau, along with Max Fisher. It's produced by Austin Fisher. Emma Illick-Frank is our associate producer. Mixed and edited by Jordan Cantor. Audio support from Kyle Seglin and Charlotte Landis.
Starting point is 00:45:02 Jordan Katz and Kenny Siegel take care of our music. Thanks to Ari Schwartz, Madeline Herringer, and Reed Cherlin for production support. Thank you. The spread of misinformation has fueled our cultural divide and increased our collective anxiety about the future. Tackling misinformation isn't a simple task, but it is important. And that's why I'm so excited to tell you about Conspiratuality, a podcast that's dismantling new age cults, wellness grifters, and conspiracy mad yogis. On the show, a journalist, a cult researcher, and a philosophical skeptic dive deep into current events like Project 2025, the Heritage Foundation's dystopian vision of the future, the rightward drift of former leftists,
Starting point is 00:46:04 and RFK Jr.'s conspiracy theory-fueled presidential run. They crowdsource, research, analyze, and dream up answers to the problem with proven science as their ultimate guiding light. Thank you.
