Pivot - SCOTUS and Social Media, Reddit's IPO, and Guest Barbara McQuade

Episode Date: February 27, 2024

Kara and Scott discuss the Supreme Court cases that could change social media as we know it, and the recent disturbing investigations into parent-run accounts on Instagram. Also, Reddit is finally going public in the first major tech IPO of the year. Will it be successful? Plus, Kara's memoir "Burn Book" is hitting the shelves, featuring fascinating and entertaining stories from her reporting days in Silicon Valley. Finally, our Friend of Pivot is former U.S. Attorney Barbara McQuade, who's written a new book, "Attack from Within: How Disinformation is Sabotaging America." Barbara explains how she thinks disinformation can be defeated. Follow Barbara at @BarbMcQuade. Follow us on Instagram and Threads at @pivotpodcastofficial. Follow us on TikTok at @pivotpodcast. Send us your questions by calling us at 855-51-PIVOT, or at nymag.com/pivot. Learn more about your ad choices. Visit podcastchoices.com/adchoices

Transcript
Starting point is 00:00:00 Support for Pivot comes from Virgin Atlantic. Too many of us are so focused on getting to our destination that we forget to embrace the journey. Well, when you fly Virgin Atlantic, that memorable trip begins right from the moment you check in. On board, you'll find everything you need to relax, recharge, or carry on working: lie-flat private suites, fast Wi-Fi, hours of entertainment, delicious dining, and warm, welcoming service that's designed around you. Check out virginatlantic.com for your next trip to London. Support also comes from Indeed, with data and a matching engine that helps you find quality candidates fast. Listeners of this show can get a $75 sponsored job credit to get your jobs more visibility at Indeed.com slash podcast.
Starting point is 00:01:00 Just go to Indeed.com slash podcast right now and say you heard about Indeed on this podcast. Indeed.com slash podcast. Terms and conditions apply. Need to hire? You need Indeed. Hi, everyone. This is Pivot from New York Magazine and the Vox Media Podcast Network. I'm Kara Swisher, and it's Kara Swisher Day here in the United States of America.
Starting point is 00:01:27 Hi. Hi, Scott. Isn't it Kara Swisher Day every day? It is. Why is it your day? You seem happy. I like that. My book comes out today, and I'm excited.
Starting point is 00:01:38 I've started my book tour, and it's really fun already. I already hugged Kate Winslet, and hung out with I'm like, and hung out with Chance the Rapper. So now I'm very happy. All right. So you're on your book tour. You still do those. I don't do book tours. Yes, I do.
Starting point is 00:01:52 I have a huge book tour. I'm going to like 27 cities. And in each city, I'm being interviewed. You're going to 27 cities? Something like that. Yes. So I'm, but in each city, I'm being interviewed by, I made it into an event. They're all sold out. And like in Austin, it's Mark Cuban. In LA, it's Bob Iger. In here in DC,
Starting point is 00:02:11 in New York tonight, it's Don Lemon at the 92nd Street Y. So, I tried to make it interesting. In Boston, it's Maura Healy. And so, it's really, it's a, I made it into an event. You know, I'm good at events. So, I made it into something else. I can help you with yours. So yes, right. It's racing up the charts at Amazon, which is great, which is good. That's great. Yeah. So I'm very excited.
Starting point is 00:02:32 And thank you for defending me from people who say I got all my scoops yesterday on the social media. I appreciate it. I got one of the reviews was like, it shouldn't be a memoir. And it is a memoir. So the criticism here is a memoir, and it shouldn't be a memoir, but it is a memoir. Thank you for saying that. or essentially insinuated that. And I was 10 years into my career before I met my ex-wife. So that's kind of confusing to me, but it was very sexist. But otherwise, but you defended me and so did Walt Mossberg and others. And I really appreciated that on that issue, at least.
Starting point is 00:03:17 Well, good. Anyway, we've got a lot to get to today, including the Supreme Court cases that could bring major changes to social media. It's a big case and Reddit is officially going public. And we'll talk about whether it's going to be a successful IPO. Plus, our friend of Pivot is former U.S. attorney Barbara McQuaid, who has a new book, Attack from Within, How Disinformation is Sabotaging America. But first, we're going to have a little excerpt from Chapter 4 of Burn Book. All right, let's listen to this excerpt. I've always hated the phrase, speak truth to to power because it assumes all power is bad. It should really be speak truth to power when the power is false or damaging or even just plain bizarre. In the bizarre camp was when I found myself staring at an ice sculpture of a woman whose breast was oozing white Russians, a Kahlua and cream concoction.
Starting point is 00:04:06 I was a guest at a baby shower party for Google founder Sergey Brin and his wife, Anne Wojcicki, who were expecting their first child in 2008. Naturally, they decided to celebrate with a huge party in the factory district of San Francisco. Before you could lift a glass to the icy nipple to get a sip, guests had to brave a jungle of dangling baby photos of Sergei and Anne at the door. The club's entrance was manned by the kind of prenaturally ebullient and hyper-organized women
Starting point is 00:04:33 that always seemed to surround the rich of Silicon Valley. Would you like a diaper or a onesie? asked a young woman with amazingly swinging blonde hair and a very sincere smile, as if the question were not even slightly fucked up. There you go. What do you think? I don't understand. You say that like it's a bad thing. I want to be invited to those parties. I did. It was so weird. Everybody was dressed up as babies, and it had a weird sexual implication, and yet they were just juvenile. So it was a very strange party. And it starts to talk about sort of the, you know how Google was aggressively quirky,
Starting point is 00:05:10 you recall, right, in the early days? I was talking about the need for juvenileization that these people had in extreme, in extremists, and how everything had to be forced fun. And so that was sort of talking about that. I don't know how many of those parties you went to, but I had to go to an indescribable amount of them. And they were always as if we were 12 again, slipping slides or, you know, pogo sticks or bouncy houses and things like that. So I was trying to make a point on that issue. Yeah, I was never part of that community or invited to it.
Starting point is 00:05:41 What were you doing? You had Red Envelope. I was working around the clock. Well, New York wasn't like that. New York wasn't like that. It didn't have that same kind of crazy aspect to it. Well, I was in San Francisco in 92 to 2000, but I just wasn't that dialed in. I was living in San Francisco.
Starting point is 00:05:57 I was married, working all the time, had a dog. But I didn't go to that stuff. I wasn't part of the, I wasn't on the A list. I was sort of B minus list. I wasn't, I just wasn't privy to that stuff. Well, it was crazy. It was time they would hire, as they got money, they started to hire really, you know, like Elton John or I went to, I was invited to a party and I didn't want to go. And they said, oh, you have to come. It's my kid's 16th birthday party. And I said, I really don't want to come now. And they finally convinced me.
Starting point is 00:06:29 And I walked in and it was Maroon 5. They had hired Maroon 5. It was crazy. And I was like, they said, what do you think? I said, I think your kid's going to need therapy later. That's for sure. Yeah, I think your kid's going to be fucked up and a little bit entitled. Yeah, it started very adorable.
Starting point is 00:06:44 And then it got kind of weird. And, you know, rich people's like, I don't know, it was weird. It was it was a weird time. And so I wanted to sort of depict that in the in the book. And there's lots of stuff like that. For me, I don't know how much of it is the industry, because I'm curious what you think. I am sort of I have access to what you would call, quote unquote, the tech elite. I mean, the difference or one of the big differences is I don't I purposely won't meet with any of these people.
Starting point is 00:07:09 I'm not a journalist and I don't especially enjoy them. And there's just other people I'd rather hang out with. And it's sort of your job to meet and listen to them and, you know, establish relationships such you can get good news or, you know, good scoops and or just be on the right side of people. Yeah, that they feel comfortable with you. Yes, there's a seduction element. Yeah, you have a dialogue. Well, there's a seduction element to journaling. You want to be charming, same time a little scary. You want to get them to talk. You want to get them to talk. Yeah, for sure. But I've always, on the tech side, I think San Francisco, and I never knew if it was
Starting point is 00:07:43 regional or industry, and I think you would argue it's industry. I've lived in LA think San Francisco, and I never knew if it was regional or industry, and I think you would argue it's industry. I've lived in L.A., San Francisco, Miami, New York, and London, and the only place I wouldn't live again is San Francisco. And I think San Francisco is beautiful. I think it's the most beautiful city in the Union. It doesn't fit me personally because it's all about the day and the outdoors, and I'm all about the night and the indoors. It's very hard to get a good drink or find a good bar after 10 p.m. It doesn't fit me personally because it's all about the day and the outdoors, and I'm all about the night and the indoors. It's very hard to get a good drink or find a good bar after 10 p.m. It just never really fit me. But the thing that turned me off the most was the whole tech culture and the tech bro and the VCs.
Starting point is 00:08:17 And I thought during the day, these were some of the most rapacious, aggressive, I don't want to say unethical, but I just had so much full-body contact, watch your back, don't trust anybody experiences, mostly with the venture capital community. Whereas in New York, they kind of stabbed you in the front. And then what I found so duplicitous about this, during the day, let's wash out founders and do rights offerings. And at night, we're going to throw a party to save the whales. It just seems so disingenuous. Whereas New York is like, we're here to make a shit ton of money. We'll give you a straight answer. And now let's go drink a lot. I just thought these are my people. Yeah. Well, they're more adult. They just love to live in childhood and they have money. And that was, you saw that everywhere you went, there were toys.
Starting point is 00:09:05 That was, you know, I talk, there's a scene with at Excite. You remember Excite? And they had a fake garage door as their door. And I was like, well, okay, you started in a garage, I think. Probably you didn't. I mean, Google definitely did because I was at that garage. But, you know, they had a fake garage at their headquarters. They built it because it was so quaint, which really, for some reason, irked me terribly.
Starting point is 00:09:27 And then they had this slide where, between the floors, and they were always like, get on the slide, Kara. And I'm like, I'm not getting on your fucking slide. Like, I was such a bummer to them. But I was always struck by not just the juvenileness of the toys in the place, the colors, the primary colors everywhere. The clothing was always really, it felt like everybody was dressed in Gap for Kids, right? That kind of thing. And then what was really, and the food was soft. And one thing I found in an
Starting point is 00:09:56 old story that I wrote right at the beginning, two things. One is about their titles. Their titles were always silly, like Chief Yahoo or Chief Experience Officer. You know what I mean? And they were always the CEO. Like the only person who had an honest card, which I found the other day, was Mark Zuckerberg's, which was I'm the CEO bitch, which is also juvenile to do that. But at least you got what he was doing. And then I wrote a whole story about their food choices because they always like to pretend they were regular people, even though they were starting to become very wealthy. And so they're like, we only like burritos. We don't go to fancy restaurants. And, you know, you'd be like, why do you need to prove to me like, you know, internet moguls are just like us. It was a really interesting. And except the Apple people never did this, right? There were certain companies where this did not happen. Microsoft was one of them. Apple, they were older. So anyway, it was an interesting time. Yeah. And I found, and again, I might've, I went to Berkeley and
Starting point is 00:10:54 I absolutely loved it. And some of it was a life thing. You know, I just wanted to change my life. You're a New Yorker. You're a New Yorker. Yeah. But also this notion of changing the world. I've never seen I've never met a group of people more obsessed with money who evaluate other people based strictly on their estimate of how far in the options their money are and what employee they are and what's their AUM at their VC fund. I literally think there is the, in New York, you have to be something. You have to be super intelligent, rich, or hot. In the Valley, what I found, it's just all about how much money you have from tech. Stock. Full stop. It's tech. Yeah, you're right.
Starting point is 00:11:34 Well, that's why the first line of the book is, I think, if you don't have to read any more than the first line, which is, so it was capitalism after all. You know what I mean? It was so irritating to be told they were changing the world, and then they had cashmere hoodies in full control of their companies, right? That nobody could have a choice. Anyway, it's interesting. I hope you like it. I hope you'll like it. I hope you'll read it. Yeah, I'm going to read it. I'm excited. I am actually excited to read it. You're actually in the book, so maybe you can. There's no index, so you can't find you. As I should be. They call it The great reprisal rejuvenation.
Starting point is 00:12:06 No, it's about our meeting. There's a lot about you. You have more. You have more words than most people. The man, the man who saved my career. Yes, that's what it is. No, it's a very funny. It's a very funny section about how we met.
Starting point is 00:12:17 And and then I put you in acknowledgement. So you're just going to have to read. That's the way it is. You're in there. You're in there. I'm excited. Anyway, it's good. So let's get to our first big story because the beat goes on. That's the way it is. You're in there. Thanks for doing that. I'm excited. Anyway, it's good. So let's get to our first big story because the beat goes on.
Starting point is 00:12:28 The beat goes on and on. And this case has early beginnings for the internet. And we'll move to that. So our first big story. The Supreme Court is hearing two First Amendment cases this week, important ones that could have major ramifications for the future of social media and what rights platforms have to moderate content, the continuing saga of content moderation. The cases deal with Florida and Texas laws from 2021 that were passed to combat what supporters considered censorship of conservative viewpoints on social media platforms. Texas and Florida officials argue that laws regulate business behavior, not speech, which is a very fine line.
Starting point is 00:13:10 But the lobbying groups representing tech companies say the laws infringe on First Amendment rights. Let's talk about these cases in specific. Let me explain to people. Florida law prohibits social media sites from, quote, willfully deplatforming a candidate. The Texas law prohibits censorship based on a user's viewpoint. Supreme Court is taking the cases on after federal circuit judges came to the opposite conclusions on the constitutionality of the laws. A decision by the court is expected by late June,
Starting point is 00:13:32 just months before the presidential election. I can't believe this, but I'm on the, although I'm not on anybody's side, but I have to say I think I'm on the tech company's side. This is an infringement of the First Amendment by states deciding content moderation issues is a real problematic. People can censor you if you're a private company. I'm sorry. Stop with this public square crap. It's not a public square. What do you think? This makes no sense. So, CNBC, I hauled my ass down to the New York Stock Exchange to talk about, you know, big tech over and over and over every Wednesday morning for like two years on CNBC.
Starting point is 00:14:13 And one day they decided not to have me back. So am I being censored? Yes, Scott, you're being censored. I talk about politics. Yep, yep. I talk about politics. This makes no sense. They're allowed to do this.
Starting point is 00:14:24 Yep, I agree. Do we have to have Ted Cruz and AOC on our show every episode? I mean, this just makes no fucking sense. They're private companies. Yeah. They're trying to do that as a business thing. It's just they're hurt that they get thrown off because they act badly. You know what I mean? And by the way, Elon Musk runs one of the bigger ones. So you've got your man. So everybody gets their man, essentially. You know, this was interesting because drilling in on whether judgments, it's going on right now, these arguments, drilling in on whether judgments about what content to host are,
Starting point is 00:14:54 quote, expressive and thus potentially eligible for First Amendment protection. Justice Elena Kagan raised the example of X, formerly Twitter, and how its rules changed after Elon Musk bought it in 2022. Twitter users one day woke up and found themselves to be X users. The rules had changed and their feeds changed, and all of a sudden they were getting a different online newspaper, so to speak, every morning. And the net choice attorney who's representing the tech company says it discriminates on the basis of content, speaker, and viewpoint. It does all of this in the name of promoting free speech, but loses sight of the principle of the First Amendment.
Starting point is 00:15:27 Again, the only thing is, then you leave it to these companies who are not regulated to do anything. So what are the regulations we can put on these companies so they behave properly? I think you can sue them. That would be the best thing, is to be able to sue them if they cause damage, and not through this nonsense of this, of these, you know, very conservative states that are trying to, with, if this passes, it is really bad. I just, I can't even. So, so when is someone a political figure?
Starting point is 00:15:56 I understand when there's, sometimes they have government funded campaigns in certain countries and they have to give them a certain amount of time, each of them a certain amount of, you know, because they're worried about political candidates being squashed or censored. I get that. But you're going to tell these, there's a lot of different platforms. And there's, I mean, granted, if you made the argument at some point that meta is a utility, I could see then maybe passing some laws that during elections, people have a certain amount of fair use or whatever it is. But how is this not get challenged? Does this mean that we talk about politics all the time?
Starting point is 00:16:33 Do we can we demand that we get a certain amount of airtime on truth social? I don't I don't I don't see how this is enforceable. I don't think it makes any sense. It's so dangerous. It's a violation. The law they're holding up. This is a direct violation of. I don't see how this is enforceable. I don't think it makes any sense. It's so dangerous. It's a violation. The law they're holding up, this is a direct violation of. I don't get it. They're just, you know, this is, let me read this from the Washington Post.
Starting point is 00:16:51 This is one of the several high-profile tech cases on the Supreme Court's docket as the court increasingly weighs how centuries of free speech precedent apply to the digital sphere. Next month, the high court will hear arguments in the case that weighs whether the first amendment precludes, which is the opposite, government officials from pressuring tech companies to remove content. So these guys are on the other side of that because they were talking about the Biden administration. Those are the same people. These people are so hypocritical. I just want to like scream. But they were the idea that you could warn, flag these companies for dangerous content. I think they can flag it. You know, that's, I don't find that to be a problem at all. I think it's the ability to take it down. And websites have a viewpoint, depending on who owns them. Media companies.
Starting point is 00:17:36 Yeah, they're media companies. And then Justice Amy Coney Barrett questions whether tech companies' content moderation practices are similar to editorial discretion of newspapers, citing hypothetical example of TikTok's algorithm, boosting pro-Palestinian posts over pro-Israeli posts. If you have an algorithm do it, is that not speech? So, it'll be interesting. It's such a partisan issue, and whatever side you're on depends on what side you're on, and you can shift your allegiances. But there's a lot going to happen here. The only veracity to their argument is that if they are exonerated from the legal scrutiny that traditional media companies are exonerated, if we started sexualizing images of children on this show, then I think a lot of organizations would have a legal, a rightful, credible legal case against us.
Starting point is 00:18:29 But that's not true on big tech platforms because it's Section 230. So I can see how they would say, well, if they're not going to be subject, if they have more free reign, then they have to fall under additional scrutiny or mandates. But this one doesn't make any sense. And we have a model here. Treat them like media companies. Fox is not obligated to bring on Bernie Sanders every day. When it ends up, there's the quote-unquote credible witness providing information on the Biden crime family, quote unquote, is actually a Russian spy. Fox, it's reprehensible. They don't report on it. They're allowed to do that. They don't have to.
Starting point is 00:19:10 Yeah, exactly. And then you can decide whether or not you want to watch it or not. Yeah, exactly. At the same time, when they spread lies about voting machines, they are liable. We have a model for this. It's called a media company. We do. But they run from being a media company. They're going to have to embrace being a media company. But also, let's talk about sexualization. Let's discuss two disturbing investigative stories published in the last few days concerning a number of parent-run accounts on social media. also to online abuse from men, older men. What a creepy frigging story that was.
Starting point is 00:19:45 And the Wall Street Journal, it was followed by a report's Meta safety team warned internally last year, as they have before, that new paid subscription tools on Facebook and Instagram were being misused by parents trying to profit from exploiting their kids. I mean, like any of these tools will be abused and they cannot abrogate their responsibility as they make them.
Starting point is 00:20:04 I don't know if they'll do anything. Meta took down some accounts and acknowledged enforcement errors after the Wall Street Journal flagged some of the exploitative behavior. Parents certainly have a responsibility. Meta spokesman said in a statement to the Times that parents are responsible for these accounts and their content and could delete them at any time. They're not their nannies, parents' nannies, but boy, they have a much, if kids were using, I'm trying to think, something else in such an irresponsible,
Starting point is 00:20:31 if parents were using it in such an irresponsible way, the government would come down hard on all sides, including the parents. And so, these stories were deeply disturbing to me. I don't know about you, if you read them. Well, first off, just, you know, we're critical of them, but shout out to the New York Times. I thought the article was really powerful. It was. The journals was, too. They both were great. And it shows the power of long-form investigative journalism and why it's important that it continue to be economically viable.
Starting point is 00:21:00 I've said this. You've heard me say this. Instagram begins from a place of perversion. The algorithms encourage young women, preteen girls, to sexualize themselves. The algorithms love it. They'll elevate the content. The strangers out there will love it and encourage you. And at your most vulnerable time time in terms of your own self-image, coming into your own sexuality, hormones taking over, you get this weird... I mean, it is so insane that one of the most valuable companies in the world, the epicenter of its growth property, Instagram, is the following. Find minors, encourage them to
Starting point is 00:21:43 sexualize themselves such that their peers and strange men around the world can evaluate them and comment on them. And then the knee-jerk reaction is, well, it's up to the parents. Well, I'm sure there are parents that for money will have their 12-year-old kids engage in a pornographic film, but that doesn't mean the theater running it shouldn't be hammered or taken to court. So yeah, the parents here are strange, desperate, whatever it might be. We know how to age-gay. Well, now here's the thing.
Starting point is 00:22:18 It's parents' accounts. It's not the kids' accounts. Parents just put that content on there. That's one of the problems. These are adults who run these sites, which is sick. But the platform knows what's going on. Okay, the parents know that their kids are engaged in child labor. You and I approving our sons to go to work at the age of 13 doesn't make the employer void of violating a crime.
Starting point is 00:22:49 It all comes down to the same thing. There's no reason anyone under the age of 16 should be on Instagram. And they could figure it out in 48 hours. Or their images. And anything, if it's family photos, one. But anything like this, commercialization, people say, well, there's a healthy industry for modeling for kids. We need them. Fine.
Starting point is 00:23:12 The traditional industry has put in place safeguards. Meta has not. And let me just go. I've said for a long time I thought Sheryl Sandberg was just terrible for women. I've said for a long time, I thought Sheryl Sandberg was just terrible for women. And I thought that it was a total globalist tactic, accuse the other side of what you are guilty of. I felt for her to be marching around the world talking about leaning in, which as far as I could tell, let me summarize the book for women out there. Act more like a man.
Starting point is 00:23:40 That's how I read it, and two, to have figured out a business model that results in algorithms and incentives to sexualize young girls, this company begins from a place of perversion. All right, but let's put the onus on the CEO. Again, I know you like to focus on Cheryl, I get it, but this is Mark. I focus on Mark too. I focus on both of them. All right, I'd like you to mention... Posei. I focus on R-Mark too.
Starting point is 00:24:02 I focus on both of them. All right. I'd like you to mention it. Posei. Posei. Posei. I find this to be so obvious in terms of how to deal with this. It's really, you know, accountability matters a great deal.
Starting point is 00:24:15 And again, my book's all about that. Like, this should be so dead obvious that this should not be like. And the problem is it's expensive. Their businesses look a lot less pretty once the cost to fix this goes in because it's expensive. Their businesses look a lot less pretty once the cost to fix this goes in because it's nearly impossible. You know, whatever, 300 hours every second on these sites, right? Video or photos or whatever. Too bad. Don't build a business like that. If you can't control it, then make it smaller. I don't know what to say. I mean, because their whole thing is we can't, there's too much of it to monitor. Why did you build a business where there's too much of it? Like, again, they have an answer for everything. And in this case, it should be dead. This
Starting point is 00:24:53 iterates. This isn't just child porn. This is, you know, misinformation, anti-Semitism. Just put any word in here for this stuff. But if we can't do it with this stuff, you're fucking kidding me. Like if we can't do, you cannot make your tools. Look, one of the people arguing about storage, like that Apple is exploitative too, because they allow storage, right, on these sites. I think we talked about that. Anything in the world, a cardboard box is full of pornography of child pornography is a problem, but it's not the box's problem, necessarily. These are not boxes. They like to think that they're like cardboard boxes where people keep pornography or a safe
Starting point is 00:25:35 that people keep it in. It's not the safe. That was never intended for this. These people have tools to stop it. It's very different. They have to stop acting like they're a cardboard box. You know what I mean? Like they have no culpability. And that drives me crazy. They make an active decision to program algorithms. Then when it finds content with a 16-year-old girl in a bathing suit, it's very provocative. They make an editorial decision.
Starting point is 00:25:59 This isn't much different than a producer at MSNBC or CNN Fox going, you know what? different than a producer at MSNBC or CNN Fox going, you know what? There's a lot of desperate men out there who like looking at 16-year-old girls who are fucked up in the head and don't realize how wrong it is. And in the anonymity of their home, we'll like it. And the algorithm will sense that. And we as humans have decided to elevate that content. We have decided, we have made an editorial human decision to program the algorithms this way. And the moment content is elevated algorithmically, you're an editor and you should have the same responsibilities and the same accountability as any other media company. And when we're talking about they always throw up their arms and they plead complexity, we're not talking about the realm of the possible. We're talking about they always throw up their arms and they plead complexity, we're not talking about the realm of the possible.
Starting point is 00:26:47 We're talking about the realm of the profitable. Because here's the bottom line. It's expensive to moderate. It is expensive. That's what I said. It is hard. It's not such a good business. It's not such a good business. To have a total free-for-all where you just claim complexity and First Amendment and deploy your hundreds of lobbyists.
Starting point is 00:27:07 And, oh, okay, if images of nooses and pills get sent to girls who are in the midst of suicidal ideation. If we have a ton of content that we run ads against that sexualize young girls. If we have 54 to 1 pro-Hamas videos for every pro-Israel. I mean, and we radicalize young men. Okay, not our fault. But boy, it is an unbelievable business model. It is. It is. It is. We have a decision to make. Economic value and shareholder value is an incredible thing. It's important. They pay taxes. They give a lot of people opportunity. They create ecosystems. They
Starting point is 00:27:44 create economic growth. That counts for a lot. The question we have to wrestle with and come to some sort of decision around is, has it become too costly? Is it no longer worth it? responsible in these situations. Like, let's not let them off the hook. And I won't. I was, it was so, I could not believe parents would do this to kids. But of course, I'm always surprised by that kind of thing every day of the week. So anyway, we do understand parents have a role in here, then they need to be prosecuted also, if they're exploiting their children sexually in an imagery. Now, what's interesting here is there was also a story, just to finish up, that AI will take the place of these AI videos or pictures. There won't be any people involved, they'll just imagine it. And then I don't know if that's worse, because then people will get addicted to this kind of content. But nobody will be in harm's way, but all of us will be in harm's way. So that's an interesting debate. What if it's AI and not people anymore? I don't want to go down that path. Again, they have the ability to tag stuff
Starting point is 00:28:52 that's AI generated, and then they have a responsibility. The easiest thing here, and they know how to do it, but they don't want to do it. Agegate. People under the age of 16. What if there's no age to the AI image? It just looks like a kid. There's so many things coming down the pike. Anyway, that's a topic for another day, but there's so much stuff because AI doesn't have an age. So anyway, we'll see. Big topic, very important to Scott and I, obviously. We care a great deal, but you should read these stories. All right, Scott, let's go on a quick break. We come back. Reddit's IPO is finally happening, and we'll speak with a friend of Pivot, Barbara McQuaid, about how to defeat disinformation.
Starting point is 00:29:41 Fox Creative. This is advertiser content from Zelle. When you picture an online scammer, what do you see? For the longest time, we have these images of somebody sitting crouched over their computer with a hoodie on, just kind of typing away in the middle of the night. And honestly, that's not what it is anymore. That's Ian Mitchell, a banker turned fraud fighter. These days, online scams look more like crime syndicates than individual con artists. And they're making bank.
Starting point is 00:30:10 Last year, scammers made off with more than $10 billion. It's mind-blowing to see the kind of infrastructure that's been built to facilitate scamming at scale. There are hundreds, if not thousands, of scam centers all around the world. These are very savvy business people. These are organized criminal rings. And so once we understand the magnitude of this problem, we can protect people better. One challenge that fraud fighters like Ian face is that scam victims sometimes feel too ashamed to discuss what happened to them. But Ian says one of our best defenses is simple. We need to talk to each other.
Starting point is 00:30:50 We need to have those awkward conversations around what do you do if you have text messages you don't recognize? What do you do if you start getting asked to send information that's more sensitive? Even my own father fell victim to a, thank goodness, a smaller dollar scam, but he fell victim. And we have these conversations all the time. So we are all at risk and we all need to work together to protect each other. Learn more about how to protect yourself at vox.com slash zelle. And when using digital payment platforms, remember to only send money to people you know and trust. Thumbtack presents the ins and outs of caring
Starting point is 00:31:27 for your home. Out. Indecision, overthinking, second-guessing every choice you make. In. Plans and guides that make it easy to get home projects done. Out. Beige. On beige. On beige. In. Knowing what to do, when to do it, and who to hire. Start caring for your home with confidence. Download Thumbtack today. As a Fizz member, you can look forward to free data, big savings on plans,
Starting point is 00:32:03 and having your unused data roll over to the following month. Every month. At Fizz, you always get more for your money. Terms and conditions for our different programs and policies apply. Details at Fizz.ca. Scott, we're back. Reddit has filed to go public finally. This will be the first major tech IPO of the year and the first social media IPO since Pinterest in 2019, if you can believe it. The platform's bankers are looking for a valuation of at least $5 billion, not too big.
Starting point is 00:32:30 It has 73 million daily active users and one of the most visited websites in the US. In an unusual move, Reddit will reportedly offer 75,000 power users the chance to buy shares at the IPO price. I think other companies have done similar things. First off, Reddit has been around since 2005. This will be a test for IPOs in general. Their business is largely advertising. Now they just, probably the most controversial thing is this licensing,
Starting point is 00:32:58 data licensing agreement. They just announced a $60 million deal with Google last week, which I think people inside who use the company are not happy about. So any thoughts? Talk to me about this IPO. Well, first off, everyone's sort of waiting for the market to thaw, the IPO market. And it probably is beginning or the atmospherics are lining up just because of the acceleration in the markets. The markets are reaching new highs and NVIDVIDIA has totally inspired kind of the animal spirits appear to be returning.
Starting point is 00:33:29 And there's several IPOs lined up, and everyone's waiting for one to really pop, whether it's going to be Reddit or Shein, which is kind of the biggest now, the biggest fast fashion company in the world. I like Reddit a lot. I actually think this IPO could be a real winner because, I mean, the first thing I do, just very basic, I type in most traffic sites in America. Number one is Google. Number two is YouTube. Number three? Reddit. Reddit. Yeah, my kids use it like crazy. And number four is Facebook. Number five, Amazon. Number six, Pornhub, number seven, Wikipedia, and number eight, Yahoo.
Starting point is 00:34:06 Yahoo over Twitter. Isn't that crazy? I'm a shareholder in Yahoo. But my thesis around Yahoo is the same as my thesis around Reddit. And that is, remember those charts they always used to put out 10, 15 years ago saying, newspapers command 7% of people's attention, but they get 18% of the advertising revenue, whereas the Internet is 30% of people's attention, but only 9% of money. And over time, the two calibrate. They normalize. They come in line with each other.
Starting point is 00:34:37 And so when you look at the most trafficked sites in the world, they're usually, you know, one-plus-trillion-dollar companies. They're usually, you know, one plus trillion dollar companies. And so this company is going to go out supposedly at a valuation of five billion. And it's the third most trafficked site in the United States. Yeah, it's a bargain. It's a bargain. They should be able to they should be able to figure this out. They still rely on advertising quite a bit like that. If you look at the numbers, it's a really heavy.
Starting point is 00:35:00 And so they're they're subject to the vicissitudes of the market. I get it. I get it. You're right. So does Google. Alphabet's all advertising. This is a much smaller company. I've interviewed its CEO, Steve Huffman, quite a bit. I like him. He was really first on figuring out content moderation. They, of course, had a terrible back around some really heinous sites there on the site. It's always been a little bit of a shaggy dog of a social media site. But they did the $60 million deal. They have a little bit of that AI spice on their meal. They got $60 million. I think that's going to go up, or they might take the content in-house. This is their risk. They can't be Pinterest and Snap that have kind of gone sideways and always end up being the little engines that couldn't.
Starting point is 00:35:48 But if they continue to be one of the most traffic sites in the world, they can put an AI spin on this at $5 billion. I mean, let me just be transparent. I'm going to try and get stock in this thing. I think this thing, I think this could be either Shein or Reddit are going to set the IPO market on fire, I think. Shein is a retailer, though, of course, right? But Shein, I mean, it's another one. We'll talk about this another day. But Shein is now the largest fast fashion company in the world.
Starting point is 00:36:12 Yeah. Yeah, it's interesting. One of these, one or both of these is going to have a big pop because the market, the IPO market has been so thawed that I bet these get priced fairly conservatively. Five billion is low. It's very low. I was like, oh. That's nothing. Yeah, it really is $5 billion is low. It's very low. I was like, oh. That's nothing. That's the problem.
Starting point is 00:36:27 If it sticks at $5 billion, it's not going to get a lot of analyst coverage. No one really cares about $5 billion. It doesn't make money. I think they're cash flow positive, I believe. What's interesting, here's one that 9% of the voting power at Reddit, OpenAI CEO Sam Altman, there he is, pops up. He invested, at the time when they were struggling
Starting point is 00:36:45 a little bit, $60 million in shares, because they were going to go IPO and then they weren't. And then they were sort of owned by Condé Nast. And they were, you know what I mean? They went through a lot. This company has gone through a lot of iterations and, you know, sort of coming out, maybe they should be bought. People thought about buying them. Then there was a controversy over those terrible, the Trump forums and everything else. And then I did an interview with Steve and he said, he was the first CEO to say to me about content moderation, you know, we were very, they're very, they were the first, they were, you know, the free speech wing of the free speech party, right? Those of the people I interviewed, but he was also the first person who said to me,
Starting point is 00:37:23 some people are just malignant actors and we're going to throw them out. And nobody had ever said that and they had no resources to do it. And that was a big deal when he said that to me, I was sort of like this, the company with the least ability to do anything about it was articulating something that just made perfect sense. Right. And this was a company that was so free-speechy, you couldn't, I mean, and they got in trouble for it very early on. Tell me about Altman. What do you think about Altman? His investment gets this thing at least another billion dollars
Starting point is 00:37:54 in valuation because it's more of that AI sprinkling. It's more of that AI pixie. Everybody thinks Sam Altman, maybe with the exception of Jensen Huang, is the smartest person in the world. And he's an investor and a shareholder here. One of these companies is going to have a triple-digit opening day in the IPO market, and we're going to be off to the races. You like it. I think this is a great— You like it.
Starting point is 00:38:15 Yeah, I like it a lot. You know why that? Reddit, it's a global brand. People know it. It's a global brand. How many global brands can you pick up for $5 billion right now? But not just that. It's a youth brand, too. My kids, I was surprised. They use two things,
Starting point is 00:38:27 YouTube quite a bit, but Reddit just as much. And they don't use it for chit-chatting. They use it for videos and consumption, which they don't like. They don't like TikTok. I know it sounds crazy, but they find it too performative, and they think Reddit feels more real. It's an entertainment vehicle for them, which is interesting to me. So you know what's a really interesting hack that I didn't discover? Type in your search and then add the word Reddit and see what comes up. Oh, interesting. Well, you know, there's a whole thing on why we're so annoying on Reddit.
Starting point is 00:38:57 Do you know that? Oh, really? They vote about which one of us is more annoying. I'm sorry. That's a tough one. I'm insufferable. That's not an easy call. No, but really.
Starting point is 00:39:09 That's a photo finish. I'm insufferable, but interesting. You are the most annoying. I'm sorry to tell you. But some people love you. Oh, you've already decided? No, I looked at it. I'm looking at Reddit.
Starting point is 00:39:21 There's whole sections talking about whether we're annoying or helpful. For those of you who were wondering, we could go 10 minutes without talking about us, tell me about your book, Kara. No, no, no. But let me tell you, I went there and it popped right up. I was looking for book reviews from- Have you heard about our therapy session? No. Listen to me. People love that. People love this. It is the Scott and Kara show. But I'm saying it was really interesting. It's just endlessly interesting information,
Starting point is 00:39:45 but there's a whole section of whether we're annoying or not. That's really. Well, the answer is yes. Yes, we are. But as long as we're entertaining, that's fine. That's what I said. I was like, I almost popped in and went, hey, girls. I'm a techer as a podcast.
Starting point is 00:39:57 You don't like me, but you keep voting for me. Yeah, that's true. Keep listening. No, whatever. Anyway, you have a lot of fans, too. Anyway, let's move on. We like this, Reddit. Reddit, watch out.
Starting point is 00:40:08 I'm calling it now. Reddit, first trade. First trade, we're way up. Okay, we'll see. All right, let's bring in our friend of Pivot. Barbara McQuaid is a former U.S. attorney and the author of Attack From Within, How Disinformation is Sabotaging America, which is one of our favorite topics here on Pivot. Barbara, thank you for coming.
Starting point is 00:40:33 Thank you, Karen. Great to be with you and Scott. Great. So this book really delves into how disinformation has spread and become weaponized in recent years. You mostly talk about legal issues when you appear and things like that. Talk a little bit about it and how we got here. You write, we are not just living in a post-truth world. We are living in a post-shame world, which of course, both at the same time. So talk to me a little about how we got here and why this part interested you. Yeah. So my background is as a federal prosecutor in the national security space. Before I was a U.S. attorney, I was a national security prosecutor. And now I teach national security law at the University of Michigan Law School.
Starting point is 00:41:11 And one of the things I have seen is the threat evolve from al-Qaeda to ISIS to cyber intrusions and now to disinformation. And at one time, this was something that was coming at us from Russia and other hostile foreign adversaries. And now it is a problem are deliberately going along with the con. And that's because there are polarizing figures in politics who have convinced us that there are only two sides to any equation. There is the far right and the far left. There is good versus evil. By so demonizing the left and the woke, there are people who are willing to choose tribe
Starting point is 00:42:08 over truth. And that is what my book is about. So you're saying democracy dies in the bright light of day. That's the famous wash of democracy dies in darkness. That's something I have noticed is that before it was quite furtive, whether you're talking about national security, whether it was the Chinese or the Russians or the Iranians, all these different factors that you have used digital means. Now it's explicit. It's not implicit or it's not hidden in that regard. Donald Trump, where he just goes on and on and on talking about how he won the 2020 election. And every time there's an interjection of facts, he is just relentless and won't stop. And so he is just following sort of the age old adage of repetition. If you say things enough times and people hear them enough, they will believe them. But I think that it's something more than that. It is that people have chosen their tribes and they don't care what the truth is. You know, people talk about Donald Trump has been indicted on 91 counts of
Starting point is 00:43:09 crimes. They don't care because they have aligned their fortunes with their team. They have been convinced that there is only this or that. It's something referred to as the either-or fallacy, and they don't want to be convinced otherwise. And so we find ourselves in this polarized world where truth doesn't matter. And if we are going to be a democracy that governs ourselves, we have got to recommit ourselves to truth. Why is America particularly vulnerable to disinformation? You say that. I do. And I think it is because of our cherished values of free speech. And, you know, don't get me wrong, free speech is incredible and is essential to democracy. It is what protects minority voices and unpopular voices from sharing their views and criticizing those in power.
Starting point is 00:43:59 But there is also this idea that any suggestion that we should restrict speech or restrict social media gets labeled censorship. And when we hear censorship, our heckles go up and we say, no, that's a bad thing, that's a bad thing. But that's what permits content moderation and prevents some of the social media platforms from becoming toxic hellscapes. I think that people forget that all fundamental rights can be limited as long as there's a compelling governmental interest and the limitation is narrowly tailored to achieve those interests. So, for example, we could regulate content on social media for that which is paid. If you pay on social media, then maybe the social media platforms can reject certain things that are false or lying.
Starting point is 00:44:55 Or we can require at least disclosure of paid content online without worrying about violating the First Amendment. without worrying about violating the First Amendment. And so I think sometimes our devotion to the First Amendment blinds us to common sense restrictions that are time, place, or manner restrictions akin to yelling fire in a crowded theater. Nice to meet you, Barbara. So the level or the magnitude of the problem is a given. It's obvious. It's a threat. I think obvious. It's a threat.
Starting point is 00:45:26 I think people perceive it as a threat. I guess the question I would have is what solutions do you propose? And I'll put forward a thesis that there's always going to be individuals who will take advantage of these porous platforms and are willing to lie and expecting better people to emerge, there will always be people willing to lie and repeat lies. And I don't think shaming is working. We could sue the platforms or try and go after them. My sense is they're too big. Do you think at some point we need to incorporate more into our educational institutions, point we need to incorporate more into our educational institutions, critical thinking, and then to restore that there is a truth, that the truth is an actual thing. But let me finish where I started. What solutions do you propose? So in the book, I do propose some solutions. And
Starting point is 00:46:16 Kara, I'm looking forward to reading Byrne's book about your take on social media as well. I'm sure you've got some ideas in there. We're fucked. If you'd like the cliff notes. Go ahead. Kara has a book coming out? I didn't know. You know what? I got fans, Scott. Go ahead.
Starting point is 00:46:33 I do propose some solutions. And Scott, I think they have to come from both sides of the equation. I think our government can do some things to restrict social media. Social media is an incredible tool, despite what Kara says. I think it connects people in wonderful ways. But I think we've been really naive in the way we've let it grow without any sort of regulation whatsoever. It's like growing a baby alligator in your bathtub. It's adorable when it starts, and it's very exciting until it turns into a man-eating predator, which is where we are today. And I think we could do some things. You know, one of the things that Frances Haugans, who was the
Starting point is 00:47:08 Facebook whistleblower, said during her testimony to Congress is it's not the content, it's the algorithms. And so there are algorithms online that will push people to content that is most likely to generate outrage because that's what keeps eyeballs on the platform. And so we can regulate algorithms. That would be something we could do that would not run afoul of the First Amendment. We could either prevent algorithms that drive us to that which outrages us, or we could at least require disclosure of algorithms so people know when they are being manipulated for outrage. So that's something we can do. I also think that we see a lot of content on social media that does not play by the same rules as advertising on
Starting point is 00:47:53 television and radio. You know, if you listen to an ad on the radio or see something on television, you'll hear, you know, I'm Joe Biden and I approved of this ad. That does not exist on social media. So you could have Russia or some wealthy individual posing as the red, white, and blue grandmothers of America posting ad content online without disclosure. So I think there are a number of things we can do with the law to change the way disinformation is allowed to percolate. Proliferate. Proliferate. There's a good word word online. Right now, as we're talking, the Supreme Court is arguing these two cases, and there's another one coming around the limits of
Starting point is 00:48:29 the government into warning, you know, telling people to take certain content down, companies. So talk a little bit about the cases currently there and the one coming, because they're all very important. It's sort of a rethinking of what, and these are Texas, we just talked about them, Texas and Florida laws. I happen to be on the side of the tech companies here, although I think they're wildly irresponsible and under no liability or anything else. So that's the problem on that side. But I tend to be for the tech companies on this one. Yeah. And it's important, Kara, to differentiate. You know, you can't be all for, you know, companies on one thing and all for states on another. You have to look, you know, elementally at these things, you know, point by
Starting point is 00:49:11 point. And on this one, I think the tech companies are right. The idea that states can prevent tech companies from moderating content, again, in the name of censorship, you know, they throw out that word and it sounds so frightening, but imagine what this would look like if they could not do that. If they cannot have community standards that are enforceable by contract. These are private companies, and so they are not bound by the First Amendment. They have the right to say and not say anything they want. What's going to happen here? What do you think from listening to what they're saying, the justices? You know, it's always difficult to predict what might come out in the end, but I'd have a hard time seeing a majority of these justices
Starting point is 00:49:54 ruling in favor of these states and against the tech companies and telling them what they must allow on their platforms. These are private actors. And so I think that, you know, although some of these justices are no friends of the political left or even of these big tech companies, I can't imagine that in light of their views of the First Amendment, that they would allow these states to tell them what to do. In fact, I'd like to think that the only reason that they took up these cases is not because they consider them serious laws.
Starting point is 00:50:26 But they want to knock them down. Yeah, because, you know, the Fifth Circuit Court of Appeals upheld this statute in Texas. And so there's a circuit split. And so they really do have to step in here to knock them down. What about the Biden one? What about the one coming up after that? They also took that one. Yeah. And so this is an interesting one. This is the one where the Biden administration is working with tech companies. And in the light most favorable to the Biden administration, I think they would say, we have to jawbone with tech companies to engage with them and tell them what is false and
Starting point is 00:50:56 ask them to take it down. We're not ordering them. We're not demanding. We're trying to work together with them to protect public safety. If we see something that is a COVID remedy that might be dangerous to public health or something else that is... Or Al-Qaeda up to its usual tricks or something like that. Yeah, there you go. A terrorist organization that is recruiting online.
Starting point is 00:51:17 It is helpful to the public for that sharing of information to go on. Now, I could imagine the courts structuring something to ensure that this is an ask and not a demand. And so maybe they put some roadblocks in place to make sure that there is no coercion going on when it comes to government and no repercussions coming on. But I can't imagine, again, that we could live in a world where the government is forbidden from sharing with social media companies dangerous things that are harmful to public safety. So I think that this will come out in favor of those sorts of communications. And the most restrictive thing I could imagine is some guardrails there to prevent coercive situations.
Starting point is 00:52:09 Well, you're a U.S. attorney, Barbara. My sense is the fines can't be big enough. They have an army of lawyers. It's just the cost of doing business. Does any of this really get any better until someone does a perp walk? That's Scott's favorite. He wants to see one of them in jail. The perp walk. And do you mean one of the social media companies? Or do you mean? Yes. Meta was fined $5 billion.
Starting point is 00:52:30 That was 11 weeks of cash flow and they keep on trucking. Well, I want to get back to a point that I think could help us. And that is the point you mentioned earlier about the responsibility of the users. And I do think we have a responsibility, as individuals but also as a society, to educate the public about being critical consumers of social media. You know, we have tended to believe what we read online, believe what we see online. And again, it is a wonderful way to share information at scale. But in Finland, a place where they have suffered from disinformation from Russia for decades, school children are trained
Starting point is 00:53:05 in critical reading studies to be able to identify disinformation when they see it, looking for a second source, not reading solely the headline, looking out for artificial intelligence and recognizing that there is such a thing as deepfakes in audio and video and those sorts of things. And I think, Scott, we can't end it at just critical media studies. We also have to look at civics education. And so to understand, you know, how our government works, the three branches of government, the checks and balances we have. What about that perp walk, or repealing or amending Section 230 so that sort of liability exists in some fashion? So I don't know that we could completely repeal Section 230. I think if we were to repeal Section 230, we would put these social media companies
Starting point is 00:53:50 out of business. I think there's just too much volume online for them to do it. But I do think we could have some amendments to Section 230 for things like algorithms that are manipulating people and pushing content that outrages us. I think we could have liability for false advertising where they've accepted money in exchange for the content that is online. So I think, again, it's this all-or-nothing thinking, the dumbed-down, this-or-that thinking that perhaps social media has driven us to. But the nuance is in between, and we need our legislators to take this seriously. You know, a lot of times when we have those hearings and we have some of our senators and members of Congress listening to people like Mark Zuckerberg or Jack Dorsey testify, you can see them start to glaze over,
Starting point is 00:54:41 and they look like those people in the ads for Progressive Insurance. You know, you're becoming your parents. Am I hashtagging right now? But they can hire experts to help them work through this. They have. They're smart. They get smarter and smarter, I think. Sure. And they've got staff members who are smart enough to sort through this. And so I think we need to take responsibility that we are going to look to some of these suggestions for regulating social media. It doesn't have to be a complete elimination of Section 230, but some specific amendments to try to reduce disinformation. It's not one thing that's going to solve this problem.
Starting point is 00:55:15 It's a series of solutions. And I think one is all of us have to make a commitment to truth that we are not going to pile on. We're not going to choose our tribe over truth. We are going to demand truth from our leaders and from ourselves. You're a former U.S. attorney. You're an academic. You're obviously very thoughtful.
Starting point is 00:55:33 You have domain expertise that is really relevant to our country. Why wouldn't you run for office? I'm not willing to make the personal sacrifices it takes, but I appreciate the vote of confidence. Why don't you run for office, Scott Galloway? The exact same reason Barbara won't. Yes, that's right. Yeah, that's it. And other things. Barbara, last question. These elections are coming up. How are you looking at this disinformation?
Starting point is 00:55:58 We're in a post-shame world, a post-truth world. What role does disinformation play here and what, if anything, can be done about it? Really relevant at this time, Kara. You know, we just saw in New Hampshire a series of robocalls using artificial intelligence to falsify Joe Biden's voice. I think we need to be on the lookout for those kinds of things and quickly debunk them. I think, you know, fact checking in real time is very important, and I think providing information when there are these false claims out there. We're going to see it. We're going to see text message campaigns. We're going to see AI campaigns, and I think alerting the public to
Starting point is 00:56:34 these things. And the other thing here that we can't lose is engagement in politics, because that is what authoritarians seek in Putin's Russia. It is to flood the zone with so much information that people don't know what to think. They become cynical and then numb and they disengage from politics altogether. And so I want to encourage people that the best way to fight back is to stay engaged in politics, talk to your neighbors, work hard for credible sources like the League of Women Voters and others that provide accurate information about voting. All right, Barbara, thank you so much. This is a terrific book. And Barbara's really sharp. She's all over the place. You're all over the place, Barbara. The book is called Attack from Within: How Disinformation is Sabotaging America. Thank you, Barbara.
Starting point is 00:57:17 Thank you both for having me on. Really appreciate it. Nice to meet you, Barbara. Thank you. Nice to meet you both. I enjoyed your podcast. All right, Scott, one more quick break. We'll be back for wins and fails. Okay, Scott, let's hear some wins and fails. Do you have any? I'm all screwed up. It's like I'm bass-ackwards. I thought today was predictions in this Joey Bag of Donuts production we have. All right, make a prediction, Scott, and then I'll do a win and a fail. How about that? Let's be bisexual about it.
Starting point is 00:57:50 My mind is a blank after you said that. It was like pressing a reset button. So I think that Alphabet is going to be the best-performing big tech stock, and I think it's going to attract an activist investor in the next 30 to 60 days. It's the only one of the Magnificent Seven whose price-earnings ratio is trading below the broader S&P's. They've had a series of stumbles. The underlying businesses are just unbelievable. YouTube is actually bigger than Netflix. It's got, I think, the second biggest cloud business. Obviously, search is
Starting point is 00:58:25 arguably the best business ever invented. And where it's gotten hammered is similar to how Meta got, in my opinion, unfairly punished or overpunished because of this terrible rollout or this ridiculous consensual hallucination called Oculus and the headsets. The market ignored that the underlying business continued to be just this cash volcano. The equivalent of the Oculus for Alphabet has been its kind of innovator's dilemma, flat-footed, sitting-on-their-hands approach to AI. And even the rollout of Gemini had some real hiccups, where a lot of people said, and I think there's some truth here, it got over-programmed around sort of DEI issues. And the rollout has been, it just hasn't gone very well. And if you look at the
Starting point is 00:59:13 history of AI, literally, Alphabet was at the center of it in terms of people, intellectual property, discovery. And it's a classic case of the innovator's dilemma where they didn't go aggressively at it. There are a few things that are going to happen here. One, the stock is going to outperform because it's cheap relative to the others given the strength of its businesses. Two, an activist is going to show up. And three, and I don't like to say this, they're going to have fairly significant layoffs because they have overhired. And the year of efficiency has not really hit Alphabet yet. And they're going to come under pressure to show a similar type of shareholder returns as the other six, given the strength of the
Starting point is 00:59:48 underlying business. And the easiest way to juice their earnings will be to trim their workforce. So anyways, I think Alphabet is about to get more attention. I think it's going to outperform; there will be layoffs, a different narrative around AI, and you're going to see an activist show up there. Yeah, it'll be interesting to see what happens to the government stuff, too. That's why they've been so quiet on that stuff. That's interesting. All right, I'm going to do a win and a fail, all right? Fail. I got to say, these comments by Donald Trump about Black people, that they like him because of his mug shot, none of his ridiculous, crazy comments are going to get enough attention because he says one every five minutes. That's the problem with Donald Trump.
Starting point is 01:00:23 But that one was just particularly heinous. But of course, then you're like, oh, that was particularly heinous. He benefits from being heinous all the time so that it just all falls on deaf ears. But some of the stuff he's been saying lately has been particularly nutty. And for my win, I would say,
Starting point is 01:00:43 I think that I'm kind of heartened by Nikki Haley continuing. I'm like, I know, like all these men are like, get out now, quit. All these men. And they are. All these men. I'm sorry. I just, I've noticed. What if I said all these women wanted to stay in the race?
Starting point is 01:00:58 Well, all these women are writing columns, stay in. I think she's really interesting. I think she has no downside to this. And like, let her stay in. She's got the money. She's got the means. Let her stay in and make a case. And she's making, I don't know, she's burnt the boats and she's staying in. And I think it's great. I love competition. I'm all about it. We like it because I don't see how it's good for Trump. It's nothing but downside for Trump. And I'm all for anything that is nothing but downside for him. I actually think at this point it's probably bad for her to stay in. I think she's starting to alienate more and more people in the Republican Party. She's becoming a bit of a gadfly. I don't know. All right. OK. You like that she's in. I like that she's staying.
Starting point is 01:01:41 I like it. Let me have my win. Don't insult my win. Don't take down my fucking win. The Koch brothers just withdrew their funding. Of course they did. But they were nice about it. They were like, we think she's great. It was fine. Of course they were going to do that. They got to put their money in local races. That, I don't care. Let me ask you, who do you think is going to be his VP pick? I don't know, someone heinous. Probably Tim Scott, because he's the most obsequious chode of all of them. Yeah, probably. You know when I thought Tim Scott was definitely going to be his pick or thought he was? When he got engaged. How come all these guys decide to like go full hetero or full like I'm the marrying man when they – I mean, didn't Cory Booker show up with a girlfriend when he was running for president?
Starting point is 01:02:19 Yeah, when there was rumors about that. Anyway. Yeah. But not our friend Lindsey Graham. He continues to. So ladies, if you want a ring, if you want him to put a ring on it, just get him to run for president. Just get him to think they can be president if they only show up with a wife. You know, allegedly. That's all we have to say. Allegedly. Allegedly. Look, he just looks like he's dying inside. Every time he smiles, I'm like, oh, I'd love to hear the, if you could
Starting point is 01:02:41 broadcast the internal. Monologue. I would love to hear that. Oh, God. I would love to hear the, if you could broadcast the internal. Monologue. I would love to hear that. Oh, God. I would love to hear that. Let me just be the VP. I know. Let me get inaugurated and let's hope this sack of shit drops dead. He could be president and all of a sudden. Easy. And all of a sudden.
Starting point is 01:02:56 And then we have, you know, there we have it. Let me go to this. When he started his campaign, I think the odds were like 200 to 1 that he'd be president. If he's Donald Trump's vice president, five years. Five years to 77. One year. Well, no, no, no. No, no, no.
Starting point is 01:03:13 I'm saying between now and the end of the, if Trump were elected. So call it five years for a 77-year-old obese man. Literally, Tim Scott, if he's made VP, has like a 40% chance of being president. Yeah, it's true. It's true. I think he's picking Tim Scott. That's what I would say. I would.
Starting point is 01:03:28 That makes the most sense. It makes sense. In weird ways, Trump can be very sensible. And that's the sensible choice. He's obsequious. Yeah, I agree. He's obsequious. And he's a black man.
Starting point is 01:03:38 And he's, you know, he's friendly. So he gives, you know, he's a friendly guy. He's very affable. He's a likable guy. He's good on his feet. Religious, et cetera, et cetera, et cetera. And he's willing to do whatever it takes to make Trump look good. Anyway, we'll see.
Starting point is 01:03:53 We want to hear from you. Send us your questions about business, tech, or whatever's on your mind. Go to nymag.com slash pivot to submit a question for the show or call 855-51-PIVOT. Okay, Scott, that is the show. I am off to party with Don Lemon. Don Lemon. We'll be back on Friday with more. Read us out. Today's show was produced by Larry Naiman, Zoe Marcus, and Taylor Griffin. Ernie and Ruttat engineered this episode. Thanks also to Drew Burrows and Neil Saverio. Nishat Kerouaz, Vox Media's executive producer of audio. Make sure you're subscribed to the show wherever you listen to podcasts. Thanks for listening to Pivot from New York
Starting point is 01:04:28 Magazine and Vox Media. You can subscribe to the magazine at nymag.com slash pod. We'll be back later this week for another breakdown of all things tech and business. Cara, have a great rest of the week.
