Pivot - The X Factor, AI Guardrails, and a Barbenheimer Post-Mortem

Episode Date: July 25, 2023

Kara and Scott discuss a huge weekend at the box office, and Jeff Bezos' renewed interest in The Washington Post. Plus, WTF is Elon doing with his Twitter re-brand? Also, Big Tech agrees to some AI safety commitments, will they be enough? Friend of Pivot Alondra Nelson led the development of the Biden Administration's Blueprint for an AI Bill of Rights, she joins us to discuss. You can find Alondra on Twitter…or X…at @alondra. Learn more about your ad choices. Visit podcastchoices.com/adchoices

Transcript
Starting point is 00:00:00 Support for Pivot comes from Virgin Atlantic. Too many of us are so focused on getting to our destination that we forget to embrace the journey. Well, when you fly Virgin Atlantic, that memorable trip begins right from the moment you check in. On board, you'll find everything you need to relax, recharge, or carry on working. Lie-flat, private suites, fast Wi-Fi, hours of entertainment, delicious dining, and warm, welcoming service that's designed around you. Check out virginatlantic.com for your next trip to London. Support for Pivot also comes from Indeed, with data and a matching engine that helps you find quality candidates fast. Listeners of this show can get a $75 sponsored job credit to get your jobs more visibility at Indeed.com slash podcast.
Starting point is 00:01:00 Just go to Indeed.com slash podcast right now and say you heard about Indeed on this podcast. Indeed.com slash podcast. Terms and conditions apply. Need to hire? You need Indeed. Hi, everyone. This is Pivot from New York Magazine and the Vox Media Podcast Network. I'm Kara Swisher.
Starting point is 00:01:24 And I'm Scott Galloway. I'm in San Francisco, Scott. Where are you? I'm in Aspen. I spent the weekend in LA, and we had a wonderful time. I told my youngest we could do whatever he wanted. So we did two malls. We did the Century City Mall, which is one of my favorite malls, and I kind of grew up there. And then we did The Grove. Where Barbie was filmed. But anyway, go ahead. There you go.
Starting point is 00:01:48 Then we did The Grove. We saw two movies, Mission Impossible and Oppenheimer. We did In-N-Out Burger. We did Universal Hollywood. All of this in about 36 hours. We had a really wonderful... I was just telling our producer, Lara, I'm at that point where the end of my kids being kids is now painfully visible.
Starting point is 00:02:10 So I'm really enjoying this kind of stuff. Oh, it's going to be great. I'm here in San Francisco with Alex, who's 18. We're on a little mom-son kind of thing. And then next week, I'm going to do one with Louie before they both go away to their various and sundry fall activities. Louie to Argentina and Alex to Michigan. It's not like the wonder of, you know, oh, my God, they put chocolate on this ice cream. It's like there's definitely something about 12.
Starting point is 00:02:36 I mean, he will literally, he finds things just so fascinating, the little things, and that kind of wonder, if you will. Or, I don't know, you know Alex, he talks about the most esoteric things, and he's like, did you know about this particular factual thing? Don't be so fucking mature. I'm trying to be melancholy. Don't be melancholy. You have new relationships with your sons. It's really nice. You have to develop relationships over time. You should be thinking about our relationship, for example. But let me just say, speaking of relationships, you know what I did this weekend? Why don't you ask me what I did this weekend?
Starting point is 00:03:10 Amanda and I went to salsa lessons. I gave it to her for her birthday. I hate dancing. She loves it. We went again, and I'm not bad. I wasn't bad this weekend. That shocks me that you're better than you thought of something. That shocks me. No, I just didn't think I sucked. That shocks me.
Starting point is 00:03:25 You and I should do salsa dancing together. I will lead, obviously. It would look like, it would literally, I told you this, it would look like Ichabod Crane and Hervé Villechaize having an epileptic attack at the same time. No, I think we'd be good. By the way, I'm sure that triggers
Starting point is 00:03:37 the National Epilepsy Protection League. Yeah, okay. You're getting so Ron DeSantis these days. Everything is woke, woke, woke. He got COVID right. That should be a tagline, his tagline. But then he didn't. But then he didn't. But then he didn't. But he's not. He doesn't want to own that. Oh, you read that article about the Times. Yeah, I don't know. Whatever. He shifts back and forth depending on the way the wind blows. His campaign is a disaster right now, apparently, from all the different stories I've read. But we should do things in our relationship. We should find something we can do together.
Starting point is 00:04:05 Rock climbing, perhaps? I don't know. Something together. Or we could just have a podcast and make a shit ton of money. All right. How about that? We'll do that. All right.
Starting point is 00:04:12 Okay. I just want those special moments. We were in Cannes together. We were in the south of France together. Did we have a good time? We had a good time. We had a nice time with you swimming. My favorite moment was when we were at the Hotel du Cap and we were with Amanda.
Starting point is 00:04:23 And you just looked at me like what you are, a 60-year-old mother of two infants. And you go, you said to me, can you take the kids to the pool and just leave me alone for like 30 minutes? You looked at me like, you literally looked at me like your last wish. It's true. You're like, we're friends. I can say this to you. And you said it out of earshot of Amanda. You're like, take the kids somewhere and just keep everyone away from me for 30 minutes.
Starting point is 00:04:48 It was true. Let me have some rose. I don't even drink wine, but I'm going to order it. I just lay here. You need that moment as a parent. Oh, you need a lot of those moments. I took the kids to visit an employee of mine who had just had a baby and let Amanda have a little of that time. She went and got her nails done, which was nice. Yeah, you need those moments.
Starting point is 00:05:06 Anyway, we have a lot to talk about besides our relationship. We'll talk about Twitter's rebrand, which, of course, as usual, Elon inserting himself into the news. He's trying to get one over on Barbie, I guess. A new agreement between the White House and tech companies on AI. And our friend of Pivot is Alondra Nelson, formerly of the White House Office of Science and Technology Policy. She led the development of the Biden administration's blueprint for an AI Bill of Rights.
Starting point is 00:05:29 So we'll talk about that. But first, Scott, it was a huge weekend at the box office. Barbenheimer brought in a combined grand total of $244 million in ticket sales in the first three days. That's a big number. Barbie accounted for $162 million of that, while Oppenheimer brought in $82 million, well over expectations for both of those movies. That's 12.8 million tickets bought for Barbie and 5.8 million for Oppenheimer. It's set to be the fourth highest box office weekend of all time, and Greta Gerwig is now the most,
Starting point is 00:05:57 has sold the most tickets for a female director in, I think, history. The people have spoken. Barbie is clearly the winner here, financially speaking. You have not seen Barbie still, even though it's about men and women and the things you talk about of men being lost. I will ask how you thought of Oppenheimer, but Barbie was the clear winner financially and is leading the way and dragging and bringing Oppenheimer, which is also a fantastic movie, with it. And combined together, people were very
Starting point is 00:06:25 excited and made it appointment viewing, which hasn't happened in a long time for both of them. And they both spurred interest in each other. So, tell me what you thought of Oppenheimer. Well, first of all, the weekend and the box office receipts are really a win all around. I mean, if there was an industry that needed good news, it's not only just Hollywood, it's specifically movie theaters. And something that was really inspiring, we did dinner before Mission Impossible at the Century City Mall. I'm not exaggerating. The mall was electric, and everyone was wearing pink. And there was- For Oppenheimer then. No, go ahead. For Oppenheimer. All the women were in pink. There was a lot of men wearing pink. And I thought it was nice.
Starting point is 00:07:10 I think anything that creates sort of a collective is nice. I think we need more of that. We actually, our plane got delayed on Saturday by four hours. So I said, all right, let's bomb to the movie theater. And we went to the Van Nuys Regency. And we were five minutes late to see Barbie. So we saw Oppenheimer, which was starting in 10 minutes. I'd love to talk about it a little bit.
Starting point is 00:07:31 Yeah, let me talk about that. I think it's really, I mean, first off, I was going to love the movie before I even walked in. It's World War II. It's science. It's kind of this tortured genius. It's just like everything I want in a movie. And also, I think Christopher Nolan is a genius. I think a lot of its lessons apply to today, and that is people take for
Starting point is 00:07:51 granted science really wasn't incorporated into government, much less the military's plans, until World War II. And the integration of scientists and their sort of elevation to kind of near gods, which continues today, really happened around World War II and this project with the recognition of, you know, whether it was Turing and breaking the Enigma codes or whether it was Oppenheimer and he developed a celebrity-like status. And both of whom suffered after they had had their accomplishments. Hugely.
Starting point is 00:08:23 And, you know, one of the lessons to take away is this notion of being a genius, it comes with downsides in terms of your own kind of self-torture. But I thought it was a really interesting lesson in a few things. And the first is Oppenheimer, as much as he was a genius, he invented a new type of charisma. And that is everyone who met with Oppenheimer felt that there were these brilliant insights that he had yet to reveal. And they wanted to be around him, and they wanted to be near his work. And he was able to bring together just unbelievably different cultures of, you know, a culture
Starting point is 00:08:55 of physicists, a culture of the military, a culture of the government to pull off something we didn't, we thought at the time we were literally in a race, and it ended up that the Germans had taken a wrong turn, and we were really the only sole contender for the bomb. Something else that people didn't realize is we only had two. We ran out. And one of the great philosophical questions that they talk about in every philosophy class now is, okay, maybe you can justify dropping the first bomb, but could we justify dropping two? And it was meant to say, hey, we have a lot of these. But also, there was no reason why a scientist wouldn't logically think, now that we can do this, it's really the end of the world.
Starting point is 00:09:30 I mean, you could understand how they would have thought that. And I see it as really a victory for society globally, because we were able to destroy a city in a flash. And we've, in the last, really, you know, in the last 75 years we haven't had more detonations because the world has decided this is bad. Go ahead. Okay.
Starting point is 00:09:47 I want to know what you thought of the movie. So what did I think of the movie? Yeah, thank you for that history lesson. I thought it was a master class. I think Christopher Nolan, clearly, there were about a dozen stars in this movie who are used to being the star. And they were given, I mean, Robert Downey Jr. will win an Academy Award. He was a big star in the movie, yeah.
Starting point is 00:10:08 Rami Malek, who is a big star, had a small role, but he was so excited to work for Christopher Nolan. There were just these master performances at every turn. Emily Blunt was just fantastic. Superb. It's novel. You know, there were just all these incredible cameos, I would almost say, by these brilliant actors, and the cinematography. So you didn't think it was too long? You know what's interesting?
Starting point is 00:10:33 My 12-year-old said it was his favorite movie ever. Oh, okay, good. And I didn't think a 12-year-old would be able to get through three hours of it. Interestingly, Alex and Louis went to see Barbie this weekend. And what did they think? You know, I think that they liked it. I think they, you know, I think it's, it is aimed at women a little more, but it certainly provoked a lot of conversation. I think Alex called it seven out of 10 because he's an Oppenheimer stan, no matter what, and he has to compare them, which, they're not comparable. They're actually complementary.
Starting point is 00:10:58 And Mission Impossible and Indy are dropping off rather significantly. And Sound of Freedom, which was another original movie, even though the guy who stars in it and pushed it out is very Trumpy. He thinks Trump is like Jesus, essentially, and has QAnon edges to him. That did very well, too. All these are original movies, original, if you think about them, in a lot of ways, versus a sequel or a superhero movie? Well, look, the industry will constantly play up successes. The reality is there might be a blip up, but movie theater box office is in structural decline and it'll maintain its decline.
Starting point is 00:11:37 So good for them. Enjoy the moment. But just, I always like to try and relate this to some sort of business learning for a young executive. One of the real, I don't know, lessons or takeaways from Oppenheimer is that hubris can best genius. And this was a guy who ultimately got his security clearance taken away. Unfairly. But taken away. And returned, but go ahead. Jesus Christ.
Starting point is 00:12:07 Literally, give me some running room here. Let me acknowledge that Barbie's the greatest movie ever before seeing it. Okay. So, also, just the notion that he wanted to, he had an important message that we should stop the arms race before it really hits its stride. And Edward Teller was a big proponent of moving to H-bomb testing, and the arms race just burst wide open. And his message was undermined by his hubris,
Starting point is 00:12:32 and that is, at the end of the day, if you're an asshole, if you don't respect other people, if you embarrass them publicly, if you sleep with their wives, it will come back to haunt you, regardless of what a genius you are. And I think that has a lot of lessons today that there's people who take science and their prestige garnered from science is convincing them that they're almost godlike. And hubris can undo any genius. It can undo any genius. I just thought it was, I can't stop thinking about it because I'm interested in that era. And also, I think he was a pretty deeply unhappy person. Sounds like it. It was interesting because one of the things that I thought from the movie was he wasn't as arrogant as many science people we know and science and tech people we know for sure.
Starting point is 00:13:19 And he was very thoughtful and had a lot of good relationships. I do think the carelessness and thinking that Russia or whoever wasn't going to steal the secrets. Now, I thought they would have gotten stolen no matter what. And Russia would have gotten to it, the Germans, whoever would have gotten to it. I thought it was a beautiful movie. It was a very beautiful and moving movie. I'm glad it stuck with you. Is there one scene that stuck with you?
Starting point is 00:13:39 And then we're going to move on to another topic. I thought when they tested the thing, it was really powerful. Because I mean, when you think about what they did, it was so visionary and strange. He brought everyone to this remote place in the desert and said, we need to build a church, restaurants, a post office, and ask everyone and their families to move to this place. That's just so incredible. And I don't know. I'm trying to think if there's really, I'm not sure there's one scene that stands out.
Starting point is 00:14:12 Is there one scene that stands out for you? Well, the bombing was just his breath. All you heard was the breath, not the noise of the bomb. Yeah, it was very dramatic. I'll tell you, you know who I thought was great? Matt Damon, another under-sung role. There were five amazing, there were literally a dozen great performances in the movie.
Starting point is 00:14:27 It was a real team effort. Yeah, it was. It was a great role for him. It was a great role for him. And there were a lot of complex topics and people and relationships that they managed to keep going. I'm going to go back and see both Barbie and it and Mission Impossible because I think there's a lot there.
Starting point is 00:14:41 The line that stuck with me, I won't say it was the scene, but the line that stuck with me, Gary Oldman. Again, another amazing actor playing Truman. Crybaby? Well, that. But he has this line, you know, Oppenheimer says, sir, I have blood on my hands. And he looks at him and he goes, no one gives a shit who built this thing. They care who dropped it.
Starting point is 00:15:01 And I thought that was very powerful. And he was right. He was right. You know, it's the decision to drop it, not to build it. I mean, you can literally go, if we talked about this for longer, which we can't, we'd go, oh, and this amazing performance and this amazing performance. I thought Oldman was great. He was great. Don't let that crybaby back in here. Yeah, don't let him back in here.
Starting point is 00:15:21 Yeah, I know. It's interesting that he could have really been celebrated more, and he just decided to say no. All right, speaking of moguls who have big responsibilities, 10 years after, 10 years after buying the Washington Post, I can't believe it's that long, during which his levels of interest in it have varied,
Starting point is 00:15:36 mostly not very interested. Jeff Bezos is rolling up his sleeves again. The company's down a half a million subscribers, which is a lot. It's set to lose $100 million this year, incredible amount of money. He's reportedly planning for 2023 to be, quote, a year of investment, but not a year of profit. He appointed, as we've discussed before, Patty Stonesifer, who I have great regard for, as interim CEO last month after the publisher, Fred Ryan, stepped down and Bezos gave him an easy exit by funding his civility efforts. Meanwhile, he's weighing in personally on a project for the opinion section. Full disclosure, my wife, Amanda, works for the opinion section, but is not involved in any of this. While greenlighting an overhaul of the style section and online redesign, what thinks you of this? I still think he's going to lose interest. I don't
Starting point is 00:16:19 think he's, I think he just sort of ended up with it. It's not that energetic, but he put someone in who is very good. Any thoughts? I think this is an example. I mean, it would be ideally we could have long-form journalism and companies that were traditional newspapers and bring a level of fact-checking and reverence for long-form journalism or gumshoe reporting. You'd like to think that that's an industry that's self-sustaining. It's not. Yeah.
Starting point is 00:16:49 And so the bottom line is, the best you can hope for is a benign billionaire, someone like a Bloomberg. But basically, that's what's happened. Sam Zell, whoever you look at, newspapers, the Tribune, the Times, they buy these things. Billionaire Democrats buy newspapers, and billionaire Republicans buy football teams. And when I was involved in the New York Times and we hit some rough patches, I think I heard from every Democratic billionaire in the nation saying that they were interested in getting involved. In other words, they were interested in owning it. They would wake up in the morning and say, hello, publisher of the New York Times. And the reality is these things just economically don't work. And also there's a tell here, and that is they're going to lose more money this year because Patty Stonesifer wouldn't have taken this job unless she had a commitment from Bezos to make the requisite investments. She's a smart woman. And this is why the Grahams probably sold it to Bezos. To continue that level of journalism, you're just going to lose a shit ton of money. And here's the good news. He has a shit mega ton of money.
Starting point is 00:17:45 So even if he loses a quarter of a billion dollars this year on 60 billion in wealth at 8%, he's going to make 5 billion more this year in his investment. So it really doesn't matter. And I think he has proven so far to be a very responsible steward. Because what I have found, I've actually been in some newsrooms. I do find the journalists, when they see this billionaire come in, they kind of, I don't want to say they ignore the business realities, but they expect that person, they rightfully believe what they do
Starting point is 00:18:16 is noble, but they expect that person to just invest. They don't expect this to be profitable. They're like, come on, this is more important than that. And there's no one who can invest like Jeff Bezos. I think Jeff Bezos is a gift to the Washington Post. I agree. I agree. I think, you know, it would be interesting if he was even more involved. I don't think he has as much interest in it. He has obviously put some of his values in that spot, which I thought was, that took far too long. Fred Ryan was really floundering toward the end of his tenure and hadn't been decisive the way, say, Meredith Levian has been at the New York Times or the owners, the Sulzberger family has been in terms of adding on things
Starting point is 00:18:54 that make it profitable. They're not very profitable, but they're profitable now. And these things will, one of the things they talk about is, oh, New York Times, the runaway success. I'm like, it's not that profitable. It's just a little profitable kind of thing. But you're right. It takes a commitment. They could certainly make this profitable, I think. I suspect there's room for two big national news organizations that own a lot of different properties, not just the print newspaper itself.
Starting point is 00:19:21 But you're right. For the Washington Post to be profitable, it'd be a shadow of itself. They would have to cut so much spending. And also, the New York Times, really over the last 30 years, it's gone from a regional to a national to, quite frankly, a global newspaper. It folded, what was it, the International Herald Tribune in? It's just got global reach. The Washington Post, to get to where the New York Times is, would need to lose a couple billion dollars or invest a couple billion dollars and it'll take them a decade. It will never... So what would you do? Give me something you would do. So just keep at this
Starting point is 00:19:50 minor level of loss or possibly get it to break even. That's what you would do. What I would do is realize that I'm going to be dead soon, regardless of the amount of andro and testosterone I'm taking. And regardless of how hot my girlfriend is, I'm going to be dead soon. My mark here is to do something really wonderful for democracy. An incredible evangelist for Western values is the Washington Post. And if there's anything that's worth investment and worth one, two, three percent of my wealth, it's not a giant clock or some giant dildo into space. It's the Washington Post. So what would I do? I would continue to make big investments in this wonderful property that, again, espouses Western values. And get it to break even. Get it to maybe break even. Well, it has to be. It can't be hemorrhaging money because at some point he will die.
Starting point is 00:20:40 Yeah. And his kids aren't going to go. Yeah. And someone else aren't going to go negotiate the release of one of our journalists from the Taliban. And I'm like, that's what we're doing. Meanwhile, Google has just a bunch of engineers that they slip a pizza box under the door. It's stealing our content and making a shit ton of money while we try and negotiate the release of our journalists. And they're trying it again with AI. They're trying it again. There he is. So, yeah, I think it's wonderful Bezos is involved here.
Starting point is 00:21:25 I think he can make a big difference. I think these things matter. And the reality is these things need benign billionaires that are willing to invest. The benign billionaire Scott theory of things. All right. Well, speaking of not so benign billionaires, let's get to our first big story. Twitter is called X now. The bird is dead. The new logo is an X on a black background. I thought it was terribly designed, by the way. Elon says tweets are now called Xs. The rebrand was announced
Starting point is 00:21:55 about a day before it was implemented. Musk said it would only happen if the logo was good enough, which it is not. Go look at it. As of Monday morning, Musk hadn't even secured the at X handle. The word tweet is still all over the site's UI and the domain x.com redirected to a GoDaddy page. This isn't entirely out of the blue or the black. In October 2022, Musk tweeted, or rather X'd, I can't do it, that buying Twitter is an accelerant to creating X, the everything app. I'm told that he's told a lot of people about it. He told a lot of political people, including Donald Trump, from what I heard early that he was doing this. Linda Yaccarino X'd that it's an exceptionally rare thing in life or business that you get a second chance to make another big impression. She also had a word
Starting point is 00:22:39 salad that I could read for you if you really want to hear it. It said it had all the words in it except for synergy. Can you read what she said? Can you read it? All right. Linda Yaccarino, among other things, said it was going to do everything. But she tweeted, I'm going to read this whole thing.
Starting point is 00:22:54 X is the future state of unlimited interactivity centered on audio, video, messaging, payments, slash banking, creating a global marketplace for ideas, goods, services, and opportunities. Powered by X, X will connect us all in ways we're just beginning to imagine. She said it will be everything. I don't even know what to say. So it's pretty clear at her Toastmasters class, she sat next to Adam Neumann.
Starting point is 00:23:20 Yeah, her son. I mean, let's literally read that. And it's such consultant speak and such nothing. Yeah. She's literally, she's showing up. I think she looks ridiculous. I agree. This is the first time I was like, whoa, Linda, dial that.
Starting point is 00:23:32 She clearly has absolutely no input into anything. And she's there trying to make chicken salad out of chicken shit and pretend that she thinks any of this makes sense. A lot of words. Even Aaron Levie, who runs Box, which also has an X in it, said, what do these words mean? And then he took it down because everyone was like, huh? The big question about X is, of course, Y or WTF. Talk about this. He's fired a bunch of employees, made a hugely unpopular change to the platform, scrapped all of the brand equity now by changing the name.
Starting point is 00:24:06 I don't know if Linda knew about that he was going to do this. Maybe she did. Maybe she didn't. But she's certainly trying to back him. Talk about the brand equity, because Mark Zuckerberg changed the name of Facebook to Meta, but they kept Facebook. They kept Instagram. They kept these important things. Can you talk a little bit about the brand change? Yeah, they're totally different, because Facebook kept a consumer-facing equity. It just changed the corporate name, which is really for employees. Google did the same thing. Google with Alphabet. Right, to Alphabet. So you could kind of justify those. The first thing is, just to acknowledge,
Starting point is 00:24:38 Elon Musk has built or been the driving force behind two of the most ascendant brands of the last decade. First, Tesla, and two, SpaceX. And he did it in very innovative ways. He didn't use advertising. He actually used Twitter as his primary vehicle. You know, the guy, no matter what, whether it's something stupid, something innocuous, something controversial, something repugnant, he's figured out that if I'm in the news every 48 hours, my brands will have global awareness. And he also, to his credit, built amazing products, incredible breakthrough innovation, even things like opening a store in a mall for Tesla rather than a dealership on the outskirts of town. He has proven to be one of the most thoughtful, seminal brand builders in history. Changing, doing away with Twitter to X will go down as absolutely one of the worst brand decisions in history. Probably a third to 50% of the world's population know what Twitter is and they recognize the logo. And about 97 to 99% of all money spent every day is on brands you've heard of before. You're not going to return the email of someone you haven't heard of. You're not going to buy a tennis shoe you haven't heard of before. So just awareness is a massive asset. And it's also really expensive to build. Also, the logo, it really connotes something very distinct.
Starting point is 00:26:00 You know exactly what it is when you see the bird. If you were to try and replace this and say, all right, we want to have a logo that everyone recognizes, they know exactly what it stands for. It has actually quite a few, if not positive associations, relevant associations. It's differentiated. It's relevant. It has moats. And half the world knows it. If someone said you have a decade and $10 billion, can you do this?
Starting point is 00:26:27 You wouldn't be sure you could do it. So at a minimum, he's taken $10 billion worth of brand equity and taken it into the street and lit it on fire. And what it says to me is, I mean, I'm not, I believe in billionaires. I believe in capitalism. I don't like the idea of a wealth tax. But when I was thinking about this over the weekend, this really is another signal or indication that income inequality has gotten to a point where an individual in his spare time can make a dumb decision to buy something he doesn't want for $45 billion, use it to not wreak havoc but create a lot of tumult. I know there's a lot of stans out there, but so far Twitter in the last six months has not been accretive for the world.
Starting point is 00:27:08 And then make a decision like this, which means there's nobody on his board that he listens to. Typically to buy a $45 billion company, you would have investors and other people that would weigh in on a decision like this. And there is no rational justification for this decision. No, he just decided to do it on a Saturday. It feels like that. Linda just tweeted, X is here. Let's do this with this very unattractive logo. It looks crypto. The design looks crypto. It looks uber male, but in a bad way. It's like a flag I don't want on my car or don't want to be near people with this flag on their car. It feels very...
Starting point is 00:27:46 It feels like Confederacy. Yeah, exactly. It smells like Confederacy. Yeah. Yeah. Oh, wow. It's not only that the design looks bad. It just feels very...
Starting point is 00:27:57 And they're talking about bringing all these things together, but where are all the things that are bringing them together? Why not just have the bigger corporate thing? I mean, you know, it's interesting because Walt Mossberg wrote, companies change names all the time, but popular products, rarely. The words tweet and Twitter have become part of the language. So the brand equity Musk is giving up is huge. I know he wants to add many functions and services too, but why did it require a name change? And for some reason, JR, who is usually intelligent, said, when you see how successful Alphabet and Meta have been with their name changes, you have to do it. He said they were corporate name changes, not product name changes. Alphabet didn't change the name Google. Meta didn't change the name of Facebook. Yeah, I don't know why. I don't know Alphabet. No one's saying that. He could have created his own financial brand or Twitter money or whatever.
Starting point is 00:28:35 And he also was saying that it's going to be like most of the global financial network. I just was like, are you high? And then I thought, well, it's probably. The answer is yes. Yeah. Yeah. From a brand decision. And it goes back to hubris, and it goes back to not enough of these billionaires have guardrails in the form of other investors, much less regulation.
Starting point is 00:28:53 Anyone who had anyone around them who had any credibility or any authority would say, we need to rethink this decision. Right. Because if he put the brand up for sale and just said, you have access to the logo and the name, someone would probably pay a billion dollars plus for it.
Starting point is 00:29:10 Right. You don't take this kind, this type of equity takes decades and billions. What's the argument? You know, this is sort of a shoot the moon kind of thing. Like someone was like,
Starting point is 00:29:20 I appreciate him shooting the moon. I'm like, why? Why shoot the moon? I don't even understand it. I mean, does it get rid of like the badness around Twitter by calling it X? I mean, I know he loves the letter X. Fine, whatever. He called one of his kids X.
Starting point is 00:29:33 One of his early companies was called X. I get it. You like the letter. It's great. I'm fond of the letter C. I don't know what to say. But the branding discipline is really not strong here. This isn't shoot the moon. it's shoot yourself in the foot.
Starting point is 00:29:47 And it's so clear that there's nobody home here. The site is a mess. Clearly, it just wasn't rolled out well. The name changes very rarely work because usually it's a company trying to escape something. There's a disaster, an air disaster in the Everglades. They merged with another company. You know, Norwest changes its name to Wells Fargo
Starting point is 00:30:10 because it's a better brand in M&A. But trying to like, but to take, I've never, I don't think we've ever seen this before. You could argue maybe a little bit with HBO to Max. Yeah. But to take a brand that's globally known and turn it to X and do it kind of incrementally, like, oh yeah, change that now. And oh wait, I want the, he's asked users to design it.
Starting point is 00:30:33 It feels like you gave a 16-year-old boy a couple hundred billion dollars and said, have at it with these decisions. And maybe that's part of his genius that he has no guardrails. I don't know. But this one, every brand strategist, every head of an ad agency, any academic in the world of marketing is going to look at this and cock their head and go, they'll wait for a minute because a lot of the stuff he does ends up being not only just crazy, but crazy genius. But this just seems crazy. And it doesn't, it feels like it's not very well thought through. And he's decided, this is my toy and I like the term X. And Linda Yaccarino putting out these, like you said, this word salad trying to, I feel like she's the circus clown behind an elephant scooping up shit every 30 seconds.
Starting point is 00:31:19 Yeah, I do too. It just feels, anyways, I'm sad. I wish he would put it up for sale because I'd love someone to buy it. I think it's a – it's a global brand. I would bet it's one of the top 50 most recognized logos in the world because this logo has been at the bottom of every media company on TV because it says – Well, they're talking about them, right? They're talking about it. Well, he always manages to do that.
Starting point is 00:31:41 Yeah, even as, you know, the threads numbers are going down, because that's what happens in these cases, the usage numbers. Although I'm using it more, I have to say. I guess he just wants to put his name on it. I don't, I'm trying to give the nicest case scenario. In other words, he just thought of it because he was, he had a bad trip one night and said, let's do this. And he likes the letter X and he's rich. Well, but that's my point. Honestly, when people can just sort of buy $45 billion media companies and then start retweeting conspiracy theories and then just go,
Starting point is 00:32:13 Well, but that's my point. Honestly, when people can just sort of buy $45 billion media companies and then start retweeting conspiracy theories and then just go, oh, let's just call it X. And I'm like, okay, we have gotten to a point where some people just have too much money. And when they have no guardrails and they can buy big companies like this and make these sorts of decisions errantly based on their blood sugar level and this idea they have, and clearly this just has not been thought out. It's clear most of the people that come to them have. Is there, will people forget it? Because no one likes Meta still. I mean, we just sort of go along with it and people still call it Facebook. What positive could it be that he gets to start fresh? Look, I like how you try to, you're a journalist and you want to see both sides of this.
Starting point is 00:32:52 There's not a lot of positive here because it's yet another opportunity for people to sort of abandon it. Yeah. To say, I don't understand it. I don't get it. I'm going to try this threads thing. But name changes very rarely work. The reason most companies go do a name change is in an acquisition, they acquire a
Starting point is 00:33:10 company that's stronger than them with a brand. You know, Norwest, a company known for mortgages that was a very healthy bank, acquired Wells Fargo. And they're like, Wells Fargo is a stronger brand. Dean Witter acquired Morgan Stanley. And they're like, you know what? Morgan Stanley is a stronger brand than Dean Witter. We're going to go with this one. Or someone says, our corporation, everybody hates us. Let's call the whole thing meta. And I think the metaverse is going to be a big deal. But Mark Zuckerberg was smart enough to go, the people who use Instagram do not want a different logo or a different name. We're not going to risk that franchise. And that's what he's done here. Just the downside here is exponentially greater than the upside.
Starting point is 00:33:44 Yeah, I will read from Lou Paskalis, a very well-known ad person for years and years and still is. A very close friend of Linda's, by the way. Latest in the unending series of idiotic edicts from Elon Musk announced last night and uninformed by any user insights. Worst of all, his all the birds comment is likely to reinforce the perception among the few remaining Twitter 1.0 team that they're unwanted. The Twitter blue bird logo is beloved, ubiquitous, and has nearly 100% unaided awareness globally, something that most brands never come close to achieving. Virtually every newscaster, reporter, and byline features the logo exclusively, giving the brand millions in free marketing. I think he's right. I think he has a very good point. No brand discipline whatsoever and just shitting on what is, even though they have troubles, it's a great brand. Great brand.
Starting point is 00:34:36 I'm literally going to take $10 billion plus, it's impossible to put a number on it, and just set it on fire and give people yet another excuse to think, well, maybe I'm just done here. I don't get it. He's got a vision for X, he likes it, he's allowed to do it. This will go down as one of the strangest moves in the history of brand strategy. It is very hard to justify this.
Starting point is 00:35:04 Yeah, I'll end with Rick Wilson. Desperate rebrands are inherently weak strategy when it's a product itself that needs the upgrade. Elon has worked diligently to make the site less fun, interesting, and easy to use. Grunting out my first amendment on a private platform is like screaming Hanseatic League at the Denny's. You get weird looks, but they're not connected issues.
Starting point is 00:35:22 He can name it whatever he wants, but it diminishes an already wounded brand. So I think he's right. The product matters. The product matters. Okay, well, that's that. That's really smart. Linda, call us.
Starting point is 00:35:34 Seriously, you're much smarter than this. But I guess she has to go along. She has to go along. Not good. I thought she could get out of this cleanly, but I'm not so sure. And by the way, one more point is, I'm not giving this guy my credit card after these kind of decisions. I mean, there's no way in hell I'm going to do financial stuff with him. But maybe the rest of the world is, as people point out. This isn't
Starting point is 00:35:53 having as big an impact anywhere but the U.S. But the U.S. is its most important market. Okay, Scott, let's go on a quick break. When we come back, Big Tech agrees to AI safety commitments, and we'll speak with a friend of Pivot.
You might picture a scammer as someone kind of typing away in the middle of the night. And honestly, that's not what it is anymore. That's Ian Mitchell, a banker turned fraud fighter. These days, online scams look more like crime syndicates than individual con artists. And they're making bank. Last year, scammers made off with more than $10 billion.
Starting point is 00:36:46 It's mind-blowing to see the kind of infrastructure that's been built to facilitate scamming at scale. There are hundreds, if not thousands, of scam centers all around the world. These are very savvy business people. These are organized criminal rings. And so once we understand the magnitude of this problem, we can protect people better. One challenge that fraud fighters like Ian face is that scam victims sometimes feel too ashamed to discuss what happened to them. But Ian says one of our best defenses is simple. We need to talk to each other.
Starting point is 00:37:21 We need to have those awkward conversations around what do you do if you have text messages you don't recognize? What do you do if you start getting asked to send information that's more sensitive? Even my own father fell victim to a, thank goodness, a smaller dollar scam, but he fell victim and we have these conversations all the time.
Starting point is 00:37:38 So we are all at risk and we all need to work together to protect each other. Learn more about how to protect yourself at vox.com slash zelle. And when using digital payment platforms, remember to only send money to people you know and trust. Thumbtack presents the ins and outs of caring for your home. Out. Indecision. Overthinking. Second-guessing every choice you make. In. Plans and guides that make it easy to get home projects done.
Starting point is 00:38:12 Out. Beige. On beige. On beige. In. Knowing what to do, when to do it, and who to hire. Start caring for your home with confidence. Download Thumbtack today. Scott, we're back. The Biden administration announced that seven tech companies have voluntarily agreed to a number of AI safety commitments. The companies include OpenAI, Google, Microsoft, Meta, Amazon, and startups Anthropic and Inflection. That's pretty much all of them, all the big ones. Much of the agreement revolved around testing new AI models to ensure public safety and also to avoid bias and discrimination.
Starting point is 00:38:51 Companies agreed to share information with each other about security risk and to implement bug bounty programs. They agreed to watermark AI-created images and audio, I think probably the most significant thing. The White House announced it's also working on an executive order and bipartisan legislation to codify rules around AI. What do you think about this? Just a press release? I think it's important. I'm a bit of a cynic, and I hope it's not just an attempt to prophylactically avoid real regulation. And they're calling on their better angels and realize that it's probably a good idea for us to share this. The thing I like is the idea of a watermark around AI.
Starting point is 00:39:25 Me too. And I caught up with a friend yesterday I hadn't seen in several years. Do you know who Daniel Lubetzky is? Oh, the guy from Kind Bars. Yes, of course. He's a lovely guy. Daniel's one of these guys that's always led with civic concern first. He's from a young age.
Starting point is 00:39:41 He started a company called PeaceWorks that meant to bring Israeli and Palestinian workers together to work on companies together, thinking that commerce was a great unifier. And he kind of, I don't want to say accidentally fell into kind, but building a billion-dollar confectionery company couldn't happen to a nicer guy. And I mean that literally. And, of course, I'm here in Aspen. He's here. One of the wonderful things, by the way, about this podcast is when I'm somewhere, people reach out to me because they hear about it on the podcast. He said, let's get together for a walk.
Starting point is 00:40:08 So we're talking in Aspen, and immediately we talk about 230. Jesus Christ, in Aspen, talking about 230. I think that's the whitest thing I've ever said. Anyways, and I would like, I think there's some potential here. I wonder if you really wanted to have teeth here. First off, there needs to be carve-outs from 230 around certain topical domains, whether it's elections or health. I think there should be, in addition to sex trafficking, which has been very effective, there should be additional carve-outs. Another idea for a carve-out was that anything that is algorithmically elevated is effectively, you're serving as an
Starting point is 00:40:45 editor. You're no longer just a benign platform, a neutral arbiter. You've decided to elevate certain content. You have certain obligations around fact-checking. It's similar to every other media company, and that content becomes absolved or no longer has the shield of 230. Another interesting thing I think politicians should think about, and I was speaking to one this morning, is that if it's AI-generated, should that not have 230 protection, which would massively encourage investment to spot, screen, and filter AI-generated content and images on these platforms? Because this shit could get very scary very fast. And there's already been AI-generated images that have convinced people that the Pentagon is under attack, and we're just getting started. So, I like this. I think it's a step in the right direction. I hope they're not using it as a
Starting point is 00:41:31 head fake and a weapon of mass destruction. Yeah, that's my word. They have to pass actual legislation and give money and staff to these agencies, teeth to these things. And very, not say the, you know, we're talking about the merger guidelines last week, they're using previous law to try to get a stronger hand for themselves, these regulators. But they really do need to define this as a different and new technology. And I think they've done that earlier than before, but the stakes are so much higher in this thing, that they have to not just have executive orders, you know, you could see the right go after this, saying they're trying to tell these tech companies what to do and make woke AI, that kind of stuff. You can just see all the attacks that could happen here.
Starting point is 00:42:15 And, you know, in previous tech booms, they've done nothing. And social media and the Internet, if you want to use those, and I would put mobile in there. Very little done around privacy, very little talk and tackle. And my worries are going to do a lot of these announcements and then nothing, actually. That's my biggest worry here. But, you know, the watermarking, this is the kind of stuff they have to do to cooperate together, especially around public safety, bias, discrimination. And of course, to me, provenance is the most important part of this. We give it a, what, a thumbs up, a thumbs up, a medium thumbs up. Yeah.
Starting point is 00:42:47 Not as good as Barbie and Oppenheimer, but we- Cautiously optimistic. Cautiously optimistic. All right, let's bring in a friend of Pivot. Alondra Nelson was formerly the deputy assistant to President Joe Biden, acting director of the White House Office of Science and Technology Policy. She was the first African-American and first woman of color to lead U.S. science and technology policy in her role there. Among other things, she led the development of the blueprint for the AI Bill of Rights.
Starting point is 00:43:16 She is currently the Harold F. Linder Professor at the Institute for Advanced Study, an independent research center in Princeton, New Jersey, where Oppenheimer was, by the way, and also Einstein. Thank you so much for being here. Thanks for having me. All right. I want to start, Alondra, about your thoughts on this new agreement between the White House and major tech companies. I'm sure you're very familiar with it and the different things they went through. We just talked about it a second ago, a lot of around watermarking, AI images, but also other things. So give me a little rundown of how you think about it a second ago, a lot of around watermarking AI images, but also other things. So give me a little rundown of how you think about it. Well, let's say at the top that it's, you know, it's generally a good thing. I think the fact that it's voluntary, that we can, are justified in being
Starting point is 00:43:56 grumbly about that and wanting a little bit more. But there are some good things here. I mean, I think the high level good thing is that the White House has been for government kind of moving at a clip around this. So they've been really serious and things have been moving. That it's been happening since, you know, the October when I was in government in October of 2022, we did this blueprint for an AI Bill of Rights. And so, you know, the Biden administration has been well engaged in this for a while. And I think there are things that I think all of us would have wanted. So, you know, there's some fundamental things. I would point people to the actual commitment document. There are fundamental things in there, like companies have a duty to make their products safe. You know, it's like these are kind of fundamental assertions that are really important, that the companies have to earn
Starting point is 00:44:45 people's trust is what part of the commitments document says as well. And so I think we're at a place that's better than the social media moment at the same time. And I think that that's encouraging. It's an increment, it's voluntary, and there's a lot more to do. But I think in general, you know, it's good. So you said grumbly about voluntary. These companies, obviously, people would like to see laws in place that never happened with social media. These companies have the proper incentives to follow them because some of these could feel like a press release, right? As you know, you've been in government and that can happen. Yeah, I think we're getting to, I think we're getting there.
Starting point is 00:45:21 And I think that the fact that, you know, these seven companies even agreed to this, I think, suggests that they know that legislation is coming and that they need to sort of get in line, get information. But there's obviously lots more to do here. But they've agreed on things, like seven companies that are fierce competitors have agreed on things that are more than just principles, right? So we've got a level of granularity around red teaming that we haven't had before in the tech policy space, around basic sort of consumer privacy. At the same time, on the other side of this, we already have, you know, the FTC, other kinds of regulatory bodies saying that the law is the law and that you've got to protect consumers. So it's not like we're only waiting for the companies. You know, you've got, for example, the FTC saying we've got to protect consumers. You have CFPB, the Consumer Finance Protection
Starting point is 00:46:15 Bureau saying the same. So, you know, I think that we're getting to slowly, incrementally, a kind of perfect storm or a cauldron of things that will get us to a better place than we've been in the last decade. Right. But legislation is really where it's at, whether it's merger guidelines or anything else. Guidelines are one thing, but real law is another. Scott? Nice to meet you, Alondra, and thanks for your good work. So I'm curious where you are on the spectrum of people who think that AI is just another tool that we'll use, and it can be used for good means or bad means to the AI, what I'll call catastrophists who think this is the end of the world, that this is now beyond any point of turning back. Where are you, like, what kind of
Starting point is 00:46:58 existential threat do you think AI presents with relative to other technologies? It is just a tool, but it's a very powerful tool. And I think that there are potentially threats on the horizon. We know that there are threats that exist right now. So, we know that there are, you know, part of what I thought was great about the White House commitments document is that it felt like an attempt to kind of harmonize kind of various perspectives that have been emerging. So, there's, you know, societal risks, risks around discrimination, bias and harm and these sorts of things, in addition to thinking about bioweapons and cybersecurity and the risks that are sort of far term. So, you know, I think that's what's good also about the commitments document. I think for me, particularly because I've just come out of government in February,
Starting point is 00:47:50 I'm in the realm of the practical. So while I think that there's both near-term and far-term risks, we need to be able to do something about it. And I think it's not entirely useful to be living in the world of speculation and science fiction. What are things that we can do things about and what are we going to do about them? What about the idea, and we were talking about this earlier, what about the idea of any AI-generated content no longer enjoys 230 protection? You know, listen, I've heard that Ron Wyden himself, who was one of the co-sponsors of Section 230 decades ago, saying exactly that. So, you know, I think there have been, you know, I'm not a lawyer, but I think we already saw with algorithmic amplification happening in the social media space that the social media platforms were doing more than just hosting. They were becoming much more than just sort of these neutral things that were hosting information. And I think when you get to advanced AI, generative AI, in which they are dialoguing with people, having interactions with people, you know, the systems are in tools are sorting information that they're going to send out to people. And we'll probably see a whole panoply of companies that are sort of choosing different ways to sort the information or to create layers on top of, say, GPT-4. So there's a lot of, you know, I don't know if we call it editorial, but there's a lot of choice and selection and things being made here.
Starting point is 00:49:09 Yeah, but it's not clear whether they're covered. No, it's not clear. I think there'll be lawsuits that will determine that. I would also say, you know, from the sort of Biden administration perspective, even going in, you know, the president had said that he was looking at this and, you know, was suggesting that we needed to do a lot of reform, if not repeal in this space anyway. So there seems to be a lot of sort of energy galvanizing behind doing something finally. Or at least an explicit bill, I think it's Warner and Hawley, about that it's not, that it includes it specifically. So one of the things that you said, we had been talking about Oppenheimer a little bit, and along with Barbie and stuff. You noted he is a cautionary
Starting point is 00:49:52 tale about the risks of AI and the discussions that need to happen in advance. Now, you had done this with the Blueprint for an AI Bill of Rights, which was, I think, rather early, as you said. It was much earlier. I'd love your thoughts on why you decided to do it so early, and what are the biggest changes since it came out? Oh, great question. So, we came into office. I was a day one person. And, you know, the president and the vice president had already said that not only Section 230, which we've just been talking about, but, you know, harms to children, harms to young people, thinking about competition and antitrust, you know, issues about the threats of social media to democracy, these were already kind of front of mind for the administration.
Starting point is 00:50:35 And so the question then became, what is the proactive vision? So we've got a lot of things that we don't like, that we want to be critical of, and that we want to change. At the same time, is there sort of a proactive vision for what we want to do? So what, you know, what are the sort of rights protections that people should have in the best of worlds with these kinds of increasingly powerful technologies? And so that's how we went about the work. We spent a year talking to, you know, all sorts of folks, from researchers to companies to high school students. And, the part of me that's a teacher, we also wanted to produce a document that was
Starting point is 00:51:11 readable. So it's a long document, at 70-plus pages, but, you know, anybody can read it. It's a resource for people. I think it teaches people a little bit about AI and how systems work, teaches people a little bit about what you can do about it. So, you know, we introduced red teaming and auditing and risk assessment and these sorts of things. And so it was really to sort of say, if we're going to move ahead with, you know, a better vision, how are we going to do that? Also, I think it was really the White House kind of setting the table and setting the vision. So, you know, the document's been referred to as aspirational. It certainly is. Much like last week's commitments, it's voluntary, certainly true. But, you know, I think at its best, what the White House does and what the president does
Starting point is 00:51:54 is to kind of create a vision and expectation for American society at its best. And a lot has changed. So a lot has changed, but I think some fundamental things are true. So a theory of the case with the AI Bill of Rights is that technology is going to change really quickly. It's changing all the time. It's changing all around us. So what do we want to anchor on? Like, what are the things that we, you know, as we go from generative AI to AGI, as we go from a world in which we have generative AI and perhaps quantum computing that like ends encryption as we know it,
Starting point is 00:52:27 you know, there's all of these kinds of technologies happening about us. We can try to regulate the thing, which may not be the best idea, or we can try to regulate the outcomes or we can try to sort of level set on the society that we want. So whether or not you're talking about generative AI or quantum
Starting point is 00:52:46 computing, do we want basic privacy for the American public? The answer is yes. The question of, you know, what are the processes, the norms, how do we get there, I think, will be different. But I think that we need to sort of keep in mind, you know, the kind of holy grail is not AGI. The holy grail, the thing that we are guarding with our guardrails, is our democracy, opportunities for people, public safety, and things like that. And so, the theory of the case with the AI Bill of Rights is that we're going to keep our eyes on that and think about the regulations, norms, standards, practices we need to get there. So, Alondra, I'm always heartened when I hear from insightful and intelligent people such
Starting point is 00:53:27 as yourself who outline the issues really well. And then it all leads to a giant nothing burger. We've never had an industry this big with this lack of regulation. And so it strikes me there's a ton of insight, a ton of really brilliant, articulate people outlining the issue, and it results in nothing. Why have we not been able to pass anything resembling regulation on an industry that is now, by market cap, the largest industry in the world, arguably the most influential industry in the world? What are the forces that you and your colleagues are facing that have gotten in the way of us being able to enact any of these very thoughtful views? I think that you had a wonderful conversation with Senator Warner a few months ago, and he really nailed it, which is lobbying.
Starting point is 00:54:16 So you've got big industry, big tech, on the one side of their mouth saying, we need to be regulated, this is horrible what's going on, and on the other hand, you know, spending hundreds of millions of dollars every year lobbying against regulation. And so I think that's, you know, that's a fundamental problem, and government cannot compete with that in a fundamental way. I think the president has succeeded in doing some bipartisan things. He believes in it. He thinks it can work. I think the horrors of the last decade and of social media are such that people are exhausted.
Starting point is 00:54:51 They see that a lot of harms have been done. And, you know, I think that there's a space for bipartisanship around this. Like, people don't want children harmed and young people harmed. People understand that, you know, antitrust and competition might be a problem when you can do something like the Threads transition and no one sort of needs to have a conversation about that. Certainly issues around privacy. I mean, I think even, you look at the judiciary subcommittee, where Senator Blumenthal and Senator Hawley seem to be working really well together on that. So it does feel like a window of opportunity. I also think the, like, the somewhat panicked freakout over
Starting point is 00:55:32 the public release of GPT-4 and all of the kind of risk conversation has galvanized the public and gotten the public's attention in a way that hadn't really existed before. And then that gets legislators' attention. When the public is paying attention, legislators are paying attention. Alondra, where would you start? If you had a magic wand, would you start with privacy, age-gating, Section 230? Absolutely. Yeah. I mean, at this point, having been in government for two years and a bit, please get anything over the line. I mean, like, you know, nothing can hurt. Like, so if you do antitrust, if you do privacy, if you do protections for young people, none of these things are to the deficit of the larger goals that we're trying to reach here. But I think privacy is a key one.
Starting point is 00:56:16 And I would say, you know, to go back to Kara's earlier question, data privacy and data issues are the thing that's not in this White House commitments document. I mean, there's an assertion that privacy protections need to be strengthened, but we're not talking about the data piece, which is so important here. Yeah, I would agree. And one of the things is, of course, the private sector tends to be running the show for most of tech, as Scott said, and the government has lost its preeminent voice on the topic. But it's interesting. I was just looking. I'm on X or whatever we're calling it today. This guy, Martin Crowley said, if you're not using AI, you're falling behind. And then Sally Jenkins from the Washington Post
Starting point is 00:56:53 said, if you're using AI to write or create, you're plagiarizing. And so these kind of, I mean, like, what? Like, it's a really interesting back and forth going on, which I think people are surprised at. So, Scott, of course, and I don't think there should be a pause in AI development. How do you answer, as a government, the argument that regulation inhibits innovation? Because that's what you're getting here. There's a gold rush on in terms of investment and money and starting an AI company. No matter what you do, you need to learn it.
Starting point is 00:57:26 How do you make that argument that regulation helps innovation versus inhibits it? Because that's been the tech industry's go-to for far too long. Yeah. I mean, one of the ways you make the case is, I think, just a business case. So there was a, I think it was a Goldman Sachs report that came out in March or April that was trying to think about the market cap and what might happen, you know, as AI gets up and running. And they were anticipating, you know, something like a 7% increase in global GDP, you know, potentially sort of, you know, trillion-dollar kind of investment and productivity globally. But as I said elsewhere, what I was really struck by was, like, a lot of equivocating language. So there were a lot of words around uncertainty, and it could potentially do this, and it might do that, in ways that documents from big finance don't usually do. They're really
Starting point is 00:58:17 just often leaning into the hype cycle and, like, let's go with it. And so I think the uncertainty around the technology, the hallucinations, like, you know, is it going to work if I use it this time? Is it going to work differently if I, you know, put a layer on top of it and try to use it with a customer in another instance? I think that uncertainty around the regulation does not create a kind of rich, robust space for innovation. So there's a business case for that. I also think, as a researcher,
Starting point is 00:58:49 sitting here in my office in Princeton, where people have come to think about really hard problems, that the sort of challenges around the technology, how do you explain it, how do we get to a place where you can make sure it's transparent, and all of that, are real opportunities for innovation as well. So just the sort of intellectual puzzle, to be a nerd about it, is an opportunity for innovation as well. I think the rest of it, a lot of the rest of it are, you know, the lobbying narratives that
Starting point is 00:59:15 lobbyists tell us. And I don't think that there needs to be, you know, a natural tension between innovation and releasing safe, exciting, fun products that lead to productivity and don't steal people's livelihoods. Yeah, but government's so stupid, Alondra. Government's not stupid. I know that. I'm teasing you. It's bullshit. I know, I know. But I would also say, I mean, you know, this was my first time working in government, and, you know, like, the congressional staffers are really good. They are. They're fantastic.
Starting point is 00:59:47 And there's a lot of good people working in government. I mean, you know, I think what it is is that folks are tremendously outnumbered and outgunned. So you're outgunned by the lobbyists, and, you know, there's the sort of, you know, going back to the Reagan period, that sort of, you know, government's the problem, it's too big. And so, you know, offices are way too small, budgets are way too small for the work that has to be done. And I think if we want to really have nimble, agile regulation around big tech and some of these other fast-moving industries, we really need to revisit that conversation. I mean, not that you should have excessive budgets, you know, for government salaries and the like, but you need people to do the work. Yeah. Scott, last question.
Starting point is 01:00:30 Yeah. So just to reinforce, anybody who's exposed to people in DC, up and down the supply chain, is, I think, universally surprised at how hardworking and how smart the people are and how committed they are. But let me make an ageist comment. They're also really old. Our average elected official is 63, meaning for every 40-year-old elected, there is somebody who's dead. And I'm ageist around technology. I think younger people have an easier time grasping new technologies. I feel it myself. And my question is, one, do you agree with that, that part of the problem is our elected representatives are just too old to really grasp these new technologies? And two, how can we solve for that, if you agree with it?
Starting point is 01:01:13 As someone who's closer to 63 than not and is increasingly finding myself turning to, you know, a graduate student or a niece or nephew, being like, how do I do this thing? You know, I fundamentally agree with you. But I also, you know, folks have young staffers. I mean, like, you know, I think it's not entirely incorrect, these stories that say 20-year-olds run Washington, you know. There are a lot of quite young staffers who really do get it. I think what you do have is a deficit of attention, particularly with legislators. They've got these huge portfolios, they're running, you know, local offices, and they've got a lot of
Starting point is 01:01:49 expectations. And they don't always snap to attention on the things that they need to. And so I think if you think about the difference between what I thought, you might not agree, Scott, but what I thought were like, really smart, sophisticated questions at this hearing that Sam Altman was at. I mean, there was a lot of celebration and praise of Sam as well. But, you know, relative to hearings we had even five or six years ago where people were saying, like, what are the interwebs and how do I, you know, just like fundamental things. How do you make money? Yeah. Certainly, I would make a pitch to folks who are listening that young people do need
Starting point is 01:02:25 to go into government, and it's pretty important, and run for office. And I think I'm encouraged to see young people running for office as well. Thank you so much. I wanted to bring you on for a long time. I think you did an amazing job with that Bill of Rights. And it was way ahead, way ahead in so many ways. And I really appreciate it. Well, we tried and worked really hard. I know. You did. You did a great job. Anyway, I really appreciate it.
Starting point is 01:02:50 And thank you so much for being here. Thank you, Alondra. I'm a big fan. Thanks for having me. All right, Scott, aren't you impressed? I think Alondra's, every time I talk to her, I feel smarter. Yeah, it's, I mean, what I said, people, when you get to this level of government, what you find is remarkably intelligent, committed people who have a lot of options and decide that they want to go to work for their country on both sides of the aisle.
Starting point is 01:03:12 It's very heartening. Absolutely. It is. Anyway, one more quick break. We'll be back for wins and fails. As a Fizz member, you can look forward to free data, big savings on plans, and having your unused data roll over to the following month. Every month.
Starting point is 01:03:34 At Fizz, you always get more for your money. Terms and conditions for our different programs and policies apply. Details at Fizz.ca. Okay, Scott, let's hear some wins and fails. Well, the obvious fail is what will go down as the weirdest rebranding in history. Is it the weirdest? I don't think you've ever seen a brand with this global awareness and, quite frankly, positive associations, this singular, this differentiated, just be like, overnight, we're calling it X.
Starting point is 01:04:04 I think this is new. Yeah, their best hope is the rest of the world doesn't care. That is a business fail, and it sort of flies in the face. I mean, if he pulls this off, quite frankly, we're all going to have to rethink brand strategy. Right, yeah, we can do that. Basic tenets of it. That maverick. Everyone will go, that maverick. He's such a maverick. But also, I really do think we're at the point, and I don't like class warfare against billionaires. I do think this indicates that power makes you more stupid. Everybody needs guardrails, and corporate governance is important. And this isn't going to damage the world,
Starting point is 01:04:36 I don't think. But it's clear that he has absolutely nobody around him that he listens to. And I'm now at the point where, okay, when we have five people worth more than the bottom half of America, we probably have gotten out of control in terms of what it means for people with this, what they can do with this kind of money. Anyways, that's my fail. My win is I think every high school senior should be, I don't want to say forced, but I think every history teacher or science teacher should do what my teachers used to do in the fourth and fifth grade when they were terribly hung over, and that is just put a film in front of us for three hours. I do think Oppenheimer, I thought the movie, I just think that is such a difficult story to tell with that kind of
Starting point is 01:05:20 insight and that kind of visual. It was just so right. I love that it had no CGI. I love that he wasn't wearing tights or a cape. I love that it wasn't a sequel. I love that it isn't a movie franchisor. We're not going to have Oppenheimer 7. My 12-year-old is asking me all sorts of questions about war and about arms control. And I think that we have lost a sense, because the majority of America now has never been drafted. The majority of America has never known an economy that has really, really been disrupted through hyperinflation. We've never had an existential threat to our physical safety from a competitor. We had 9-11, but I don't think most people in the nation were actually physically worried about their own safety. And I think revisiting history is just so powerful when it's done well. It is. You know, I think one of the things I think is interesting is, and you will see, Barbie, it's actually a very smart movie.
Starting point is 01:06:21 And so our up and down version is clever. People say it's clever. I've heard it. It's really clever. You'll see. It's got a lot of depth to it and very smart. Audiences are just smarter. They know what they like.
Starting point is 01:06:29 They've been attracted to intelligence in both these movies. And so that was what I found. And Barbie set off a great discussion, except for Ben Shapiro and, I guess, Piers Morgan, about women and men's relationships around plasticity, around corporate power. It was really interesting. There's all kinds of things to chew on in their feminism. And Oppenheimer, obviously, so much stuff about compromise and ethics and science and et cetera. Audiences like smart stuff, and they are smarter than us, the media, or the people who make these movies, honestly, I have to say. I find audiences always tell you exactly what the right thing to do is.
Starting point is 01:07:09 I don't know. I think the average American is of average intelligence. I don't know. I'm always surprised. You sound like a politician right now. I don't. I think they're smarter. I think they're smarter than you think.
Starting point is 01:07:18 In any way, I'm going to do my words in case. I can prove they're of average intelligence. All right. Okay. I like them better than many people. My fails, I'm toggling between the Israeli parliament approving this judicial overhaul that hundreds of thousands of people are protesting virulently in the streets, which is what stopped it last time.
Starting point is 01:07:38 But they went out and passed it anyway, saying they needed to correct judicial overreach. They don't have a written constitution in Israel, so they can't, like, lean on anything. Just really, Benjamin Netanyahu needs to move along. But he passed it despite the obvious protests happening, very significant ones, including members of the military protesting against it. And then I don't know if the failure is worse with Ron DeSantis pushing the idea that
Starting point is 01:08:05 black people benefited, had benefits from slavery. I just don't even know what to say about that, and then he had Nazi imagery. He's going to change the name of Florida to X. Talk about something that just does not, if there was ever more reason to, like, check back when people ask you where you're from and you have to say Florida. Yeah. He wants to whitewash slavery in history classes. And for a positive, I was happy to see all the people in the theater.
Starting point is 01:08:39 I loved the enthusiasm. And I know you could say Barbie. It's all about consumerism. My son was saying this about consumerism. You are obsessed with Barbie. Just go see it and then we can have a normal discussion. Anyway, we'll see. I like the community aspect of not just Barbie, but going to the theater. I think the Barbenheimer thing, even though it's exhausting and finally tiresome, eventually is great. I just loved it. I like the
Starting point is 01:08:59 whole thing and I don't care if it's commercial and sells tickets. I don't care if it's dressing up and buying plastic dolls. Very lovely little moment in marketing, I think. Anyway, as opposed to X.com, which seems dark and glum and, again, smells like Confederacy. Anyway, we want to hear from you. Send us your questions about business, tech, or whatever's on your mind. Go to nymag.com slash pivot to submit a question for the show. Call 855-51-PIVOT.
Starting point is 01:09:24 Scott, I am from Barbie. You are from Oppenheimer. You went on about Oppenheimer just as much as I go on about Barbie. Not even close. That means you lack self-awareness. Everything for you, we can be talking about AI and you reverse engineer it to Barbie. Oh, you'll see. I didn't even have Barbies when I was a kid because it has a spirit I like. You know, Barbie, she only orgasms with a GI Joe. I'm just saying. I'm just saying.
Starting point is 01:09:50 I love that. I love that. Our producer, Lara, is just saying, untrue, Barbie orgasms with other Barbies. Hi, Barbie. Hi, Barbie. You'll see. Anyway, Scott, that's the show. We'll be back on Friday for more.
Starting point is 01:10:03 Why don't you read us out? Today's show was produced by Lara Naiman, Taylor Griffin, and Travis Larchuk. Ernie Injitad engineered this episode. Thanks also to Dubrow, Emile Savario, and Gaddy McBain. Make sure you're subscribed to the show wherever you listen to podcasts. Thank you for listening to Pivot from New York Magazine and Vox Media. We'll be back later this week for another breakdown of all things tech and business. Have a great week, Kara.
