What Now? with Trevor Noah - Dex Hunter-Torricke: Translating the Titans of Tech

Episode Date: April 9, 2026

In this episode, Trevor sits down with author and strategist Dex Hunter-Torricke, who has spent years behind the scenes with some of the most powerful people in tech, including Mark Zuckerberg, Elon Musk and Eric Schmidt, and has seen influence move from institutions into the hands of a few companies and the people running them. Together, they explore what that shift feels like from the inside, how much power is concentrated at the top, accountability and the lack of it, and what it means when the people shaping the future are also writing the rules as they go. Hosted by Simplecast, an AdsWizz company. See pcm.adswizz.com for information about our collection and use of personal data for advertising.

Transcript
Starting point is 00:00:00 The verdict is in that landmark social media addiction trial, Meta and YouTube found liable for a product that harms the very children who use it. Big Tech, your gig is over. They do it by design. They're trying to keep us all scrolling. It's bad for us. It's really bad for kids. I'll tell you this. If the jury had returned a no, the champagne corks would be popping in the boardrooms of Google and Meta.
Starting point is 00:00:25 And for parents, we now know that they were manipulating our children for profits. Every kid with a cell phone, with any device with a camera and a connection to the internet, is in danger. This is What Now with Trevor Noah. How do I pronounce your name, by the way? Dex, and then Hunter-Torricke. Oh, Torricke. Okay. So the E, no, it's not a... Yeah, yeah. Okay. Where's the last name from? Scottish. Weirdly... Scottish? Yeah, so my dad's side of the family went to India in like the 19th century. They were Scottish and they were part of the East India Company.
Starting point is 00:01:17 So they were like the working class Scots who made up like a huge part of the East India company. And then that's how the British Empire went and conquered places. So I think they were like merchants or like soldiers. Yeah, yeah, yeah, right. It's such a hazy bit of history. And then there was of course just this one like very elderly aunt, you know, in her 90s who like told me like the family history.
Starting point is 00:01:38 She was the only person who'd like mapped this thing out. Yeah. She was the only one who knew. Yeah. Yeah. The Dutch East India Trading Company. Yeah. What a story, man.
Starting point is 00:01:47 Yeah. What a... It's funny. It's... Sometimes it feels like we're living through new times all the time. And then you read like a history book or you read like, you know, an old story that's being told about something. And you go, huh. It feels like everything is the same just with the technology of the day.
Starting point is 00:02:08 And a different cast of characters. Do you know what I mean? Yeah. Just this revolving door. Yeah, because when I think about like a lot of the work that you're doing now and the conversations you're having, some of the conversations we have about tech feel very similar to the Dutch East India Trading Company. You know, we're talking about an organization that got so big that had the power to shift governments. Yeah, had its own armies.
Starting point is 00:02:32 Yeah, it had its own armies. It changed how people lived their lives. It started living, it sort of was lawless, had shareholders. I think it was the first publicly traded company ever. And because of that, it just shaped the world in many ways that weren't necessarily for the better. Yeah. And it's really funny because now when you talk about the future and the fact that you've got this immense concentration of power. Yeah. And you then try to figure out what could that look like 100 years from now. And people say, oh, well, obviously, like this is all just dystopian nonsense.
Starting point is 00:03:07 You're not going to have these mega corporations, like ruling things. It's like, we already had that. We've literally come from that. We've literally, literally come from that. But let's talk about your story in this. You know, I remember the first time your name came up in my world. I was reading something about the tech industry. And I've always been fascinated by how the tech industry has evolved.
Starting point is 00:03:30 You know, I've met many people involved in tech founders of some of the biggest companies, founders of startups still trying to get their thing going. And there's one thing I'll say, for the most part, every single one of them seemed to have, in Silicon Valley in particular, an idealistic view of the world and how they planned to make it a better place. You know, they were like, I've got this app,
Starting point is 00:03:55 and it's going to make accommodation better for everybody, and I've got this app, and it's going to make getting a car easy for everybody, and I've got this app, and it's going to make restaurant deliveries easier, and I've got this app, and it's going to improve how people connect with loved ones, and I've got this app, and it's going to,
Starting point is 00:04:11 and everyone came in with this wide-eyed optimism. But I'd love to know how you started in this world. Like, what was your original journey? How did you get into Silicon Valley? You know, and then we'll get to the world of you, like, knowing Mark Zuckerberg and, you know, Elon Musk and all of these people. But let's start at the beginning.
Starting point is 00:04:26 How did you get in Silicon Valley? I was obsessed with changing the world from a very early age. I was a very weird kid. And part of it was just, I was growing up in the UK in the 80s and 90s, and my dad was the, you know, he was the person who was a refugee as a kid. You know, he was four years old when he was a refugee during the Second World War. He'd grown up in a refugee camp in India. Wow.
Starting point is 00:04:53 And he'd come to the UK in the late 50s, you know, total outsider, you know, probably a few pounds in his pocket. You know, society was deeply racist. and, you know, his life was just terrible. I didn't learn about it until many years later, actually, near the end of his life, you know, when I was 30 years old and I was working in Silicon Valley at that point. My mom, she was an immigrant from Malaysia, came to the UK in the 70s to be a nurse in Britain's National Health Service. And growing up, we had absolutely nothing. There was no money. You know, I grew up mostly in the southeast of England, in this town, Tumbridge, and it was sufficiently provincial at the time that, When me and my sister walked down the street, people would literally stop and stare because they'd never seen people who weren't white.
Starting point is 00:05:39 You know, kids would come up to you on the playground. They'd say, can I feel your hair, mate? I always remembered, like, kids would ask, do you speak Chinese or Japanese? And I thought, I was born in Hammersmith. I mean, like, how likely is it that I speak Chinese or Japanese? But, like, that was annoying enough for me that, like, from this very early age, I was, like, interested in, like, why we even assign these ridiculous labels to people. And we make these assumptions, and they constrain our whole life. lives and like if I had spoken Chinese or Japanese, like, so what? Yeah.
Starting point is 00:06:09 And so like, I really wanted to like spend my life trying to figure out like, how do you improve these systems that shape your whole life from before you're even born? Why is it some kids get these amazing lives from the beginning? Why is it that others really are struggling the whole time against the tide of history? And the logical place to start there was at the UN. So I moved to New York in 2008 and I got a job at the UN. I had done research on the UN for my master's. I got a scholarship to go to university.
Starting point is 00:06:40 And, you know, I was the first of my family to get a degree. So I got two of them for good measure. I thought I will just, like, absolutely, like, work really hard now to, like, you know, try and rise through this system. And, you know, at the UN, I realized really quickly that it's not the sort of place that actually you can change the world most days. You're just trying to get through the day. That's a, that's, you're not the first person I've heard say this. And it's it's one of the most
Starting point is 00:07:05 depressing and frustrating things to hear people say is that they go work at the UN. I would argue most of the people who go work at the UN are smart, idealistic world-changing people who want to see things become better.
Starting point is 00:07:23 Yeah. You know, so it's everything from like the best lawyers, the best like accountants, the bed, like whatever it is. They go, I want to use these talents to help make the world. the better place. They get to the UN and they're like, oh, this place doesn't seem to move anything. But help me understand why. Like what is happening at the UN, or rather what is not happening at the UN that could be or should be happening? I think people don't really understand how poorly resourced and how weak the UN is structurally. Like the New York City Police Department
Starting point is 00:07:56 has a budget, orders of magnitude larger than the entire UN globally. And like, like, Like, that's shocking to a lot of people. The UN has hundreds of thousands of peacekeeping troops in the field worldwide. They feed millions of people globally through their aid programs. Yeah. It's an organization that's supposed to be at the center of international law, stopping conflict. But, like, ultimately, the UN is just a vessel for its member states. You know, it's for 190, several, you know, countries.
Starting point is 00:08:24 Yeah. These are the folks who call the shots. And it's really a very small number who really call the shots. It's the UN Security Council. Right, because they have veto power. Vito power. It's the UK, France, the US, Russia, China. And I think a lot of countries, a lot of people in the world would argue, should that be the club that gets to have a vote? You know, the deciding vote, not just a vote. The deciding vote on all manner of things that shape the entire world and all of our lives and the course of conflicts and the fate of nations. And that's why the UN fundamentally doesn't work. It has a structure designed not to save the world. It was designed to stop us having World War III. It was designed to freeze the international system in 1945. That's literally what it was for. People misread the original functioning of the organization and I misread it. And that's why after, you know, I was at the UN for only like
Starting point is 00:09:13 two, two and a half years, every day was deeply frustrating because I thought, there's a problem, there's a crisis, there's a disaster. Why can't we fix these things? And the answer was structural. It's not, it will never be in position to do those things probably. Is that, is that because fundamentally, you can't have a body that is ruling over autonomous bodies. Like, I mean, you have the UN, which is a club for countries, but countries already have their own laws, they have their own rules, they have their own everything. So how do you have a club telling other people what to do?
Starting point is 00:09:46 At some point, every country just goes, no. I mean, throughout history, nations have agreed to compromises, to alliances, to international institutions, that do constrain their actions in some. ways when they believe it's ultimately in their national interest. So obviously you have things like the European Union, you have NATO, you have the oldest international organization, the international postal union, you know, literally the rules of how you get post delivered from one country to tell us.
Starting point is 00:10:10 So like we have all these things, right, air safety. But like the UN is something that was crafted for a very narrow strand of things. And over time, you know, the nature of the actors behind that has shifted enormously. And of course, we're in a moment now, right, where the United States, which had at least a rhetorical commitment to advancing, you know, political liberalization of the world and free markets. It has very clearly chosen to take a different path, which I'm sure we'll get to you. But after the time I was at the UN, I thought, this is not going to work, at least not for me. I don't feel I'm having an impact.
Starting point is 00:10:43 Where can I go? Tech seemed the obvious place to go, a place which has lots of resources, and technology clearly is one of those other motors of transformation in the world. It's something that should be deeply hopeful and about fixing a bunch of these problems. So what did you study in university? What are the two degrees? I did politics. Okay.
Starting point is 00:11:01 And then for my master's, I went and did Russian studies. So I actually... Russian studies? Yeah, I focused on Russia's foreign policy in the world. Wow. Which was already deeply destabilizing. Wait, wait. Of everything you could have studied, why that?
Starting point is 00:11:12 I was living in Europe. I went to do my undergrad 2003. Yeah. And this was just before the European Union and NATO had expanded to the east. Okay. And Russia was about to become the bordering state. And I immediately had questions about that. You know, I was a teenager.
Starting point is 00:11:27 I thought, we've got the world's largest country with a very patchy track record on human rights and rule of law and democracy. And it's about to become our next door neighbor. And surely we need to prepare to understand how to deal with that new strategic challenge. And it was interesting because this was pre- Putin, you know, starting to march into other countries. Yeah. You know, 2008 when I arrived at the UN was when he was in power by this time, though, right? Oh, yeah. He took power in 2000.
Starting point is 00:11:55 Yeah, right. And, you know, he was still seen as an ally of the West. You know, he would get invited to all these, you know, summits with, you know, Blair and Bush and so on. Wow. You forget that that was a time. Yeah. And a lot of the students who were on my program at Oxford, these were kids who were
Starting point is 00:12:09 apologists for Putin. It was the wildest thing. What do you mean? In what way? You'd literally get into debates of them and they'd say, oh, he's making Russia strong again and he's a, you know, he's a great ally of the West. And, you know, anyone who, like, questions that entire thesis is just anti-Russian. And it was like, I love Russian literature.
Starting point is 00:12:26 If you want me to talk about Russian literature and food and all of that, we can do that. But that's not what we're talking about. We're talking about the fate of nations. And of course, we got to a moment when suddenly people realized that all the things that we had sort of swept under the rug in the name of just having a nice cozy alliance, having Vlad Putin at the table in the club, all these things were partially meaningless. There were a different set of interests at work. So here's something I've always wondered about Russia. is like, is Russia acting or are they reacting?
Starting point is 00:13:02 And what I mean by this is, you know, when you look at the Cold War, which you studied, I mean, clearly, like all of this, there are parts of the Cold War where I go, huh, it feels like Russia is reacting to what America is doing. It feels like Russia is reacting to its perceived, like, threat or it's worried about, like, bombs or not having, you know. And there's, there's moments where like Russia is like bluffing throughout history, which I find like, if it wasn't on a geopolitical scale, it would be great comedy. Sure. Because Russia's constantly bluffing. We've also got a bomb. Oh, we've got a bomb just as big as yours. We've got a military just as good as yours.
Starting point is 00:13:39 And there's all these moments where they're bluffing, bluffing, bluffing, bluffing, bluffing, bluffing. You find these moments where it comes to a head and then it simmers down. And then from the inside, Russians seem to go like, yeah, when you look at it in hindsight, they go, we actually never want. wanted any of that. We never wanted the fight. You know, Gorbachev being like, I didn't want that shit. Come on, man. As somebody who studied it, and you know, you've been at the UN and all of that is, how do you perceive Russia's reaction to the world? You know, like some people would go, well, Russia was always going to react to NATO being in its backyard, the same way America would react to another country's force being in its backyard. How do you process that having studied it, actually?
Starting point is 00:14:22 I mean, I think it's possible that a nation as complex as Russia is both being reactive and proactive, both at the same time, and has a very divergent range of interest that might explain all the things it's done over the last couple of decades, right? I mean, there are fascinating historical debates now about were there moments in the lead up to the end of the Cold War and right after where you could have dramatically shifted the path that Russia eventually took. And there were moments where, you know, even after Vladimir Putin came to power, he, at least rhetorically, you know, explored, you know, very different scenarios for what a different security architecture could look like for Europe. People back in the early 2000s literally had discussions about could Russia join NATO, you know, which was something that, you know, Vladimir Putin literally, you know, he held events where he would talk about that as potentially a long range goal for Russia. But, you know, it's one of those things where, you know, scholars will debate this stuff, you know, actually. at the dawn of the Cold War, there were proposals from the United States to put all the nuclear weapons in the world under the control of the UN. And it turned out that wasn't really a sincere thing. Well, I didn't know. It was, in fact, them trying to call Russia's bluff because they knew Russia
Starting point is 00:15:35 actually would never agree to it. They would just want to have their own arsenal. And so it was just a way to get the US to have some nice PR. But, you know, it's something where I think this speaks to kind of the moment we're in, right? This actually is very relevant to some of the stuff I'm working on. There was no vision, really, for what a post-Cold War international order that really was inclusive, that would represent the world's people, that would stop nations simply descending into grievances and, you know, end up butting up against just Western hegemonic power. There was no plan whatsoever. And we're sort of still in that moment. There's a bunch of leaders now who really regret the state of the world, but they have no vision for what a real future could look like in many ways. They just don't. They just don't. want to go back to some idealized moment of the past. It's the 90s. It's 2007 before the economic crisis. It's 2015 before Trump. We're never going back to any of those moments. That is not the way to actually save, you know, societies. That is not the way to ultimately stop any of these problems. And so that is something that is deeply cyclical,
Starting point is 00:16:35 a bunch of elites constantly being off-balance. Yeah, we had, uh, uh, mea, Martley, the Prime Minister of Barbados on the podcast. And it was It was interesting because she really helped frame how the world and the UN are working through a unique lens. And it was that larger countries and more powerful nations have often dealt with a one-way sort of streets. It was like, we say and we do, and then that's it. We don't look for feedback. We don't seek a response. But now the world has changed, you know, whether it's because of global.
Starting point is 00:17:15 Whether it's because of trade, whether it's because of migration and immigration, smaller countries affect the bigger countries. You know, it's going to be climate change. It's going to be famine. It's going to be people are going to move around the world. The people who can't move are going to end up finding themselves in some of the worst places where then the worst things happen. You know, you look at ISIS, et cetera, what's happening in Libya and stuff like that.
Starting point is 00:17:39 And that affects the quote-unquote Western world. And you can't have it as a one-way street anymore. You can't just tell small nations what to do or who to be. Yep. You have to be in a dialogue with them. And I don't think the bigger countries are used to that. They're not. And, you know, the West was top of its game for a long time, economically, militarily,
Starting point is 00:18:01 very stable societies. It got to dictate terms of the world. To your point. And so many of the things that are unfolding now, where our economic might and our institutions are in decay and our place in the world is being questioned, These are things now where a lot of leaders, a lot of publics, who have grown up in that world where the West was at the top of its game, they think, oh, it's all a lot of alarmism to believe that things could really dramatically deteriorate. Fundamentally, this is all just a bit of turbulence and then things will go back to normal. And they don't seem to realize that the vast majority of people in the world have lived through everything that's unfolding now and much, much worse.
Starting point is 00:18:39 When people talk about doom or the breakdown of societies, that's happened to so many. people. It's happened in our lives in just the last few years, which is why, you know, there's now tens of millions of refugees in the world. Yeah. It's, it's one of those situations where it feels like for the first time in history, or rather in, for the first time in modern history, the major countries are experiencing what all the smaller countries have always been experiencing. Yep. You know what I mean? So, so, so you, you have this UN epiphany. You realize that nothing seems to be moving in the way that it could be moving, or at least how you thought it would be moving.
Starting point is 00:19:17 And then you decide to go into tech. But this is when around 2005, six somewhere there. 2010. 2010, okay, so it's much further on. All right, so 2010. Still very early for a lot of those companies. Yeah, yeah. Tech wasn't what it is now.
Starting point is 00:19:34 It was a burgeoning world of like maybiness. The dot-com boom still had like the effects where people I don't know, the bubble could burst. So it was almost like punk in a way, tech at that time. That's right. I mean, it's kind of wild to think back to that moment now because, of course, we only think about those Silicon Valley companies as big tech. That wasn't even a label.
Starting point is 00:19:57 And it was sufficiently quaint that, you know, there was a film being made at Google the first time I went to Google in my career. So I started at the beginning of 2011, and my buddy Jim, who sat next to me, he was the PR guy assigned to manage this film, which Vince Vaughn and Owen Wilson were making set at Google the internship, or like the intern. Oh, yeah, yeah, yeah. And it was actually like, it was like a rom-com set at Google. Oh, yes. Because you could pull off that kind of warm, fuzzy schick and people would go and see it because it was like,
Starting point is 00:20:23 oh, Google, it's this warm fuzzy company where everyone sits on colored balls and events the future. Yeah, and they have the food and everyone chills and there's no real offices. And it's just that. Totally. And like, Steve Jobs was, you know, still alive. You know, the iPhone had just been, you know, announced 2007. So it was like very recent, you know, Facebook wasn't even a public company now. And after I was working at Google the first time, I was waiting for Eric Schmidt, who was the executive chair, chairman. I then went to work for Zuckerberg. I went over to Facebook right before the IPO. And it was a tiny, tiny company.
Starting point is 00:20:51 It was less than 2,000 employees. It was only a few years past the point where everyone was just hurriedly adding all your friends, you know, as soon as Facebook extended your college. And, yeah, at the time, it was just sort of, we're making it up as we go. but it still feels like the early years of the internet when every day was a new discovery and things were deeply hopeful in a lot of ways. You felt like you were part of inventing this amazing future. Yeah. Connecting people and look, all the mission statements,
Starting point is 00:21:20 they sound brilliant, don't they? You're connecting the world. You are. Isn't that a brilliant thing? That's what we all want. Yeah. But then it turns out the world's really complex and a whole bunch of things that you're not expecting
Starting point is 00:21:30 will happen when you connect the world. We'll be right back after the short break. So when did it flip for you? It's funny, I was actually reading a book. Maybe you even know the person. Oh, I'm going to be so bad with this. Is it called like careless people? Oh, yeah.
Starting point is 00:21:54 You know it. I do know that one. Yes, yeah. And it's written by a woman who used to work at Facebook. I worked with her. Oh, you worked with her? Yeah. Okay.
Starting point is 00:22:04 Oh, this is fascinating. I'm in the book. Oh, that's wild. Oh, this is crazy for me. Wow. Okay. So I was reading this book. And the reason I started.
Starting point is 00:22:13 reading it was just because I was like, all right, I'm, I'm interested in tech. I'll read a whole bunch of things about what's going on there, autobiographies, et cetera. But then I read this book and it was really interesting to see an account of Facebook's evolution. You know, like, we like, we like, we're absolutely bad. Or no, it's absolutely good. And it's like, when you read this book, you realize there was a time when Mark Zuckerberg didn't see himself as anything other than a guy who was running a tech company that was trying to connect as many people as possible in the world. And the people who worked around him were like,
Starting point is 00:22:50 we're just trying to connect the world. And every time the word politics was brought out there, they were like, no, no, no, no, no, no. We don't want to meet with presidents. We don't want to know presidents. We just make tech that connects people all over the world. I didn't know that you were there for like that. This is crazy that I'm reading the book that you were basically a part of.
Starting point is 00:23:08 So help me understand what went wrong. a bunch of those technologies are things that did exactly what they were designed to do. The world did get connected. There's billions of people using all those products. There was a very, very simplistic narrative and understanding of what would unfold as the world got connected, as all those technologies rolled out. And a bunch of the good things, fantastic. tech can change the world.
Starting point is 00:23:46 But there's another part of what happens. The world changes tech. It doesn't just show up in this vacuum. We never tell that story. And of course, a deeply complex world with technologies that change the way billions of people live and work, every industry, every government, the nature of our relationship with each other, with our communities, that's something which it turned out. Very few people had any.
Starting point is 00:24:13 real understanding of what the full scale of those challenges would be, how interconnected they would be. And I think there's been a widespread systemic failure of leadership across societies. The tech industry absolutely has a lot of that responsibility. They built these things. They're things which they had the closest understanding of as they emerged. But the failing is absolutely not exclusive. We also had a absolutely historic collapse in leadership from political leaders. from a bunch of other people who have immense power
Starting point is 00:24:45 on the trajectory of those technologies and how our society should respond. If tech hadn't had all the effects it's had in the world, if you didn't have social media and smartphones and AI and so on, I think it's probably quite likely that we'd still be facing huge crises at this point. A bunch of these things are unfolding. And of course, AI is a thing which I'm very focused on
Starting point is 00:25:05 and is a big part of this. But part of the reason what I'm doing now is, in fact, not just about tech, really, is if AI vanished tomorrow, we would still have gargantuan problems. We would still be in really bad shape. It is the combination of the tech with a deeply broken world
Starting point is 00:25:22 that adds up to the state of dysfunctional run now. Yeah, in many ways, I like to think of it the same way I think of like a nuclear weapon. Humans will always fight, but a nuclear bomb means the effects of that fight are going to be so much greater than they would have been otherwise, right? I think the same goes for AI.
Starting point is 00:25:40 I think the same goes for social media. People who work in these industries will often say, we aren't the cause of this. People will always find a way to blah, blah, blah. And I'm like, yes, yes, I agree with that. You're not wrong. However, the tool that you've provided the people with has given them an outsized ability
Starting point is 00:25:57 to inflict damage or harm on society. And so I want to go back to like your early days. So when you first got into the tech industry, you know, like let's use Facebook. Because you know, Facebook is really like at the center of many of these conversations right now. So what were you doing at Facebook? I was a speechwriter.
Starting point is 00:26:16 So I was the company's first speechwriter. The company was just about to go public. It needs to communicate a lot more. And folks like Zuckerberg, a lot of the other senior leaders, you know, they had to talk to everyone from Wall Street to Publix, media. You were constantly doing events and engaging with other leaders, moving the entire company. know, this is an organization that grew enormously in the time I was there from a couple of thousand
Starting point is 00:26:43 people to tens of thousands in just the space for a few years. And all along, you know, I was there to help figure out how do you manage the telling of that story at scale. Oh, wow. And that's something which, of course, you know, in order to tell that story, well, you have to be able to get under the hood of any number of these deeply consequential issues, all the things that are, you know, really controversial, the things that are really significant. You have to deeply understand the executives and those voices as well. I was a speechwriter for a lot of my early career, and people often think the voice of people is about the combination of words,
Starting point is 00:27:15 and that's the easy part. The words are an expression of ideas. And that's really what made the job so fascinating. It was all about trying to figure out what are those ideas that can galvanize and move massive populations and audiences and all these communities on a set of technologies doing something completely new in history. and it was a very, very, you know, intellectually, you know, hard and extremely fulfilling perch in which to see history being made.
Starting point is 00:27:48 It was also something where once you get under the hood, you realize, wow, there isn't actually a deeper level of thinking quite a lot of the time. And I think that's something that came out in the book you read. There were actually, there were lots of things I disagree with in that book. There were lots of details which I think were like, you know, absolutely not, in fact, how things went down. Oh, okay. That's good to know. That's also part of history. You know, people will have very different perspectives on that.
Starting point is 00:28:07 all of those things. So let's start with the good days then. You join this company. It is changing the world. It is this burgeoning idea. What were the messages that you were trying to get to the public? Or what were the messages you were trying to help Facebook and Mark Zuckerberg get to the public about itself at that time that you thought were noble? The power of connectivity to change lives and move societies forward, that is absolutely true.
Starting point is 00:28:35 It's still true. People can absolutely detest, you know, any number of these companies. Most of us still carry a smartphone. Most of us have social media. These are things which have become really valuable in our lives. And those are all real things. They're things which, you know, can mean a huge difference in the quality of life for communities and families and, you know, all sorts of people. You know, in fact, my own journey, you know, going from where I started to,
Starting point is 00:29:05 you know, being able to do all the things I'm, you know, doing, you know, in my life. In fact, I think technology was a big part of that. You know, I remember I was at home, my parents were of no help guiding me in my career, obviously, like when I was a teenager and I was trying to figure out, like, how do I apply to university? It's not like my parents had gone to university. So it was literally me staying up like, you know, typing away on my dad's old computer, literally flinging out queries into the early internet, trying to figure out, like, what would you need to do to do all these things?
Like, what is an internship?
Starting point is 00:29:31 stuff like that. Where would you find that information in those days? You would go to the library and ask the librarian. So that's the value, right? And there were things when I was at the company, which really would fill you with pride. I remember going to rural India with Zuckerberg.
Starting point is 00:29:50 I traveled on the trip. It's in the book, I think. And you would meet people whose lives have been completely changed by having access to the internet. I remember we were at this village in the middle of nowhere in India and it had just got access to like an internet cafe and this old man had come into the cafe to get help from one of the kids who had learned to use the internet because he wanted to track down his friend who he hadn't
Starting point is 00:30:14 seen in like years and he was like, can you help me like research things? Because he was my best friend and I haven't seen him in like 20 years and like he's somewhere out there and I don't know where he is. And like this kid found him in like five minutes and this guy started crying and it was like, that is, I mean, that's the sort of, that's the magical side of it. Deeply tangible impact, right? Yeah. But this is the thing. This is the tremendous duality of the moment, that you can have things like that which are real. And you can acknowledge those things. But like, there's all sorts of other things happening too.
Starting point is 00:30:43 So what was the first thing that sort of triggered your, your spidey sense? What was the first thing that made you question the noble mission you were on? It was the refugee crisis. Which one? Yeah, which one? So, 2015, when the war in Syria was... Okay, yes. So the Syrian migration wave, yeah, fleeing into Europe, going through Germany, etc. So you suddenly have this like rapid acceleration in the number of people being displaced and you had tens of millions of people on the move. So at the time, there were about 65 million refugees in the world. And it was the greatest number in history, biggest humanitarian disaster of all time. And my dad was really sick that year. So he... He had lung cancer. It was the last year of his life.
Starting point is 00:31:28 We knew that he only had a few months left. He had started smoking when he was about 12 years old as a child refugee. You know, that was the really deeply unhealthy life he had, you know, living in the refugee camp. And anyway, it finally caught up with him. He was in his late 70s. So, you know, I had really come to understand his story in that last year of his life because he never talked about it.
Starting point is 00:31:50 And I had a very bad relationship with my dad, you know, he had this terrible childhood. He was like very abusive. He was psychologically very scarred from his childhood. I could only imagine, yeah. Yeah, and so I made peace with him in the last year of his life. And so I was already like bubbling with a bunch of feelings. Yeah. And then it was September 2015 and this photo went all over the world.
Starting point is 00:32:13 And you probably would recognize it if you saw it. It was this photo of this three-year-old boy, Alan Kurdi, who was a Syrian refugee. I think so. Is it the one, if I'm not mistaken, is in the back of the ambulance maybe? Or is it like... That was a different one. Oh, that's a different one. Okay. It was the little boy who drowned on the shores of the... gosh. Yeah. And there's a famous photo of the body on the shores of this island. It looks like he's
Starting point is 00:32:36 sleeping. And, you know, he drowned along with his five-year-old brother and his mother trying to cross the Mediterranean to get away. And I remember that moment because all the world leaders were so shocked because the public was so outraged and people were so upset. I remember, you know, you had prime ministers going on TV saying, right, that's... That's it. We're going to stop this thing now. We're going to intervene. This is where we draw the line.
Starting point is 00:33:00 Of course, 10 years later, there's now 135 million refugees in the world, which is more than two world wars combined. It hasn't gotten any better. It's getting worse by the day. But at that time, that was the first moment where I thought, look at the gulf between the ambitions of the world's richest people with all their vast technology, you know, saying that we're going to lift humanity up. And there is humanity right there.
Starting point is 00:33:24 the biggest humanitarian catastrophe of our time. And you can say, well, why should it be up to the CEO of one of these tech companies to go and solve that problem? Well, maybe it shouldn't be. Maybe they have some part to play in it. But that's not the point. The point is, if you have a thesis that your technology is now going to be this thing, which automatically leads us to this better future, and the future is playing out right there, and you are not doing anything to even investigate how you might shape that, then your thesis starts to ring a little hollow. And I remember sitting at my desk in Facebook in this palatial headquarters, you know, on the edge of the Pacific Ocean, just thinking,
Starting point is 00:34:04 God, I'm like, I'm at the edge of the world in like air-conditioned luxury. And that is the animating reality of people all over the world. And people who are just like my dad when he was a kid. That's my dad's story playing out right there. Anyway, my dad passed away, you know, just a little bit more than a month after that photo came out. And I went back to England and I had a bunch of time to think over the next few weeks and I decided that's it. I've got to find a different path through this industry. And so, I mean, that really was something which started a bunch of these gears spinning.
Starting point is 00:34:36 I still was in tech for a bunch more time and it was something that I began to take more and more of a position that I had to be involved directly in shaping the causes and the movements and the issues that were playing out in very different parts of the world. So I understand that on an altruistic level, you know, as it pertains to the world. But I'm trying to understand where your disillusionment began with regards to Facebook. And then we'll move into like the broader tech world itself. Because I'm hearing what you're saying. You know, you were taken aback by a story that was affecting the world. And on your side, it was affecting you personally as well because it, you know, it had echoes of a past that had shaped you in many ways. But I don't understand how this then casts Facebook in a bad light.
Starting point is 00:35:25 You know, this is just a social media company. What is it that Facebook did? What is it that Mark Zuckerberg said or did that made you go, Facebook is part of the actual problem? Not that it's the cause of everything in the world, but Facebook is contributing to these problems. Facebook has become a central piece of the machinery for how you convince the world of things. It's the battleground of ideas, played out across Instagram and WhatsApp. And Facebook connects billions of people. It connected the world. And all of the central problems
Starting point is 00:36:00 of our time are playing out in those communities. And so Facebook absolutely has a role in how it prioritizes and it thinks about the architectures that might dramatically shape and shift how the entire world acts on any number of problems. And of course, the company absolutely would point to all the things that it thought might be positive stories for the company and how it did those things. You know, you'd look at, say, all the money that's raised for charity on Facebook. True thing. Very, very powerful for lots of organizations to raise money on Facebook. But at the same time, there'd be other things where, you know, the company would have a sort of, yeah, we've got enough on our plate. You know, that's not really a thing for us right now. And so actually, I'm not
Starting point is 00:36:40 saying that suddenly Zuckerberg had to make the refugee crisis his number one thing. But I absolutely think if you have a thesis that you're connecting the world and you're building this motor of society, and you happen to be one of the world's richest people, you probably should make it a part of your responsibility. But there's any number of other issues where the company absolutely dropped the ball. Why is it our societies are so deeply divided and angry now in a world awash with disinformation and hate? That is something which absolutely was within the company's power, and it touched the lives of refugees and non-refugees alike. It shapes all of our lives. It shapes the entire digital ecosystem and the company did not meet its
Starting point is 00:37:18 obligations there. When I read, you know, when I read this book and when I've read different accounts of people who've worked in Facebook and then to a lesser degree Twitter at the time, they'll talk about how they got frustrated at seeing which scales the company would put its thumb on, right? Tech companies generally will tell us, the general public, that they are not shaping anything. They will say we're a platform, we're a message board. It's part of the reason they don't have the same obligations that a TV station does. So, you know, if somebody tells a lie or an egregious lie on CBS or NBC or one of those,
Starting point is 00:38:05 they get held accountable. Facebook, Twitter, all of them go like, no, no, no, no, no. We say nothing. All we do is provide a message board and the people say things, but we do not participate in what is being said or not being said. Right. When you read accounts from people who've worked in these organizations, it seems like that couldn't be further from the truth.
Starting point is 00:38:28 Yeah, that's absurd. Like, look at how much the algorithm on X shifted once Twitter became X under Elon, right? You'd like spend five seconds in there. It's just a complete cesspool of the absolute worst people who've been algorithmically boosted now to dominate your feeds. So, like, absolutely, there are choices made every day in the architecture of these platforms and, like, the content they're surfacing. Like, you know, one of the things that, you know, occurred, you know, during the first Trump
Starting point is 00:38:53 administration onwards was, you know, Facebook started experimenting with reducing the volume of news and political content in some feeds because they discovered that people were getting to the point where they were so, like, exhausted from the amount of political content that they said, oh, you know what, this is bad for engagement. Let's just get rid of the news. And that's the thing where, hey, maybe that actually gave people some better, you know, experience and maybe they spent more time. That would be how Facebook would define a better experience. But, like, in fact, if your society's in flames, wouldn't it be, like, more responsible to actually share these things? Like, if you had a thesis that this is the, you know, global town square,
Starting point is 00:39:34 that was literally how, you know, Jack Dorsey used to describe Twitter. And the global town square has to have all the conversations that matter in it, right? Not just the ones that, you know, give you the easier life when you're living, you know, in Palo Alto or San Francisco. Yeah, it's, it's fascinating because you realize that these guys have built a world where it would be so much easier if they were just trying to be evil. But it seems like their evil is a byproduct of the apathy, or rather the appetite, that they have for just getting their thing to grow as much as possible. That's right. I think that's spot on. And that is the complexity of this entire situation. If they were just cartoonishly villainous.
Starting point is 00:40:23 Yes. You know, like, Zuckerberg was like, right, I'm going to like get these people's data. And it's like, it was never like that. In a way, it's like, you know, the data is, you know, to fuel what Facebook really is, which is a big ad machine. You know, his ultimate evil is to give you some crappy clickbaity ads in your feed. I mean, it's not the most, you know, wild scheme ever invented. But like the failure to engage with a whole bunch of other things, which are within his power and which are playing out on the platforms, that can end up exerting tremendous evil in the world.
Starting point is 00:40:53 So it's something where there's, again, this great duality about these things, right? And you can be an engineer working at Facebook. And every day you go in there and you work on things that are really, really boring, designed to sort of incrementally improve user experience, and they do improve user experience. And yet you are also part of an institution that might also be doing things which, in some other part of this vast global machine, actually have effects you don't like at all. And you still need people to go and do the good things as well, right? You know, all institutions are imperfect.
Starting point is 00:41:22 And there's lots of good people, I think, who work within tech companies because they are driven by nothing more than they want to build systems that really do solve problems for people. Do you think we got it all wrong in terms of like assigning a political point of view to tech leaders? I think of all of them, Jeff Bezos, like when he bought the Washington Post, people were like, oh, man, democracy has been saved. And they came up with that slogan, democracy dies in darkness. And it was this whole thing. And it's like, oh, this is what we need mega titans to do is save the journalistic world that we like. Mark Zuckerberg, people are like, oh, yes, he is a titan of free speech. And because of him, we will move this thing.
Starting point is 00:42:01 You look at all of these figures. Even Elon Musk, they're like, you know, electric cars. And here he is pioneering a world that gets us away from fossil fuels. And it seemed like they were liberal bastions. It seemed like they were progressive icons that were trying to move society to a place that was better for everyone. And then almost overnight, it was like, well, Jeff Bezos is like, nope, shut that down, move that thing, pay the people less, get this go.
Starting point is 00:42:30 I'm doing my own thing. Zuckerberg was like, actually, screw the news and we're not fact-checking anything and we're not doing that. And you know what? I'm actually pro this. I actually think Trump's a cool guy. And I'm at the inauguration. And you see this happening.
Starting point is 00:42:43 And Elon Musk is like, you're like, oh, so he likes electric cars, but not necessarily, he doesn't care about the whole thing. It's this weird, like, it's this weird gray that I think people struggle with. Yes. From the inside, did you see a shift? Or do you think that there was just a big misunderstanding of who these people were from the very beginning? I think it was both. People are desperate for heroes. And, you know, I moved to New York in 2008, you know, right on the eve of, you know, the economic crisis unfolding.
Starting point is 00:43:17 And how many more crises have we been through since then? It's like the 21st century has been a never-ending series of disasters befalling most of the world. And in that context, people will always look for heroes. We're looking for good guys and bad guys. We're looking for a simplistic narrative to make sense of deeply complex events. Who were the good guys in the economic crisis? You know, there were lots of different bad guys. There were lots of different shades of gray every single day.
Starting point is 00:43:41 But the real villains weren't even just these individuals. You know, we would sort of say, ah, that person's going to prison, right? Screw that person. But there's entire systems that are rotten. And of course, this is the nature of many of the problems that are playing out now. We have deeply rotten, you know, systems that, you know, are behind many of the problems that are, you know, unfolding in many different domains. So I think that was part of it. But also it was genuinely, you know, a whole bunch of very new challenges emerging in the world, and in many ways these were leaders who simply weren't engaged at all.
Starting point is 00:44:14 There were things where they might have had more constructive roles. They almost certainly would have had more constructive roles if they took those seriously. But there's lots of other leaders who should share that blame too. The failure of technology in our societies is only part of a general failure of societies in the last couple of decades, in which political leaders and a whole set of other, you know, thinkers and actors absolutely should share in that blame too. It's weird because it's like we're trying to figure out where the first domino fell, right? Some would argue there's a moment when Facebook said, we're going to allow political people on Facebook, we're going to allow them to run campaigns,
Starting point is 00:44:53 we're going to allow them to buy ads, we're going to, some would say, oh, that was it. That was the domino. Other people would say, no, it's when the lawmakers allowed Facebook to become the thing that it was. So where do you think the first domino fell? Like were there conversations within the company in and around politics and Facebook when this was still, you know, sort of burgeoning? Oh yeah. I mean, these conversations, you know, have been going on for many, many years, you know, these were things that, you know, predated me. You know, companies were already really agonizing over these things, you know, as social media was, you know, getting started and reaching, you know, big scale. And a lot of the thinking, right, was, again, driven by things that were very idealistic. Can you build a
Starting point is 00:45:32 global town square in which, of course, you want democratic discourse? You want politicians on there and you want people exposed to their views. And these are things where, you know, political content would naturally appear because, you know, if you want to build something for free expression, then people will talk about, you know, something that is one of the central things in our lives and our societies. You know, that early idealistic moment in that journey of tech, you know, for me was the Arab Spring. You know, I remember walking into the hall at Google when news came through that the Egyptian government had fallen. People started cheering. You know, it was literally like, you know, there is something in the world happening, you know, fueled, you know,
Starting point is 00:46:07 combined with the power of technology, you know, people power plus these, you know, instruments for like mass communication and collaboration, suddenly like exerting, you know, dramatic shifts in societies. But it's very hard, I think, to think of like that's the domino that, you know, played the central role when really there were so many, you know, interconnected systems that really were breaking at the same time. The failures of leadership were happening on any number of fronts, right? You know, disinformation on Facebook, right? You've got the failures of platforms to think through how algorithms would, you know, shape content and shape, you know, communities. But at the same time, you had a global order breaking down in which
Starting point is 00:46:45 nations like Russia, Iran, China, the US, a whole bunch of people were all employing cyber weapons and looking to drive influence operations through social media and to weaponize all of those new platforms in order to achieve their geopolitical interests. That's a reality too. And so you can't just look at one of these problems. You've got to actually have a thesis on how all these things play out together. Don't go anywhere, because we got more What Now? after this. You said that you really had this moment for yourself in two thousand and...? That was 2015, when I started having... 2015 is when you start feeling this. You then leave in... I left Facebook in 2016, but I was still in the tech industry.
Starting point is 00:47:41 So you stayed around for a while. I was working for Elon then, when I was at SpaceX. And so I went to the space industry, which is something that should be deeply hopeful. I mean, I grew up watching Star Trek. And many elements of it have been. I often say this to my friends all the time. I go, look, man, I'm not a fan of like half of the shit Elon says and his views on many things. But I'm like, you know, Starlink and the idea of it is pretty amazing. We live in a world where many people are not connected.
Starting point is 00:48:05 to the internet, and we've seen what that can do to their financial outcomes. Or there's a young Dex somewhere who's trying to find an internship and can't do it. But if you have an internet that doesn't need
Starting point is 00:48:19 last mile, it changes your life. So it's fascinating. Electric cars, I don't think the industry would have moved if it weren't for Tesla, you know. So there's many elements where it is deeply hopeful. But you seem to have gone
Starting point is 00:48:31 from one place to another and am I correct in saying that you are disillusioned by the entire tech industry now? No. This is the duality of the moment. It continues to this moment. I absolutely believe in the power of technology.
Starting point is 00:48:46 No one's ever accused me of being anti-tech. Okay. Like technology is something that can be and should be deeply hopeful if it's managed well. Is it being managed well by our societies? No. Are tech industry leaders meeting the moment and the nature of the disruption that is now unfolding as a result of AI and a bunch of these breakthrough technologies? Absolutely not. And so that's why I'm no longer in the industry.
Starting point is 00:49:13 I don't think that's the right perch to be in for somebody who wants to focus their time on figuring out how as societies do we get out of the mess that is the consequence of the backdrop of a technologically infused civilization in the year 2026. But the technology itself could do wonderful things, right? That is the complex everyday duality, which infects every part of how we think about these technologies, which is why lots of leaders, lots of institutions, lots of regulators, they really struggle with this stuff.
Starting point is 00:49:42 AI might help us cure all diseases this century. That is something which lots of scientists are very hopeful about. We're going to see the first drugs coming to market in the next few years, which have been developed with the help of AI, which can treat cancers and all sorts of things. That's real. But at the same time, you also have AI driving disinformation and hate and cyber weapons and all these things, tooling autonomous weaponry and AI military targeting systems right now in action today, in the Middle East, which are inflicting enormous damage on people's lives. You have an industry
Starting point is 00:50:14 with a rapacious appetite for resources, which is absolutely going to be part of accelerating climate change and the devastation of ecosystems at a time when we're heading for some of the worst-case climate change scenarios. And you have this soaring gulf in societies between the winners and losers in our economy. You know, 12 billionaires in this country have as much wealth as 49% of the world now. And that's before we've seen most of the effects of AI on economies. If anyone thinks that's going to get better, I've got a bridge to the future to sell them.
Starting point is 00:50:46 Right before you left Facebook, did you voice your concerns and try and like shift the company as a cog in a machine? I did all the bits that I felt were within my ability to go and do, every single day. What could I do? I was a guy who was a words guy. And so my ability to argue over the message and how you frame these things and what are the things we should really be talking about, that's something where I absolutely exercised that power every single day. And that's, that's what drove my whole career. Did you experience any wins? Yeah. I mean, you would get lots of sort of little wins. Yeah. Do you remember any? Yeah. I mean, like, even like how,
Starting point is 00:51:28 choosing to even like, you know, focus on, you know, topics which people would say, oh, really? Like, why is like some tech executive like posting about, you know, let's say international development? Okay. I took Zuckerberg to the UN in 2015. You know, like that's the thing where, you know, for me, as a former UN person, it felt really important, actually, that you bring Silicon Valley, you know, people building these platforms with this immense reach together with a set of people who
Starting point is 00:51:58 were deeply non-technological, but needed to achieve really hard things in the world. So for me, you know, a lot of the work I did was on advancing this idea that connectivity should be a human right. It should be something that people have access to and they have the ability to express themselves freely. I consider that a very, very small win. Lots of other things, I would lose arguments all the time. One of the things that, you know, was part of my sort of final chapter when I was, you know, at Facebook was Trump was now running. Yeah. And I absolutely did not believe that Trump should be given a, you know, an unfiltered megaphone to spew all sorts of hatred. And I had furious arguments with, you know, very senior people, you know, saying you are the boss of Facebook.
Starting point is 00:52:41 You have a megaphone, if nothing else. You get to say that that's not acceptable. And they wouldn't even do the bare minimum there. You know, they sort of, you know, I remember I had a huge argument with some folks there about, you know, a post which I had drafted for Zuckerberg to take on Trump directly and call out his bullshit. Oh, interesting. And, you know, it's been reported, it's been covered in some other media over the years in the Washington Post, I think. But, you know, that was the thing where I lost that argument. You know, the company was absolutely, you know, cowering at the idea that this is something that, you know, they should have any sort of direct view on.
Starting point is 00:53:11 And so it ended up with one of those messy, unsatisfying compromises where, like, a post was put out and all the sort of sharp bits of language were, you know, watered down. And it didn't mention Trump by name. It was just the sort of, like, thing expressed in the ether. And it was like, why are we playing coy with this stuff? Like if you really do believe in, you know, the values which at the time Zuckerberg really said he believed in, you know, things like standing up for dreamers, you know, standing up for the rights of, you know, underrepresented people, for immigrants, you know, for diverse communities. All those communities Trump attacked from literally day one, when he rode down that escalator in Trump Tower and he said that it was his mission to basically, you know, excise them from society and roll the clock back to the 50s. And if you're not going to stand up for your values when, you know, there's something as consequential as a future president of the United States, you know, starting to spew those things, when are you ever going to stand up to them?
Starting point is 00:54:04 So, so Facebook, I sort of understand because it's, you know, it's public facing in many ways. And we see its effects. You know, you see Facebook on trial, or Meta on trial, for like Instagram and the things they knew about how it affects, you know, younger users, particularly girls. You see them on trial for like trying to get kids addicted to the platforms that they create. So they're pretty like front facing. Help me understand on like a SpaceX side. What could go wrong? It's space.
Starting point is 00:54:34 You're launching rockets into space and this is a good thing. You know, you launch the thing and you make it cheaper and you put satellites where they're supposed to be and we see the world. And like what could go wrong at SpaceX? Well, this is now well after I've left SpaceX. But now SpaceX is one of the companies that is obviously a prime defense contractor. And part of the work, you know, that is now unfolding is, you know, to obviously develop a new generation of weapons and infrastructure to enable the U.S. to win future wars.
Starting point is 00:55:06 You know, there are proposals such as Golden Dome, you know, the absolutely astonishing space-based missile defense system that Trump wants to build, which some estimates say will cost over a trillion dollars. SpaceX is going to be the primary contractor for all of that. And these are things now where, again, this is the duality of the moment, you could say, oh, it's a defensive missile shield. It's designed to shoot down incoming nuclear warheads. Yeah. Maybe that's okay.
Starting point is 00:55:34 Except if you are part of a nexus of leaders and institutions which are exerting tremendous political authority and economic power, because the technologies you're developing have become so central to economic growth in the United States, you have the ability to radically shift agendas here. If you've chosen instead to just make a bunch of money building space-based military infrastructure, rather than using your overall influence to in fact shift us to a world where maybe we don't have to fight, why are we preparing for a nuclear war instead of trying to figure out how to not have nuclear wars? Then I would say that is a historic mismatch of resources. That is the wrong strategy. That's not the right strategy. And of course, Mr. Musk himself, well after I worked for him, has now taken on a role, which I think is
Starting point is 00:56:24 unforgivable in the tech industry and in the world. The work he did of gutting USAID through DOGE, on its own, for which we have emphatic reporting about the lives that have been lost, the children who have died. He should not ever be rehabilitated. Like, that is a guy who should be on trial. Yeah, it's, I mean, that's probably one of the most heinous things he and the administration did, because it was spiteful. That's right. Do you know what I mean? It had no material effect on the United States, on its budget, on its money, but the effect
Starting point is 00:57:01 that it has now had on the countries in the world that needed it has been, like, generationally devastating. That's right. So that's the thing, right? Space and space technology could just be brilliant. And if we just looked at that, if we just had a tunnel vision, if we were just navel-gazing on that, fantastic. But like, when your boss is out there gutting aid
Starting point is 00:57:19 agencies and children are dying, like, the two things are connected. So what do you feel like your mission is now, then, now that you are not in the industry? What are you trying to rally? What are you trying to move people towards? I am very focused now on the societal implications of AI. I spent the last several years, you know, I was back at Google and, you know, a lot of the last 15 years have been involved in AI. And the technology and how quickly it is gathering steam, how much more advanced it's getting,
Starting point is 00:57:49 how much more advanced it will be in the next decade, that is something which is going to be wildly more devastating than any of the consequences of the mismanaged technologies of the last 15 years. And the consequences will crash land on societies all over the world. We are not prepared for these things at all, economically, societally, geopolitically, environmentally. And that's the agenda that I'm really focused on.
Starting point is 00:58:13 The tech industry has massively distorted the agenda for how to manage these things. We talk about AI governance. And there's a whole bunch of stuff which is about how you manage the AI systems themselves. That's like 1% of the problem. 15 years from now, probably less than 15 years, when a lot of people have lost their jobs to AI and we have no economic solutions, we have no idea how to give people a good quality of life, that is a much, much bigger problem than some of the technical issues we're facing in those systems.
Starting point is 00:58:41 What happens when AI-turbocharged industries consume wildly more resources and they destabilize already reeling ecosystems, which we know are hanging by a thread, and which will impact the lives of millions, if not billions, of people? Between 2018 and 2023, the world consumed almost as many resources as in one third of the 20th century, and it's accelerating all the time. And these are things where, in fact, big tech companies, they go to governments who are desperate for economic growth now, right? A lot of the advanced economies in the global north, they've got plateauing growth, they've got declining living standards. They go to them and say, build this wonder technology, build this infrastructure. We will give you back the keys to a good future. And of course,
Starting point is 00:59:24 that's not happening. This is something which on the present path promises to enrich a tiny sliver of societies, mostly in the United States, mostly in China. And I think the world we're going to end up in, you know, in the next, you know, 10, 15 years is going to be much more unstable than today with rapidly declining living standards for hundreds of millions of people. I think a lot of graduates will not be able to get good jobs at all. People who did a whole bunch of things in their lives and were told if they did these things right, they would get a good life and their kids would have better lives than them. That is not going to happen at all on the path we're on. And so, is the tech industry going to solve these things? No. And in fact, they have tried to minimize
Starting point is 01:00:03 debate about these things. They've tried to point people back towards the technical solutions. Now it is a moment in which societies and our leaders need to have a mainstream conversation about what the heck we're going to do to manage all of those effects. And that cannot be left up to the tech industry. But now we're in a situation where, like you said earlier in the conversation, the world affected tech and now tech is affecting the world. In the lead-up to some of the elections happening in the U.S., the tech giants have put together some of the most insane amounts of money for political campaigns.
Starting point is 01:00:45 They're going after every single politician who is proposing any type of restraint or constraint on AI. And whether we like it or not, you know, money shapes a lot of an election. So you have the tech industry fighting to define how it should be managed or governed. And the people who are going to govern it being funded by them, it seems like an infinite loop that's for the most part already lost, unless something big breaks. As we've seen, you know, in the past, it's only when something big breaks that something changes dramatically. I mean, I don't think we should be under any illusions. The most likely path we're on, the one we are on right now, leads to disaster for our societies
Starting point is 01:01:31 and all societies globally. I actually don't think anyone gets out of this thing. I've spent the last, you know, six months traveling the world, talking to leaders and, you know, a whole bunch of people behind closed doors where you have some really, you know, spine-tingling conversations. And there's a strain of opinion, mostly in the US and China, which thinks we can just win this race for AI. And other people will get screwed, but we'll be okay. And that's not going to happen when, of course, you have societies all over the world that are going to be in chaos. No one gets out of this thing in one piece. But we're in a moment now where I think actually
Starting point is 01:02:05 a lot of people across societies, you know, who are paying a bit of attention, they're seeing things are failing on any number of fronts. Yeah. You know, you look at all the polling. Why is it the overwhelming majority of societies in almost all the world say things are generally going in the wrong direction? People are angry. They realize that there aren't new answers. There is no vision coming from the top anymore. And in that context, you then have big tech racing to install these systems where people are now really worried about their jobs. You know, there's a lot of polling showing that the number one thing that gets people, you know, panicked about AI, which explains why it is, in fact, incredibly unpopular with societies, is because they're worried
Starting point is 01:02:44 about their future jobs. So this is something where I think actually there's a lot of people who already are looking for a very different approach to how you manage these things. The debate has not shifted there yet because I think a lot of people don't understand the scale of what might happen, the speed at which this will unfold, and how interconnected all these agendas are. When people really do understand that the technologies that might give you your amazing apps on your phone also might be things that destroy the prospect of your kids having future jobs
Starting point is 01:03:11 and also will threaten, you know, natural ecosystems, then it becomes a very different agenda. So I hear you on a macro scale, but like, what are we actually doing? What are we actually trying to do? Like where does it start? Who does it start with? What is the first action, the second action?
Starting point is 01:03:38 What do we hope to achieve on the other side of it? I think before you can get really granular about lots of tactics, you really need to have won on the idea. Like, does anyone think Donald Trump is the president because of his exceptional ability to run a disciplined campaign? Like, he had an idea, and it was an idea fit for a moment and, you know, poised to disrupt a whole bunch of bad incumbent ideas. His answer was another set of bad ideas, some really truly evil and villainous ideas, but he had ideas. And what is happening now is that in the absence of any vision of what our future could be, we're just staggering on trying to prop up the things that are not working, the remnants of these old economic, political, geopolitical systems,
Starting point is 01:04:26 the shreds of the international order built in the 1940s. No one is truly saying, this isn't working. What could something that actually works look like? And don't focus on what's politically achievable. Focus on what do you actually want to build? If you could get away with it, what's the real best case scenario? And I think you have to start there because if you do that, the world isn't static. The moment you have leaders with vision, you can change reality around you in enormous ways.
Starting point is 01:04:53 You know, Donald Trump is the guy who was hosting The Apprentice. And now he's the most powerful man in the world. The moment you have a vision, you can start to really move things in ways that may seem utterly implausible. And I think actually the things that most people want in the world, including the United States and the UK, wherever you look, are generally things that are pretty timeless. They want to be able to give their kids a better quality of life than they have. They want to be able to live in peace. They don't want to have to worry about going to war. They want to preserve nature.
Starting point is 01:05:24 They want to have a climate that isn't collapsing. And when you really wake people up to the fact that there is no plan for any of these things now, and we're almost out of time. And there is still a chance to not only avert the disaster, but here's the thing. What if we could use all that technology now, with a very different vision of our societies and our politics and our economics, to build a radically better future, something that really does solve these problems for all time in a big way? Then a lot of people get really hopeful. They get excited about that. So this is the stuff that
Starting point is 01:05:59 I'm going around the world talking about. I absolutely don't believe we solve our problems by pretending they don't exist. There's a certain strain of people who think, oh, this is too depressing. Let's not talk about all these things. But if you can admit the path we're on is not working, but then also wake people up to what could the tech really do if it was harnessed with an agenda designed to truly provide a good life to everyone, not just that narrow sliver of billionaires and the tiny percentage of people, then people really, I think in many cases, they long for it. They wish that there was something coming that would save us. That's an interesting idea, because
Starting point is 01:06:35 it proposes something that I guess isn't often pushed, which is first shift the mindset and then let's get into the granular. Sell the idea of a future or a world and get people on the same page. Is it possible when we, I mean, like, we even having this conversation are at the mercy of the very machine that we are trying to disrupt? I always think about that paradox. I go, you are putting a thing on the internet, on social media platforms, basically trashing the social media platforms and the way they run, which inherently is like this weird conflict of interest, slash, even like you go, will it even allow you to do it if it
Starting point is 01:07:26 gets to a certain point? Like how do you think of that conundrum? How do you begin a movement when the very thing that is moving you is part of the thing you're trying to change? I mean, it's something which I think there probably are real consequences, right? I mean, things that we may not even know about. The lack of transparency in how algorithms operate means that certain content may be deprioritized when it's critical or, you know, touches certain issues.
Starting point is 01:07:51 That's something, you know, I was at Meta's oversight board, you know, a few years ago, which, you know, was an independent body looking at how content was managed on these platforms, and the things that the company did every day to put its thumb on the scale would blow your mind. You can see all that data online. The board has made it publicly available. It would shock people. But they can't stop everyone if conversations and agendas really are, you know, moving people. And there's lots of other channels which we can go for, right?
Starting point is 01:08:19 These are agendas which have to be pushed politically, in business, you know, in the vast majority of the world's economies, which are not the tech industry. Through communities, through nations all over the world, you can advance ideas on an infinite number of other vectors, which ultimately are going to penetrate. And, you know, you see lots of, you know, different points of view on social media. Even while you've got some platforms which have clearly tried to put their thumb on the scale, people are not going to shut up. And in fact, some of those heavy-handed attempts to muzzle debate or to pretend that everything's fine, it's just the industrial revolution.
Starting point is 01:08:54 That stuff backfires so catastrophically on the people who say those things, you know, because, of course, you know, I think every serious person realizes that's bullshit. So I think we have to try. I mean, ultimately it comes down to this, right, the vision of the good future and then how we actually achieve that, turning it into a plan, you know, what does a new economic model look like for societies when machines will take jobs? That's going to be really hard. It's going to be the hardest thing we've ever achieved in history, and the odds are stacked
Starting point is 01:09:24 against us from the start, and we would never choose to do it at this moment in time when we're already reeling from crises, and yet what other choice do we have? And that's the thing. I think there's a moment now where politicians, they can just keep trying to pretend everything's normal, and they can try to cosplay basically as like 90s politicians when everything was sort of one-dimensional. They can do that. And that's what most of them will do. Humans are just like AI.
Starting point is 01:09:45 They're bad pattern recognition machines. So people will look at all this stuff and they'll be like, actually, it's not this doom and gloom stuff. It's just this stuff over here. It's stuff from the past we've already seen. But they'll fail. They won't solve these problems. And because they will not succeed in solving them, angry, volatile populations facing declining living standards and rising chaos,
Starting point is 01:10:06 they will overthrow those people. They will demand probably worse alternatives. That's why populists are coming to power. That's why nationalists are advancing every day. And then we end up in more and more chaos. But the politicians who realize that that is not the good future, and now we really need to say, we don't have the answers, but we do understand what is happening and the problem. They will be the people who can begin to show
Starting point is 01:10:26 They will be the people who can begin to show, that agenda. And that's what I'm trying to do now, to work with the leaders, to work with people who really want to start having that radically different conversation. And I don't think talking about it is a sort of kumbaya kind of thing. It's not a let's spend years having a nice conversation. It is, this has to happen really fast now. We have to have people understand specifically what is happening. And if we can do that, then we can very quickly get to practical tactics and an agenda. But we have a narrow window in which to do all these things now. A narrow window
Starting point is 01:10:58 And a mighty, mighty tall ask. Thanks, thank you very much, man. Thanks, man. Yeah, man. Good luck. Good luck on the rest of the tour and the leaders that you meet. I think you've given us a lot to think about.
Starting point is 01:11:13 Yeah. It's going to be a hell of a 15 years. There is a way out of this stuff. We are not destined to fail, but we have to admit that it's not working. All right. Thank you so much.
Starting point is 01:11:24 Thanks, man. What Now with Trevor Noah is produced by Day Zero Productions in partnership with Sirius XM. The show is executive produced by Trevor Noah, Senaziamen, and Jess Hackle. Rebecca Chain is our producer. Our development researcher is Marcia Robiou. Music, mixing and mastering by Hannes Brown. Random Other Stuff by Ryan Hardoof. Thank you so much for listening.
Starting point is 01:11:51 Join me next week for another episode of What Now.
