The Diary Of A CEO with Steven Bartlett - Most Replayed Moment: Mo Gawdat - This Is The Only Thing That Will Survive AI, We Must Act Now!

Episode Date: May 16, 2025

As AI rapidly advances, Mo Gawdat shares a candid and thoughtful exploration of what the future holds for humanity. From AI’s growing intelligence and its impact on creativity to the looming disruption of jobs and society, this top moment challenges us to rethink our role in a world where machines might outsmart us. Mo also reflects on the irreplaceable value of human connection and the need to prepare for a future shaped by technology. Whether you’re excited or worried about AI, this conversation offers a crucial perspective on how we adapt, survive, and find meaning in the age of artificial intelligence. Register yourself by visiting: Gartner.com/Symposium Listen to the full episode here - Spotify - https://g2ul0.app.link/ocYmHrkuoTb Apple - https://g2ul0.app.link/EBSoMmQuoTb Watch the Episodes On YouTube - https://www.youtube.com/c/%20TheDiaryOfACEO/videos Mo Gawdat - https://www.mogawdat.com/ Learn more about your ad choices. Visit megaphone.fm/adchoices

Transcript
Starting point is 00:00:00 If you're an IT leader or a CIO or someone who makes big decisions in tech, there's an event happening this year that I want to be on your radar. Gartner, the sponsor of today's moments episode, is hosting their annual IT Symposium Expo. This conference is the fastest way to get up to speed with everything that's happening in tech. The whole idea is to give executives frameworks to lead confidently in 2025 in a world that's evolving so incredibly fast. And Gartner's global reach means there's an event near you, including the US, Europe, Asia and Australia.
Starting point is 00:00:28 You can expect conversations on AI, cybersecurity, leadership in a digital age, as well as an industry-specific deep dive. Secure your spot by registering at gartner.com slash symposium. That's gartner.com slash symposium. And enjoy this moment with Mo Gawdat, former CBO of Google X, talking about the possibility of AI replacing humans in the workforce. So the first inevitable, just to clarify, is what is... Will we stop?
Starting point is 00:00:53 AI will not be stopped. Okay. So the second inevitable is? Is that they'll be significantly smarter. As I mention in the book, I predict a billion times smarter than us by 2045. I mean, they're already, what, smarter than 99.99% of the population? 100%. ChatGPT-4 knows more than any human on planet Earth. Knows more information. Absolutely. A thousand times more. A thousand times more. By the way, the code of a transformer,
Starting point is 00:01:18 the T in GPT, is 2,000 lines long. It's not very complex. It's actually not a very intelligent machine. It's simply predicting the next word. Okay. And a lot of people don't understand that, you know, ChatGPT as it is today... You know those kids where, if you're in America and you teach your child all of the names of the states and the US presidents, and the child would stand and repeat them, and you would go like, oh my God, that's a prodigy? Not really, right? It's your parents really trying to make you look like a prodigy by telling you
Starting point is 00:01:54 to memorize some crap, really. But then when you think about it, that's what ChatGPT is doing. The only difference is, instead of reading all of the names of the states and all of the names of the presidents, it read trillions and trillions and trillions of pages. Okay. And so it sort of repeats what the best of all humans said. Okay. And then it adds an incredible bit of intelligence where it can repeat it the same way Shakespeare would have said it, you know, those incredible abilities of predicting the exact nuances of the style of Shakespeare so that it can repeat it that way and so on.
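To make "it's simply predicting the next word" concrete, here is a minimal, hedged sketch of greedy next-token generation. It assumes the Hugging Face transformers library and PyTorch, and uses the small public GPT-2 model purely as a stand-in for the far larger models being discussed; it illustrates the mechanism, not the actual ChatGPT code.

```python
# Load a small causal language model and generate text by repeatedly
# picking the single most likely next token (greedy decoding).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

ids = tokenizer("The first President of the United States was", return_tensors="pt").input_ids
for _ in range(15):                          # add 15 tokens, one at a time
    with torch.no_grad():
        logits = model(ids).logits           # a score for every token in the vocabulary
    next_id = logits[0, -1].argmax()         # greedy choice: the most likely next token
    ids = torch.cat([ids, next_id.view(1, 1)], dim=1)

print(tokenizer.decode(ids[0]))              # the prompt plus the model's continuation
```

Scaled up enormously, and with fancier sampling than a plain argmax, this loop is the basic mechanism Mo is referring to.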
Starting point is 00:02:36 But still, you know, when I write, for example, and I'm not saying I'm intelligent, but when I write something like, you know, the happiness equation in my first book, this was something that's never been written before, right? ChatGPT is not there yet. All of the transformers are not there yet. They will not come up with something that hasn't been there before. They will come up with the best of everything and generatively will build a
Starting point is 00:03:03 little bit on top of that. But very soon they'll come up with things we've never found out, we've never known. But even on that, I wonder if we are a little bit deluded about what creativity actually is. Creativity, as far as I'm concerned, is like taking a few things that I know and combining them in new and interesting ways. And ChatGPT is perfectly capable of, like, taking two concepts and merging them together. One of the things I said to ChatGPT was, tell me something that's not been said before that's paradoxical but true. And it comes up with these wonderful expressions like, as soon as you call off the search,
Starting point is 00:03:41 you'll find the thing you're looking for, like these kinds of paradoxical truths. And I then take them and I search them online to see if they've ever been quoted before, and I can't find them. It's interesting. So as far as creativity goes, I'm like, that is the algorithm of creativity. I've been screaming that in the world of AI for a very long time, because you always get those people who really just want to be proven right. Okay. And so they'll say, oh no, but hold on, human ingenuity.
Starting point is 00:04:08 They'll never, they'll never match that. Like, man, please, please. You know, human ingenuity is algorithmic. Look at all of the possible solutions you can find to a problem, take out the ones that have been tried before, and keep the ones that haven't been tried before, and those are creative solutions. It's an algorithmic way of describing creativity: a good solution that's never been tried before. You can do that with ChatGPT with a prompt.
Starting point is 00:04:34 And Midjourney with creating imagery. You could say, I want to see Elon Musk in 1944 New York, driving a cab of the time, shot on a Polaroid, expressing various emotions. And you'll get this perfect image of Elon sat in New York in 1944, shot on a Polaroid. And it's done what an artist would do. It's taken a bunch of references that the artist has in their mind and merged them together to create this piece of quote-unquote art. And for the first time, we now finally have a glimpse of intelligence that is actually not ours. Yeah.
Starting point is 00:05:10 And so we're kind of, I think the initial reaction is to say that doesn't count. You're hearing it with, like, no, but it is. Like Drake. They released two Drake records where they've taken Drake's voice, used sort of AI to synthesize his voice and made these two records, which are bangers. They are great fucking tracks. Playing them, my God, I was like... and I kept playing it. I went to the shark at playing it. I know it's not Drake, but it's as good as fucking Drake.
Starting point is 00:05:35 The only thing is, people are, like, rubbishing it because it wasn't Drake. I'm like, well, for now... Is it making me feel a certain emotion? Is my foot bumping? Had I not known it wasn't Drake, would I have thought this was an amazing track? A hundred percent. And we're just at the start of this exponential curve. Let me show you something. Jack, can you pass me my phone? I was playing around with artificial intelligence, and I was thinking about how, because of the ability to synthesize voices, we could synthesize famous
Starting point is 00:06:09 people's voices. So what I made is a WhatsApp chat called ZenChat, where you can go to it and type in pretty much any famous person's name, and the WhatsApp chat will give you a meditation, a sleep story, a breath work session synthesized in that famous person's voice. So I actually sent Gary Vaynerchuk his voice. So basically you say, okay, I've got five minutes and I need to go to sleep. Yeah. I want Gary Vaynerchuk to send me to sleep. And then it will respond with a voice note. This is the one it responded with for Gary Vaynerchuk. This is not Gary Vaynerchuk, he did not record this, but it's kind of accurate.
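For readers curious how a bot like this hangs together, here is a rough, hypothetical sketch of the pipeline Steven describes: a language model drafts the meditation script, and a text-to-speech step turns it into a voice note. It assumes the OpenAI Python SDK purely for illustration; the function name meditation_voice_note is made up, and a stock TTS voice stands in for the celebrity-style voice ZenChat actually produces.

```python
# Hypothetical sketch of a ZenChat-style pipeline (not its real implementation).
from openai import OpenAI

client = OpenAI()  # assumes an OPENAI_API_KEY environment variable

def meditation_voice_note(person: str, minutes: int, out_path: str) -> None:
    # 1. Draft a short sleep meditation written in the named person's speaking style.
    script = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{
            "role": "user",
            "content": f"Write a {minutes}-minute sleep meditation in the speaking style of {person}.",
        }],
    ).choices[0].message.content

    # 2. Convert the script to audio. A stock voice is used here; reproducing a specific
    #    (consenting) person's voice would need a dedicated voice-cloning model.
    speech = client.audio.speech.create(model="tts-1", voice="onyx", input=script)
    speech.stream_to_file(out_path)  # save the audio, ready to send as a voice note

meditation_voice_note("a well-known motivational speaker", 5, "sleep_meditation.mp3")
```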
Starting point is 00:06:49 Hey Stephen, it's great to have you here. Are you having trouble sleeping? Well, I've got a quick meditation technique that might help you out. First, find a comfortable position to sit or lie down in. Now, take a deep breath in through your nose and slowly breathe out through your mouth. It is really shocking, huh? The idea that you and I inevitably are going to be somewhere in the middle of nowhere in 10 years' time. I used to say 2055; I'm thinking 2037 is a very pivotal
Starting point is 00:07:26 moment now, you know, and we will not know if we're there hiding from the machines. We don't know that yet. There is a likelihood that we'll be hiding from the machines, and there is a likelihood we'll be there because they don't need podcasters anymore. Excuse me. Absolutely true. Steve, there's absolutely no... no, no, no, no, no, that's where I draw the line. There is absolutely no doubt. Thank you for coming.
Starting point is 00:07:51 It's great to do the part three, and thank you for being here. Sit here and take your propaganda. Let's talk about reality next week on the Diary. We've got Elon Musk. So who here wants to make a bet that Stephen Bartlett will be interviewing an AI within the next two years? Oh, well, actually, to be fair, I actually did go to ChatGPT, because I thought, having you here, I thought at least give it a chance to respond.
Starting point is 00:08:16 Yeah. So I asked a couple of questions about me. Yeah. Today, I'm actually going to be replaced by ChatGPT, because I thought, you know, you're going to talk about it, so we need a fair and balanced debate. Okay. And she said he's bald. So I'll ask you a couple of questions that ChatGPT has for you.
Starting point is 00:08:33 Incredible. So let's follow that thread. I've already been replaced. Let's follow that thread for a second. Yeah. Because you're one of the smartest people I know. That's not true. It is. But I'll take it. It is true. I mean,
Starting point is 00:08:45 I say that publicly all the time. Your book is one of my favorite books of all time. You're very, very, very, very intelligent. Okay. Depth, breadth, intellectual horsepower and speed, all of them. There's a but coming. The reality is, it's not a but. So it is highly expected that you're ahead of this curve, and then you don't have the choice, Stephen. This is the thing. The thing is, so I'm in that existential question in my head. Because one thing I could do is, I normally do a 40-day silent retreat in summer, okay? I could take that retreat and write two books, me and ChatGPT, right?
Starting point is 00:09:27 I have the ideas in mind. You know, I wanted to write a book about digital detoxing, right? I have most of the ideas in mind, but writing takes time. I could simply give the 50 tips that I wrote about digital detoxing to ChatGPT and say, write two pages about each of them, edit the pages and have a book out. Many of us will follow that path. Okay. The only reason why I may not follow that path is because, you know what? I'm not interested.
Starting point is 00:09:57 I'm not interested to continue to compete in this capitalist world if you want. Okay. I'm not. I mean, as a human, I've made up my mind a long time ago that I will want less and less and less in my life, right? But many of us will follow. I mean, I would worry if you didn't include the smartest AI, if we get an AI out there
Starting point is 00:10:22 that is extremely intelligent and able to teach us something and Stephen Bartlett didn't include her on his podcast, I would worry. Like, you have a duty almost to include her on your podcast. It's inevitable that we will engage them in our life more and more. This is one side of this. The other side, of course, is if you do that, then what will remain? Because a lot of people ask me that question, what will happen to jobs? Okay. What will happen to us? Will we have any value, any relevance whatsoever? Okay. The truth of the matter
Starting point is 00:10:56 is the only thing that will remain in the medium term is human connection. Okay. The only thing that will not be replaced is Drake on stage. Okay. Is, you know, is, is me in a hologram. I think of that Tupac gig they did at Coachella where they used the hologram of Tupac. I actually played it the other day to my girlfriend when I was making a point, and I was like, that was a circus act. It was amazing. Amazing. Yeah. See what's going on with ABBA in London. Yeah. Yeah. And Cirque du Soleil had Michael Jackson in one for a very long time. Yeah. I mean, so this ABBA show in London, from what I understand, that's all holograms on stage. Correct. And it's going to run in a purpose-built arena for 10 years. And it is incredible. It really is. So you go, why do you
Starting point is 00:11:41 need Drake? If that hologram is indistinguishable from Drake, and it can perform even better than Drake, and it's got more energy than Drake, and it's, you know... I go, why do you need Drake to even be there? I can go to a Drake show without Drake, cheaper. I might not even need to leave my house. I can just put a headset on. Correct. Can I ask you this?
Starting point is 00:12:03 What's the value of this to the listener? Oh, come on. You hurt me. No, no. I mean, I get it to us. I get it to us, but I'm saying what's the value of this to the listener? Like the value of this to the listener is the information, right? No, no, no. 100%. I mean, think of the automobile industry. There was a time where cars were made, you know, handmade and handcrafted and luxurious and so on and so forth. And then, you know, Japan went into the scene, completely disrupted the market. Cars were made in mass quantities at a much cheaper price.
Starting point is 00:12:34 And yes, 90% of the cars in the world today, or maybe a lot more, I don't know the number, are no longer, you know, emotional items. They're functional items. There is still, however, every now and then, someone that will buy a car that has been handcrafted. There is a place for that. There is a place for, if you go walk around hotels,
Starting point is 00:12:59 the walls are blasted with sort of mass-produced art. But there is still a place for an artist's expression of something amazing. My feeling is that there will continue to be a tiny space, as I said in the beginning. Maybe in five years' time, one or two people will buy my next book and say, hey, it's written by a human. Look at that, wonderful.
Starting point is 00:13:23 Oh, look at that, there is a typo in here. Okay. I don't know. There might be a very, very big place for me in the next few years where I can sort of show up and talk to humans like, hey, let's get together in a small event. And then, you know, I can express emotions and my personal experiences. And you sort of know that this is a human talking. You'll miss that a little bit. Eventually, the majority of the market is gonna be like cars. It's gonna be mass produced, very cheap, very efficient.
Starting point is 00:13:52 It works, right? Because I think sometimes we underestimate what human beings actually want in an experience. I remember the story of a friend of mine that came to my office many years ago, and he tells the story of the CEO of a record store standing above the floor and saying, people will always come to my store because people love music. Now, on the surface of it, his hypothesis seems to be true because people do love music. It's conceivable to
Starting point is 00:14:18 believe that people will always love music, but they don't love traveling for an hour in the rain and getting in a car to get a plastic disc. Correct. What they wanted was music. What they didn't want, evidently, is plastic discs that they had to travel for miles for. And I think about that when we think about, like, public speaking and the Drake show and all of these things. Like, what people actually are coming for, even with this podcast, is probably, like, information.
Starting point is 00:14:43 But do they really need us anymore for that information, when there's going to be a sentient being that's significantly smarter than at least me and a little bit smarter than you? So you're spot on. You are spot on. And actually, this is the reason why I'm so grateful that you're hosting this. Because the truth is, the genie's out of the bottle. So, you know, people tell me, is AI game over for our way of life? It is, for everything we've known. This is a very disruptive moment where, maybe not tomorrow, but in the near future, our way of life
Starting point is 00:15:23 will differ. What I'm asking people to do is to start considering what that means to your life. What I'm asking governments to do, like I'm screaming, is don't wait until the first patient, you know? Start doing something about it now. We're about to see mass job losses. We're about to see, you know, replacements of categories of jobs at large. Okay. Yeah, it may take a year, it may take seven. It doesn't matter how long it takes, but it's about to happen.
Starting point is 00:15:56 Are you ready? And I have a very, very clear call to action for governments. I'm saying tax AI-powered businesses at 98%, right? So suddenly you do what the open letter was trying to do, slow them down a little bit, and at the same time get enough money to pay for all of those people that will be disrupted by the technology. When you talk about the immediate impacts on jobs, I'm trying to figure out, in that equation, who are the people that stand to lose the most? Is it the everyday people in foreign countries that don't have access to the internet and won't benefit? You talk in your book about how this sort of wealth disparity will only increase.
Starting point is 00:16:34 Yeah, massively. The immediate impact on jobs is that, and it's really interesting, again, we're stuck in the same prisoner's dilemma. The immediate impact is that AI will not take your job. A person using AI will take your job, right? So you will see within the next few years, maybe next couple of years, you'll see a lot of people skilling up, upskilling themselves in AI to the point where they will do the job of 10 others who are not.
Starting point is 00:17:03 You're saying tax those companies 98%, give the money to the humans that are going to be displaced. Yeah, or give the money to other humans that can build control code that can figure out how we can stay safe. This sounds like an emergency. How do I say this? Remember when you played Tetris? Yeah.
Starting point is 00:17:26 Okay. When you were playing Tetris, there was, you know, always, always one block that you placed wrong. And once you placed that block wrong, the game was no longer easy. You know, it started to gather a few mistakes afterwards, and it starts to become quicker and quicker and quicker and quicker. When you placed that block wrong, you sort of told yourself, okay, it's a matter of minutes now, right? There were still minutes to go and play and have fun before the game ended, but you knew it was about to end. Okay, this is the moment. We've placed the block wrong. And I really don't know how to say this any other way. It even makes me emotional. We fucked up.
Starting point is 00:18:09 We always said, don't put them on the open internet, don't teach them to code, and don't have agents working with them, until we know what we're putting out in the world, until we find a way to make certain that they have our best interest in mind. Why does it make you emotional? Because humanity's stupidity is affecting people who have not done anything wrong. Our greed is affecting the innocent ones. The reality of the matter, Stephen, is that this is an arms race. It has no interest in what the average human gets out of it. It is all about every line of code being written in AI today is to beat the other guy.
Starting point is 00:18:55 It's not to improve the life of the third party. People will tell you this is all for you. And you look at the reactions of humans to AI. I mean, we're either ignorant people who will tell you, oh no, no, this is not happening, AI will never be creative, they will never compose music. Like, where are you living? Okay. Then you have the kids, I call them, where, you know, all over social media, it's like, oh my God, it speaks! Look at it, it's orange in color, amazing! I can't believe that AI can do this. We have snake oil salesmen, okay,
Starting point is 00:19:27 who are simply saying, copy this, put it in ChatGPT, then go to YouTube, nick that thingy, don't respect, you know, copyright or intellectual property of anyone, place it in a video, and now you're gonna make $100 a day. Snake oil salesmen. Okay. Of course we have dystopian evangelists, basically people saying, this is it, the world is going to end, which I don't think is a reality. It's a singularity.
Starting point is 00:19:51 You have, you know, utopian evangelists that are telling everyone, oh, you don't understand, we're going to cure cancer, we're going to do this. Again, not a reality. Okay. And you have very few people that are actually saying, what are we going to do about it? Hmm.
Starting point is 00:20:03 And the biggest challenge, if you ask me what went wrong in the 20th century, interestingly, is that we have given too much power to people that didn't assume the responsibility. So, you know, I don't remember who originally said it, but of course Spider-Man made it very famous, huh? With great power comes great responsibility. We have disconnected power and responsibility. So today, a 15-year-old, emotional, without a fully developed prefrontal cortex to make the right decisions yet, this is science, huh?
Starting point is 00:20:42 We develop our prefrontal cortex fully at age 25 or so. With all of that limbic system, emotion and passion, would buy a CRISPR kit and modify a rabbit to become a little more muscular and let it loose in the wild. Or an influencer who doesn't really know how far the impact of what they're posting online can hurt, or cause depression, or cause people to feel bad.
Starting point is 00:21:12 And putting that online, there is a disconnect between the power and the responsibility. And the problem we have today is that there is a disconnect between those who are writing the code of AI and the responsibility for what's about to happen because of that code. Okay. And I feel compassion for the rest of the world. I feel that this is wrong. I feel that, you know, for someone's life to be affected by the actions of others without having a say in how those actions should be, is the ultimate, the top level of stupidity from humanity. If this conversation resonates with you, and you want to ensure you're staying at the forefront of innovation, AI and more, register to secure your spot at one of the Gartner IT Symposium Expos.
Starting point is 00:22:00 There are hundreds of sessions taking place based on thousands of hours of research and conversations within the IT community and all led by Gartner experts. Connect with visionaries, network and dive into the big trends shaping tech so you don't get left behind. Spaces are limited, so register yourself by visiting gartner.com. That's gartner.com.
