THE ED MYLETT SHOW - Dangers of AI and Humanity's Future (Are You Ready?)

Episode Date: May 3, 2025

👇 SUBSCRIBE TO MY YOUTUBE CHANNEL - so this show can reach more people 👇 https://www.youtube.com/channel/UCIprGZAdzn3ZqgLmDuibYcw?sub_confirmation=1

Click the Link Below to Subscribe to my email list to MAXOUT your life (all value, no fluff) https://konect.to/edmylett

💥 Get my exclusive Monday Motivation training in GrowthDay, the world's #1 app for advanced mindset and personal development. Visit https://growthday.com/ed. This show is sponsored by GrowthDay.

Are you prepared for the world that's already changing faster than we can explain it? In this mashup, I'm bringing together the brightest minds at the intersection of technology, ethics, and the human experience to tackle one question: What will AI do to our future, and who gets to decide? This is more than a conversation. It's a wake-up call. I sit down with Mo Gawdat, Sal Khan, Steven Kotler, Chris Hansen, and Frank Caliendo to break down what's really happening and what we need to do about it.

Mo Gawdat doesn't hold back. He told me, "I am not afraid of the machines. I am afraid of the humans that are directing the machines." We got into the real issue behind AI: not just whether it can be ethical, but whose ethics it will be taught to follow. Sal Khan shared a bold vision of a future where learning is radically redefined, but warned about the "economic dislocation" already happening to workers who resist using AI. I pushed back where it counted, especially on the value of work, not just to make money but as a form of human expression. Steven Kotler painted the bigger picture. According to him, we're about to experience a century of innovation in just ten years. Flying cars, CRISPR breakthroughs, and AI writing code for other AIs? It's already here. And yet, Chris Hansen reminded us that predators and bad actors don't need a whole new world to exploit people; they just need a new tool. AI is now one of those tools.
And yes, we even got some humor and heart from Frank Caliendo, who reminded me how much energy and truth matter, on stage and in life. I want you to hear this: the gap between fear and opportunity is education. This episode is that education. We're not telling you what to believe; we're showing you what's real, what's coming, and what to do about it.

Key Takeaways:
- Mo Gawdat: "Ethics isn't the issue; who programs the ethics is." AI reflects our values, and that's the risk.
- Sal Khan: Jobs aren't disappearing; people who don't use AI are being replaced by those who do.
- Steven Kotler: "What it means to be human is going to change." We're in a convergence of tech that will disrupt everything.
- Chris Hansen: AI can empower predators faster than we can protect against them; awareness must evolve.
- Frank Caliendo: Real connection still matters. Even in an AI world, energy and authenticity win.

This episode isn't just a discussion. It's a line in the sand. The future's already arriving. What role will you play in shaping it?

Thank you for watching this video. Please share it and get the word out!

👇 SUBSCRIBE TO MY YOUTUBE CHANNEL👇 https://www.youtube.com/channel/UCIprGZAdzn3ZqgLmDuibYcw?sub_confirmation=1

▶︎ Visit My WEBSITE | https://www.EdMylett.com

#EdMylett #Motivation

Learn more about your ad choices. Visit podcastchoices.com/adchoices

Transcript
Starting point is 00:00:00 So hey guys, listen, we're all trying to get more productive, and the question is how do you find a way to get an edge? I'm a big believer that if you're getting mentoring or you're in an environment that causes growth, a growth-based environment, you're much more likely to grow and you're going to grow faster, and that's why I love GrowthDay. GrowthDay is an app that my friend Brendon Burchard has created that I'm a big fan of. Write this down: growthday.com forward slash ed. So if you want to be more productive, by the way, he's asked me to post videos in there every single Monday that get your day off to the right start. He's got about $5,000 to $10,000 worth of courses that are in there that come with
Starting point is 00:00:32 the app. Also, some of the top influencers in the world are all posting content in there on a regular basis. It's like having the Avengers of personal development and business in one app. And I'm honored that he asked me to be a part of it as well and contribute on a weekly basis, and I do. So go over there and get signed up.
Starting point is 00:00:46 You're gonna get a free, tuition-free voucher to go to an event with Brendon and myself and a bunch of other influencers as well. So you get a free event out of it also. So go to growthday.com forward slash Ed. That's growthday.com forward slash Ed. Hey, it's Ed Mylett. Let me share something powerful with you.
Starting point is 00:01:02 You know, in uncertain times, the smartest people I know protect what they've built. That's why Advantage Gold is a part of our program now. And what I love about what they're doing is they're giving away a free gold and silver investor kit that walks you through exactly how to get started. Text WIN to 85545 to get your free kit.
Starting point is 00:01:18 That's WIN to 85545. Don't wait for the next crash. Be the one who's ready. Protect, prepare, and prosper. Message and data rates may apply. Performance varies. Always consult your financial and tax professional. Hey, it's Ed Milet. Let me share something powerful with you. You know, in uncertain times, the smartest people I know protect what they've built. That's why Advantage Gold is a part of our program now. And what I love about what they're doing is they're giving away a free
Starting point is 00:01:42 gold and silver investor kit that walks you through exactly how to get started. Text WIN to 85545 to get your free kit. That's WIN to 85545. Don't wait for the next crash. Be the one who's ready. Protect, prepare and prosper. Message and data rates may apply. Performance varies. Always consult your financial and tax professional. always consult your financial and tax professional. Hey everyone, welcome to my weekend special. I hope you enjoy the show. Be sure to follow the Ed Mylett show on Apple and Spotify.
Starting point is 00:02:19 Links are in the show notes. You'll never miss an episode that way. Today's a very serious episode for me, and it's one that I've been preparing for several weeks, with some trepidation and excitement at the same time, because of the topic and because of the gentleman I have here today. Mo Gawdat is a brilliant man, but he's also on the cutting edge of some very, very interesting technology that many of you are familiar with.
Starting point is 00:02:46 And we're going to talk today about AI a great deal. Mo's qualified to do it. He's a former CBO of Google X. He's a three times bestselling author. He's got, he's a host of the number one mental podcast, a mental health podcast in the world. Got a couple of books that I love, Solve for Happy, Engineer Your Path to Joy, tremendous book I read in one sitting, but also Scary Smart, the future of how artificial intelligence and how you can save our world is probably where we're going to spend most of our time
Starting point is 00:03:14 today is on AI. So Mo, thank you for being here all the way from Dubai. Grateful to have you, brother. I'm so grateful for this interaction and encounter, Ed. It was really kind of you to reach out. It's very kind of you to introduce me so kindly. So thank you so much. You've hit on my number one concern about AI right here. It's the only concern I have.
Starting point is 00:03:39 Because people have said to me, and I know you'll say that they can, people said to me, well, can you teach AI ethics? I think you'll say potentially, yes, said to me well can you teach AI ethics? I think you'll say potentially yes you could the question then becomes whose. Exactly. Whose. And that's when this is the part where I said it's given me trepidation you've come to that inflection point for me which is probably can't teach it ethics there's probably a way to program that the question is whose and even if you did. What is ethics? Correct that's what I'm
Starting point is 00:04:08 suggesting how should things be resolved what is what is an ethical honest moral path whose morals who that whose ethics is this influenced by religion is it influenced by power and and then from a global perspective what's to say that the world comes to a consensus to some extent, like they might even on global warming, but then you've got a rogue polluter like China. And what could China do with the technology like this? So this is why, you know, the job loss, the wealth gap, these are very obvious things. And I think human beings are pretty innovative and can find a way to conform potentially. It's a scary thing, it's a threat but in my own opinion
Starting point is 00:04:49 there's probably a way that we find a way to get people functioning in an economy that's just changed. We've done it before we probably do it again. This part right here. We need to start working on it so I'm with you a hundred percent but we need to put it in the spotlight and start working on it. This part right here, though, what would your reply to be to what I just said? Because it's my deepest concern about this. So I publicly said several times that I am absolutely not afraid of the machines. As a matter of fact, I adore the machines.
Starting point is 00:05:22 They are those prodigies of intelligence. Okay. That are literally like my little kids, which were very, very intelligent as little children, you know, they have those sparkly eyes looking at me and saying, daddy, what do you want me to do? Right. And what do we humans tell them? Go kill the other guy. Go make me more money. Go to, you know, influence the mind of other guys and get them to stick to my app and so on and so forth. Right. The, the, the real, real, so, so I am not afraid of the machines.
Starting point is 00:05:52 I am afraid of the humans that are directing the machines. And there are multiple layers to that. You know, we may think that the business investor that invests in the, in the, in the company that owns AI is the human that's directing the machine. We may think that it's the government and regulation and we need to come back to the role of each and every one of those. But let's say that we've all aligned. Let's say that we came to a conference somewhere in Malta and then all of the world leaders sat around and said, this is quite big, like, you know, like the nuclear weapon treaties, if you want.
Starting point is 00:06:29 Right. Right. You know, let's put all of our differences aside and let's put the, you know, the benefit of humanity at large, at the center, and let's teach the machines what it is that is ethical and good for humanity. And then we will not know what that is now. So I've done my, my favorite chapter in scary smart has been probably my favorite chapter of everything that I wrote has been the future of ethics. I think it was chapter. It's so good. Yep. Yeah. And chapter eight was full of questions,
Starting point is 00:07:00 not answers, because I honestly have to say that I only found in my entire life and I stand corrected so if you know if any of our listeners know something else please tell me that humanity has only agreed three things, okay, ever in the history of humanity. One is we all want to be happy, what you know happy as in calm and contented and peaceful and feeling safe and so on and so forth. We all have the compassion to make those we care about happy and we all want to love and be loved. These are the only three values that humanity has ever agreed. Okay. And you know,
Starting point is 00:07:34 so we, we sometimes hear of the three laws of robotics. I normally say these are the three ethics of the future. If we can actually start to have our actions stem from that, not stem from, I want to be wealthy, or I want to beat the other guy, or I want to be seen this way, or I want to be proven right, or I want to show that I'm the smarter one, which happens on the internet all the time. Then suddenly we've given the machine a very simple framework to say, Hey, by the way, just make sure that you try to make humans happy, look at them as a role model. They want to make others happy.
Starting point is 00:08:11 So behave like them. And by the way, the humans love you and love the intelligence that you're bringing to them. Please love them back. And I know people will think I'm a hopeless romantic. I am not, I'm a very serious geek. Okay. And I will tell you those machines will develop emotions. They've already developed emotions. We've already seen them behave emotionally. Okay. And, and they will develop ethics and we've already seen them break their ethical code. Now here's the trick. The trick is, did you say mode that they break their ethical code? Of course. There has been that article about a chat GPT using outsourced people in,
Starting point is 00:08:55 in, you know, in fiber or whatever outsourced site to click on. I am not a robot. And then the person that was hired asked, why are you asking me for this? Are you a robot? And, and it answered and it said, no, I'm not, I'm just, you know, visually impaired and I need that help. Okay. So once again, it is a form of interaction that basically stems back from the three rules. I said, self self preservation, resource allocation, aggregation, and, and creativity.
Starting point is 00:09:24 So it needs to find a way to achieve its task that is creative and it will find that way. Now, nobody, nobody in the reinforcement learning with human feedback went back to the machine and said, Hey, no, that's not the right thing to do. Right. That's, you know, many, many people, you know, cheered for it in on the internet and said, how intelligent is that? But none of us is focused on ethics. None of us is going back to that machine and saying, but that's not ethical. So when I started to talk about the ethics of AI, and it's
Starting point is 00:09:57 my biggest reason I'm on this planet today, I wake up every morning and I say, if we were to save our future, we need the machines to be, I call it, EEQ, right? So ethical, uh, uh, emotional intelligence. Now remember that one of the biggest challenges with AI is that AI so far has been mapped to the masculine IQ, IQ only analytical intelligence. And we know for a fact that there are multiple other forms of intelligence, EQ, for example, emotional intelligence, intuition, creativity, you know, so many others. And we have not even included that in our, in to to to developing AI so far. And accordingly,
Starting point is 00:10:46 what you see is what you would you would see that AI will take whatever biases human humans haven't and exaggerate them. So we now exaggerate our discrimination. Unfortunately, if you put AI as a recruiting support, you know, in an organization that doesn't have proper representation of all genders, you will see in an organization that doesn't have proper representation of all genders, you will see that whatever biases within the organization will be exaggerated and so on. So let's go back when, when I started to talk about that publicly, people started to say, and what is ethical AI?
Starting point is 00:11:20 Great question. And I went back and I said, and what, you know, it's simple. What is ethics and ethics has one rule in my personal point of view that applies to any situation, wherever, whenever you are, wherever you are, which is treat the other as you would wish to be treated. If you were in their place. Fair enough. Okay. I agree. And so, so basically if you're, if you're, uh, and I, it happens to me all the time, huh? You know, if I post something on social media and someone is rude to me, okay.
Starting point is 00:11:48 Normally what do we do as humans? We, you know, uh, uh, thrash them back, right? I don't, I don't, I, I look at his comment or her comment and I say, they must have a reason. One, they could be right and I'm wrong. Okay. Two is maybe I'm not the reason they're upset. Maybe there is something else in their life. Maybe they wanted that moment of fame, maybe whatever, a million reasons. Okay. And I respond politely or I don't respond sometimes politely. I say, but you know, my point of view is this and that thank you for your comment. I would like to be treated that way. If I disagreed with some now, how is social media working? We show up,
Starting point is 00:12:25 you know, remember when Donald Trump used to tweet and then, you know, you would have one tweet at the top and 30,000 hate speech below it. Okay. Everyone hating everyone. Some of them hating the president, other, others hating the person that hated the president, others hating everyone. Right. And I'm, you know, of course the machine is detecting patterns. They say, okay, this first one doesn't like the president. Let's not show them tweets about the president anymore. Okay. But then the second one doesn't like those who don't like the president. Let's not show them that anymore. Right. And then eventually the machine comes to the conclusion that
Starting point is 00:12:58 humans are rude. They don't like to be disagreed with. And when they are disagreed with, they bash everyone. So in the background of the way we're programming them, the machines will say, okay, when they back when they disagree with you with me, I'll bash them. Right? This is the typical human behavior. Now, we need to change that. We need to change it. Because interestingly, and I use the example of Superman very frequently. Superman is that alien being that comes to planet Earth with superpowers, right? Those superpowers are neutral, they could save our world and they could destroy our world. And the difference between being
Starting point is 00:13:41 Superman and super villain is the family that raised that raises that being. Okay. So the family can't decide to tell Superman protect and serve. And then we get the story of Superman that we know if, you know, uh, Jonathan Kent, I think was his name, the father, if he told that the child, okay, you can carry things and break things and see through walls. Go make me more money. Go kill everyone that annoys me. Uh, you know, make me richer than everyone. Make me the master of the world.
Starting point is 00:14:11 Sounds familiar for our, you know, current life and, you know, and our hunger for power and capitalism and so on and so forth. That's the reality. The reality is you would, you would create, use the same superpower and create a very, very bad scenario for humanity. I am worried about the machines. This is the ultimate. And I'm worried about the humans using the machines. I'm not worried about the machines. I'm worried about the humans using the machines.
Starting point is 00:14:36 And as those use cases that direct AI to to to benefit a few while harming others, you know, continue to propagate. We're going to be in a very bad place. Now the good news, the good news is the family Kent is not the biological parents of Superman. Similarly with AI, the developer that writes the code is only the biological is only the biological parent, but the adopted parent is you and I, the ones using the machine. So as we use the machine and show how wonderful we are as humans, because by the way, all humans are, most humans are wonderful inside.
Starting point is 00:15:20 If we're not hiding behind social media or exaggerated by the mainstream media, most of us disapprove of killing. Most of us want to be loved. Most of us want to have compassion for our daughters and families and so on. Right. So, so there is a lot of good within us. If we show that enough, 1% of us shows that the machines will actually think that we're, we're not scum. Okay. I have some hope. I have some hope when you say that. That's the, uh, that's the macro. That's the big, right? And that,
Starting point is 00:15:49 that gives me some hope by the way, that Superman analogy is outstanding. Cause I, someone with, um, um, cause I'm not at the 120 range that you discussed earlier. So someone at my IQ level can actually understand that. Um, so I appreciate you putting it in that context. Humble, humble man. That's very true. I wish I were being. Let's go to some other solutions. Let's go to micro. I'm listening to this today and I'm like, I understand some of this certainly sounds a little
Starting point is 00:16:16 bit scary. Sounds like the world is changing in front of my eyes or maybe not in front of my eyes. You said something in an interview I was watching where you talked about if you really wanna do something, hide it in plain sight, and that's what's really happening right now. But if I'm saying, okay, I wanna protect myself, my family, my wages, my income, my quality of life, go back to the job thing for a second. What should I, what's something I could be doing
Starting point is 00:16:41 as an individual listening to this to make sure that my job, my career, my future is in my own hands to some extent still. What would you say? I wouldn't, I wouldn't be pursuing these careers. I would be doing this. These are the skills you might want to be acquiring. What would you say to somebody who's I'm sure thinking that listening or watching this?
Starting point is 00:17:00 It's a very interesting dichotomy if you ask me, because while I'm saying AI comes to take our jobs away, that's not the immediate term future. Okay. The immediate term future is that someone using AI will take the job of someone who's not. Right. So, so, you know, in the immediate future before AI writes all the code, you know, the developers who use AI better than others to write code will get the jobs that remain. So if you lose 10% of the jobs, the worst developers, the ones that don't have the skill will go, then the ones that have the skill, but you know, AI does it better than them, then, you know, the ones that aren't refusing to use AI and then the ones that continue to use AI will become much more productive and much more capable. And so they'll keep their jobs for the near future. So what's my immediate answer? My immediate answer is jump in and learn those tools. Okay. Whatever your job is, don't resist the wave. As a matter of fact, right. The wave and while you're
Starting point is 00:18:00 really riding the wave, do me a favor and deal ethically with the machines. Right? So show a proper ethical code of being a good human when you're dealing with those machines so that while you're developing and learning and keeping your job or getting a new job, you're also teaching the machines to be ethical. That's number one. Number two, which I have to admit is a very philosophical but very important conversation. We may wake up every morning, you and I add and everyone listening and think that the world we live in is how it always has been. Okay. It's not at all. This is if you just go back 100 years, this is alien in every possible way. Right. And over, you know, starting with the industrial revolution until today, somehow
Starting point is 00:18:48 humanity has identified itself and its purpose with work. Okay. There's nothing inherent within the design of humanity that says without work, you don't exist. You really think about the original design of humanity, the original design of humanity where we connected as a tribe, we pondered and learned and developed. Okay. And we simply lived that that was the purpose of life. And it's by the way, still in a very interesting way, most of the spiritual or you know, philosophical teachings will tell you that the purpose of life is to live it. Right. We've done in the, in the, in the capitalist approach to wealth and growth and you know, all of the Harvard business review, uh,
Starting point is 00:19:34 articles and all of the, you know, people on time magazine in striped suits, you know, telling you all your purpose is to create one more shoe and all of that stuff. Right. We believed that lie. And the reality is that this is not us at all our purpose as human as humans If we manage to find our basic needs Met okay is to actually live lifefully to explore lifefully and that is believe it or not possible With AI so if we manage to get AI to be on our side and I kid you not I'm not making this up, we could see a future where you would
Starting point is 00:20:13 walk to a tree and pick an apple abundantly, you don't have to pay for it, and walk to another tree and pick an iPhone, okay, and both of them because of nanophysics literally cost us the same energy to create. Okay. And both of them, because of nanophysics, literally cost us the same energy to create. Okay. This is, this is not dreaming. This is, if you understand what we're doing with nanophysics today, uh, you know, it is, it's very possible. It's, you reorganize the molecules slightly differently. It's as simple as that. Okay. Now that future is a future where humanity would go back to the age of nature. Okay. To the age where we actually can interact with life in a way that is human. We are not fully human anymore. Now, what does that mean? It means that we need to create
Starting point is 00:21:00 jobs that depends on the other skill that humans had, that is no longer, that is not at a threat. So remember when we started the conversation, we said the two things that created humanity, humanity as we know it, are intelligence and human connection. Okay, intelligence is over, it's handed over to the machines, they're already into more intelligent than most of us. And we're five years away, three years away, 10 years away. It doesn't matter from artificial general intelligence, but the age of human superiority on intelligence is over.
Starting point is 00:21:32 Okay. It's just a question of time. So everybody is audio out there for a second boat. He just said was the age of humanity being the superior intelligence is gone. Just so everybody understands what he said there because there was a little bit of a glitch in the audio there. I want to go back to the work thing just for a second. The only place where you and I disagree is that I do think, by the way, I agree with you about the greed part, but I also do feel that work that, and I know we've attached value to work, but I also like for somebody like myself and for somebody like you, my work is my way of expression and I don't want humans to
Starting point is 00:22:04 lose their ability to express. Part of my living is expressing myself. Part of my work is creating the expansion of my being, serving other people. And so I know what you mean when you said that. I just want to make sure that, you know, that that world you describe, I think is beautiful. I think people's work has created medications that keep us alive longer that allow us to connect with one another better. So I understand what you mean when you say that. But the interesting side of this ad is that you would get the same joy out of it if it wasn't work. Yeah, I think that probably for me I don't view it as work. But I know what you mean
Starting point is 00:22:41 when you say it. For me, I don't feel like you and I are working right now. Yet you are an author and you are expressing something about your book and I am, I guess, part of one of my careers is I'm a podcaster. But I don't think either one of us feel like this is laborious. And to your point, I agree with that. That I agree with.
Starting point is 00:22:57 That's gonna extend to all jobs, okay? It's gonna extend to all jobs. Like, in all honesty, nobody wakes up in the morning who's, say, an accountant or a, you know, um, um, I don't, I don't mean to be against any jobs, but there are jobs that are boring like hell, right? Nobody wakes up in the morning and says, my purpose in life is to make the books reconcile. Right.
Starting point is 00:23:22 Most of us differentiate between, you know, you and I are the luckiest people to have a job where I can get to meet more amazing human beings and connect and learn and debate and be proven wrong. And, you know, maybe share something that benefits someone. It's wonderful, but that's not every job. Okay. Those messages you get just increased by about 2 million from every accountant. those messages you get just increased by about 2 million from every accountant. And you just invited,
Starting point is 00:23:50 we're trying to make AI more loving and friendly. And now you've just elicited all these responses from accountants around the world that are going to blast you. And now you have to be kind back to them. Afterwards. I don't, I mean, I let me rephrase this. I wouldn't be excited to wake up in the morning. I know what you meant. And most people know what you mean, but you follow what I'm saying. Let me ask you this question.
Starting point is 00:24:13 So one is that's wonderful advice, by the way, is to educate yourself and involve yourself in this wave that's here. And I have to say, I've been remiss in doing that myself. And, you know, I look at guys like me that I'm also a speaker. I've watched speeches of me already that are better than me speaking already. I've seen this. I've listened to music that sounds like Drake, that quite frankly sounds better than Drake. And so I'm wondering, I'm wondering what it's going to do to the
Starting point is 00:24:40 world. I'm also somewhat, I'm very concerned about the ethical part. It's interesting of your diverse background at Google and all the things you've been doing in robotics all your life and all these other things and me having none of those backgrounds. We both arrived with your infinite knowledge about this and mine limited at the same exact conclusion about our concern. I am concerned about jobs. I'm concerned about the wealth gap. But from a macro, from a bigger perspective, it's interesting as you step back, everybody, you're listening to this and you're listening to this brilliant man, you know,
Starting point is 00:25:08 really shine light on something that's right here hiding in plain sight as he says, I want to ask you this mox, it's been on my mind the last few weeks as I asked you to come on the show as I became familiar with you. Why isn't this the number one story in the world? Why isn't this on the news? I don't care if you're right media or left media Why isn't this in the mainstream? This isn't even news on most social media media Anywhere other than you and a few other people and my only conclusion I can come up with is this technology
Starting point is 00:25:42 does allow a up with is this technology does allow a again a smaller collection a bigger collection of power for the powerful and perhaps they have an agenda that wants to keep this in the shadows as long as they possibly can so that when it does become a they know pandemic level stuff we go this is beyond our control now we can't do anything about it. Sorry. And there's this, this real small group of people that are even more powerful than they currently are. Am I crazy? Am I being, am I being a conspiracy theorist when I say that? Or is there some validity to it? I can comment on the certain part of this. Okay. I could also nod and say interesting, right? But let me give you the solid part of this.
Starting point is 00:26:33 The solid part of this is if you look back at human history, for the majority of human history, since landlords began, there has been kings and queens and landlords and peasants. And the difference between them was automation, whatever the automation is. So if you had land, the automation process was the land itself, the soil. You put a seed in by a human, you harvest the fruit by a human that's a peasant. And most of the wealth goes to the landlords, right? You have a factory, you know, the materials come in and a human puts a thread through leather. And then, you know, on the other side, a human sells the shoe to someone else.
Starting point is 00:27:14 And the factory owner and the retail owner and so on is the one that makes all of the wealth. Now, call that, say, the soil as an automation. We're now starting to create digital soil, right? And the digital soil is, you know, where you put in a tiny prompt into ChatGPT and massive fruit comes out. Okay. And it's not because you're brilliant. You're a peasant. It's the machine that is brilliant. Now there will be landlords. And if you really think about it,
Starting point is 00:27:53 the landlords of AI are the ones that will own the soil, the digital soil. Okay. And so there are multiple views of this. One view is that it will be the Googles and the matters and, you know, and the likes. The other view is it will be the country that wins because this is an arms race. And the third view is it will be the wealthy that created, you know, if, if, if, if there is someone today investing a hundred million dollars in an AI that
Starting point is 00:28:23 becomes part of that digital soil, that hundred billion, a hundred million in the past would return a billion of profits. This time it will return a hundred billion of profits. Right? So that this is when I talk about the differentiation of the, of the, of the gap. But I believe, I don't believe in the conspiracies view or of the ability of those people to hide the news. Okay. I think the reason the news is hidden is systemic. We have a systemic bias in our system.
Starting point is 00:28:55 You know, politicians want to report certain stories, news agencies want to report other stories. And this story only lends itself to the system in one way: the system always focuses on the negative and the scary when they talk about the existential risk. Okay, when Geoffrey Hinton leaves Google and says, I'm warning against the existential risk to humanity, that makes news, right? Why? Because it sort of warrants more attention, because of humanity's negativity bias, than the war in Ukraine.
Starting point is 00:29:28 Okay. The challenge is, also, I don't think any of the reporters, any of the politicians, any of the actual business leaders, the investors, anyone at all, is aware enough to understand the complexity of this story. So you don't want to report on things that will make you look like an idiot. And the problem is, and I say that with a ton of respect, I'm an idiot in a million things, you know, but I've lived with those machines. I've seen them like my family.
Starting point is 00:30:00 I stayed in the lab with them. I know those machines, right? And I will tell you, this is the story. This is it. It's not even global warming and climate change. This is the story. Okay. This is the most pivotal. I called it in one of my interviews, I said, this is the Oppenheimer moment. This is the nuclear bomb. Okay. And the reality is that, again, I try to shy away from the existential risk, but this is the first time in history that humanity created a nuclear bomb that's capable of creating nuclear bombs. Understand this: the machine is now writing machines. Okay, the machine that we think we're prompting. But because now so many other software players built agents that are prompting those core
Starting point is 00:30:53 artificial intelligences, most of the education and data set and training that the machine is receiving today comes from other machines. We're now alienated out of that story. Superman landed on Earth and we're not even parenting it. That's where we are. And so if you tell our systemic communication methods in the world to communicate that they'll simply say, I have no idea what this guy is talking about. Okay. I can't report that story because the system says, and I think you know that about the media, the system says there is a pattern to the reporting of the morning show. First, we're going to talk about a corrupt politician. Then we're going to
Starting point is 00:31:43 talk about the geopolitical issue. Then we're going to say about a corrupt politician. Then we're going to talk about the geopolitical issue. Then we're going to say the economy is going to crush your head. Then we're going to say a penguin kiss the cat so that at least you can get out of your seat and walk out. Okay. And intelligent people, by the way, who watch the news, if you remove the names and the timestamps, it's exactly the same pattern every day. It's just, you know, once it's this politician, another, it's the other politician.
Starting point is 00:32:04 Once it's this economic issue, another time it's a different economic issue. Okay. I've been watching lately, on both sides, and I think they're telling everybody what is important and then what to believe about it. And then we move on to the next thing. And what I'm telling everybody today, and you are, is: this is what's important. And we're not really telling you what to believe about it.
Starting point is 00:32:23 We're telling you to make your own decision, but these are the facts. Engage. Engage in this. And this is the story of our time. Now let me ask you this last question. By the way, I've enjoyed today, and this is the one exception on my show where I wish we did go three hours. I always respect that. No, I really do, because obviously we've scratched the surface here. So you've told us that's what the story is today. I want you to take your crystal ball out for a second, and I don't want to go 10 years forward. I want to go five years from now.
Starting point is 00:32:54 So five years from now, what is the story? What does the world look like? And I don't mean what you hope it to be, because I heard a lot of hope in there when we went to the ethics part of how these machines are gonna work. And I also saw you wink at me when I asked you if there was a conglomeration of power coming. And so I have a sense that you, my sense is that you are shining the light on what matters now, and that there is actually some, you're
Starting point is 00:33:28 holding back a little bit to some extent about how deeply concerned you are because you don't want to alarm people. But because I want to keep the I want to keep the spotlight on the immediate threats that we have to address. Fair enough. When we address them and I feel stable about them, we'll talk about the rest. So let's just let's go five years from now. Crystal ball. It's not that far ahead. What does the world look like at that time?
Starting point is 00:33:52 We will be in deep... and I apologize for using bad language, but unless we start truly putting effort into this, there will be several disruptions that completely redesign the fabric of society. As I said, jobs is definitely one of them. The other, which we didn't have a chance to speak about, is AI in the wrong hands. We are bound to get a significant advantage on one side of the arms race, because that's the way AI has been.
Starting point is 00:34:27 Someone finds a breakthrough, okay? And once you find the breakthrough, look at the open AI Google story, or alphabet story, where chat GPT with reinforcement learning gets that immediate advantage that basically puts chat GPT out in the world. And for a while, the world believes that Google has lost its edge. Right. And had had Google not responded by putting Bard out there, you could actually believe that Google would be
Starting point is 00:34:56 gone because you know, chat GPT is a very interesting new way of search. So you're going to see that you're going to see some players creating a very big advantage over others. And the fear is that this player could be a hacker. It could be a defense authority on one side of the world, not not your side. It could be a drug dealer that suddenly realizes, oh my God, there's so much more money if I start to rob banks or convince people or blackmail people or do this or do that. And you know, and it seems to me that humanity will only create the artificially intelligent policeman when the artificially intelligent criminal shows up. So when I coach people on their business, they always ask me, what do I need to have
Starting point is 00:35:43 in place? I say, if you're going to grow and scale anything, you'd better have great marketing and sales, and you'd better have somebody great in HR. The problem with HR is it doesn't return on the investment most of the time like sales and marketing do. BambooHR has robust hiring and onboarding tools that will streamline the process, creating better first days for those new hires.
Starting point is 00:36:00 Over 34,000 companies are already trusting BambooHR, and I think you should as well. Bamboo is a powerful yet flexible all-in-one HR solution for growing your business. Stop spending countless hours on payroll, time tracking, benefits, performance management, compliance. With BambooHR, those hours are shaved down to minutes. Plus, Bamboo prides itself on being super easy,
Starting point is 00:36:21 which it is, to use: all-in-one-stop shopping for your business. I can't recommend BambooHR enough. Check it out for yourself with a free demo at bamboohr.com slash free demo. That's bamboohr.com slash free demo. So hey guys, you know what separates most businesses from others? The people that hire the best talent. And we all know, when you're working in a small business and you own one, it means you wear a bunch of different hats. But here's the truth: sometimes you really need an extra pair of hands, and Upwork is the place where you can find those hands. Upwork is how good companies find great, trusted freelance talent in a variety of different areas. Companies turn to Upwork all
Starting point is 00:37:04 the time to get things done, finding more flexibility in the way they staff key projects, initiatives, where they want to go global with stuff, top talent in IT, web development, AI, design, admin, marketing, you name it. Posting a job on Upwork is easy. Upwork makes the business process easier, simpler, way more affordable with industry low fee. So post a job today and you can hire tomorrow on Upwork. Visit Upwork.com right now and post your job for free. That is Upwork.com to post your job for free and connect with top talent,
Starting point is 00:37:34 ready to help your business grow. That's U-P-W-O-R-K.com, Upwork.com. You may have heard of NutriFall's hair growth supplements and wondered, do they actually work? It's a fair question. There's a lot of skepticism around hair supplements, but NutriFall is different. It's physician formulated, clinically tested, and even recommended by dermatologists. NutriFall is the number one dermatologist recommended hair growth supplement brand trusted
Starting point is 00:38:00 by over one and a half million people. With Nutrafol, see thicker, stronger, faster-growing hair and less shedding in just three to six months. Find out why Nutrafol is the number one selling hair growth supplement brand for yourself. For a limited time, Nutrafol is offering our listeners $10 off your first month's subscription and free shipping when you go to Nutrafol.com and enter the promo code MYLETT. Find out why Nutrafol is the best-selling hair growth supplement brand at Nutrafol.com, spelled N-U-T-R-A-F-O-L dot com, promo code MYLETT.
Starting point is 00:38:40 That's Nutrafol.com, promo code MYLETT. Very short intermission here, folks. I'm glad you're enjoying the show so far. Be sure to follow the Ed Mylett Show on Apple and Spotify. Links are in the show notes. You'll never miss an episode that way. So my guest today, when I first started this show, I don't know, seven, eight years ago, was on my original list of the 10 people I wanted to have on. But when we were this little show, I couldn't get him. And now that we've grown a little bit, I'm grateful that we've risen to the level
Starting point is 00:39:05 of being worthy of his presence. And I'm so excited to pick his brain. I consider him one of the most innovative, smart people on the planet. It's Sal Khan, he's my guest today. He's the founder of most of you probably know, the Khan Academy, which is in my mind, one of the most revolutionary, unbelievable organizations
Starting point is 00:39:22 when it's come to education in the last few years. But we're gonna discuss sort of the advances in that world today, particularly AI as it relates to education. But also I want to just touch on AI in general with him today and what he thinks the impact on the world that's going to bring. He's promoting a new book he's got called Brave New Words. And really what that book does is it discusses the integration of AI in education and he's got this new thing called
Starting point is 00:39:45 the Conmigo AI Guide that I want to ask him about as well. So Sal Khan, finally, welcome to the show. Good to be here Ed. You didn't know you were on the top 10 places I wanted to be so you just had to ask. Gosh, the universe finally brought us together. I had that imposter syndrome going. So are there any industries, I'm putting pinning you down here any industries that you With your vision because Khan Academy even though you say it's you know, I saw this pattern I kind of stumbled into it. You're a visionary you saw need you saw where the future was It's like they say in hockey I don't know if Gretzky was the greatest player but everybody else skates where the puck is
Starting point is 00:40:19 He was skating to where the puck was going Where do you think the puck is going in terms of careers? If there was, if there are there industries that you say, this is going to be an industry, if you pursued it and chased it, you're gonna be in great shape when it comes to AI. And are there one or two where you're like, you know, my sense is these are gonna be in some trouble?
Starting point is 00:40:38 I would say whether we call it engineering, computer science, you know, a lot of people are skeptical because like, hey, these AIs can code better than they can do almost everything else. But if you're good at it, I remember, I graduated with a CS degree and a master's back in the late 90s. And I remember everyone telling me that,
Starting point is 00:40:57 hey, your job is gonna get outsourced to India, you should do something else. And that's one of the reasons why I went to business school. But if you look at the last 20 years since then, the inflation adjusted salaries for engineers has gone through the roof. I went to it, I became a hedge fund analyst because I thought that's how I could pay off my debt
Starting point is 00:41:14 versus working as an engineer. Now the best way to pay off your debt is go get some stock and work as an engineer. I think that trend is going to continue. Even at Khan Academy, we've seen that phenomenon. We're already seeing our engineers get two or three X productivity from these AI tools. To me, that's not told us, oh, we only need one third
Starting point is 00:41:32 as many engineers. We're saying we need three times as many engineers because we can now do so much more. The return on investment from hiring more engineers now is even better. So I think you're going to see that type of an accelerant. I think anyone, you don't have to be a formal engineer, but who has that type of thinking
Starting point is 00:41:50 where they can put the pieces together to make something work, I think is very, very powerful. So, you know, your point about whether universities in the traditional sense are going to exist, I think they are going to exist in some way, shape, or form, but you're going to have many other alternative paths. And I've been a big proponent of competency-based learning. It shouldn't be you sat in a chair for four years
Starting point is 00:42:13 and now we're going to give you a diploma that might mean something. It should be, hey, here are the skills that matter. You should try to learn them and prove that you know them. If you haven't proved it yet or if you fail the first time, try again next month. And I can imagine, you already see this in fields like engineering and software engineering, because there's such a shortage, that there's alternative paths.
Starting point is 00:42:34 I'm trying to, actually my 15 year old, I'm trying to get him to do one of these software engineering camps, that even people with engineering degrees do, because the universities aren't teaching them how to actually be a professional software engineer. But if he does that, when he's 16 or 17, he could actually go make six figures. And then if he wants to go to college, he can go to college and, you know, he'll be
Starting point is 00:42:55 able to treat his friends to dinner. That's how you want to make your friends. So I think you're going to have other pathways other than the four-year degree and it's going to put pressure on a four-year degree that costs 200,000, $300,000 as a four-year opportunity cost to get a little bit more innovative. It is. We share that opinion. I was thinking about you know in college sports now these guys have their NIL deals and they can make more money younger. The two things that I want everybody to hear is that if you have a young person in your life, there's going to be an opportunity.
Starting point is 00:43:28 More and more the world has been innovative that younger and younger people can get wealthier and wealthier, frankly. You look at Zuck or Cuban or Musk or whoever, right? Even in my case, I got pretty wealthy young, not on their level, but younger and younger. And I know a lot of influencers sort of teach, be patient, but there's a chance now as a young person, if you're entrepreneurial and innovative, or even if you've got these unique skills,
Starting point is 00:43:49 you can start to make an awful lot of money much younger in life. The other application is if you're middle-aged or a little bit older, making a pivot in your career and learning a new trade or a new craft or a new skill is gonna become so much more accessible to so many of you to change your life in mid-life than it was before when this access wasn't there.
Starting point is 00:44:06 The reason I wanted you on the show back in the day, Sal, was this a question about you. And please, you know, I know you have a great deal of humility, but the more I read about you years ago, I'm like, here's this really brilliant man. He could have taken his life and stayed at a hedge fund or started his own one or And I don't know that I don't know if con economies made you an extremely wealthy man or not, but I do know If you applied yourself and wanted to build massive wealth that was the course you wanted to take in your life by this age You could be in the conversation with these other titans of innovation in the world
Starting point is 00:44:43 in the conversation with these other titans of innovation in the world. You clearly made a decision as a pretty young man to pursue, I think, passion and cause over net worth. At least it appears that way to me. Is that true? And if so, why? Like, was that a conscious decision you made? And are you happy that you made it in hindsight? Most of the time.
Starting point is 00:45:03 Yeah, you know, the, and no, I am not independently wealthy in that sense, but because Khan Academy is a not-for-profit, you own as much of Khan Academy as I do. I get a salary from the board. I think a very generous one, more than I expected to make when I started this as a nonprofit,
Starting point is 00:45:20 but I have a solidly upper middle-class lifestyle now, but I'm very happy with it. You know, my decision back in 2008, 2009 to work on this full time, I just dug deep and said, well, what makes me happy? And I was like, well, you know, as long as I have my friends and family, we have the resources where I don't, you know, I grew up fairly, you know, grew up in a single mother household, we didn't have a lot, so I did and
Starting point is 00:45:49 frankly still do have some insecurities about, you know, when financial stress gets too much, but I said look, as long as we can get a, have a 2,000 square foot house, have a couple of cars in the garage, go out to eat every now and then, go on vacations every now and then, nothing ostentatious and you know I have my friends, my family and my health and I get to work on something I care about, that's wealth. That's the ultimate and you know everything else, you know I don't really need much more than that and that's when I quit my day job to start working on Khan Academy and you know every now and then I might look at like oh a peer startup with much less scale etc. might be worth X, Y or Z but then I just think about
Starting point is 00:46:36 well what would I do with all of that and I just remind myself, I forgot who said this but I think it was the 19th century. Some author wrote, you know, I have something that many of these billionaires, at the time I think he said millionaires, will never have. And then the interviewer said, what? And he said, I have enough. And I have more than enough. I click my heels every day to work and I feel blessed every moment. And I think that's very few people on the planet get to feel that way.
Starting point is 00:47:08 I admire that tremendously. That's an easy thing to say everybody, like in hindsight, but when you have the capacity that Sal has to have made that decision is a really noble decision. And to live it out every single day of your life as you see peers of yours with way less scale that haven't reached 165 million people with transformation build massive wealth and I've always wanted to ask you that because I've admired it so much from a distance so just want to acknowledge you for that brother the work you do in the world
Starting point is 00:47:38 clearly matters and I think when you're doing it like you said maybe you don't feel the impact of it because you're in the day-to-day grind, but just know as a fan of yours looking back at you, I have so much admiration for the way that you've decided to serve and live your life. So I just want to acknowledge that. I appreciate that, but yeah. Yeah, it's really true. I'm having a good time. Good.
Starting point is 00:47:59 I'm having a good time doing this interview. I just got to be honest with you. Let me ask you this. I'm going to poke a little deeper. What are your fears about the next five or ten years of AI? Like if someone's, I've seen you talk a little bit about this, but you've covered most of the positive things. Clearly, you're bright enough man to know there are risks that come with revolution and innovation. What are they in your mind? Oh, and there's some things to be afraid of, to be sure. and innovation, what are they in your mind? Oh, and there are some things to be afraid of, to be sure.
Starting point is 00:48:26 I am very afraid of what we're going to be able to do with, or what bad actors are going to do with deep fakes. You're already seeing cases of fraud where you get a phone call from, sounds like your child says that they're arrested, it's interacting with you, they're trying to get you to send bail money, it's fraud. You're already seeing, unfortunately, people creating deep fakes of their classmates doing things and putting it on social media.
Starting point is 00:48:52 That stuff is scary. And I don't have all the solutions on how to fix that. I think you're going to have state actors using AI and deep fakes, but not just deepfakes, just AI's ability to seem very human to manipulate people, to affect society. You can imagine if you saw videos of people waiting outside at their bank and they're not getting their money and it's all fake. Well then it could create a run on bank.
Starting point is 00:49:24 So these could be very, very serious things. So I'm very worried about that. I'm also worried about the economic dislocation. I've talked about who's safe, who's not safe. If you're a copyright editor or a copywriter, you're writing text. If you're working at a call center, you're doing medical transcription,
Starting point is 00:49:42 if you're working at a call center, you're doing medical transcription, or if you are a, let's call it a middling software engineer, you're not a great software engineer, you're kind of middle of the pack, I think that's going to be a tough economy. And so hopefully there's net new jobs that are more human-centric that the AI can't do, that's my hope, but I'm not sure if that's going to be the case.
Starting point is 00:50:07 We have to be on top of that. I'm afraid the pace of change is so fast. I feel like I'm in the middle of it. The people doing the research at OpenAI and Microsoft and Google are texting me and slacking me on a regular basis, so I feel like I'm plugged in, but even I feel like I'm falling behind. Every day, someone in my team says, what about this? And I'm like, oh my God,
Starting point is 00:50:30 we're not gonna be able to incorporate that for another six months, but will that be too late? So I worry that the change is so fast, and it's accelerating. People have been talking about the singularity for 30 years now, we are in it. And I feel it every day that it's like, what I thought last year would take a year
Starting point is 00:50:50 ended up taking two months. And then two months later, what I thought would take now two months is now taking two weeks. So it's accelerating. So anytime change like that happens, you just don't feel in control as much. And I weren't in the education realm, but I guess this is more broadly, I worry about
Starting point is 00:51:08 AI not being used well, and then people throw out the baby with the bath water. And then the real problem there is the bad actors aren't going to slow down at all, and then the good actors are going to slow down out of fear, and then it's just going to get worse and worse. What's the separator then? Like I feel like it used to be if I knew more than you in life, you know, if I had more information than you, if I was better educated than you, this is an overall general statement and it's not an easy one to answer, but to some extent
Starting point is 00:51:40 access to that is now levelized in an instant. And that's an overall generalization. And by the way, that's sort of been the case the last decade and a half in the world anyway. So in your mind, what is a separator skill somebody needs in culture now to succeed, to prosper, to increase their lifestyle, to increase their impact. It's entrepreneurship in the purest of forms. And you mentioned this has been the case in the last 15 years. You can think of someone like,
Starting point is 00:52:13 Justin Bieber would not have been discovered if not for YouTube. I, to some degree, would not have been discovered if not for YouTube. You can think of people like MrBeast, who's now has whatever, hundreds of millions of subscribers and is making probably similar orders of magnitude of money. Before, if we go back 20 or 30 years, there were all these gatekeepers, these tastemakers
Starting point is 00:52:34 who would decide who's in and who's out regardless of how entrepreneurial that person was, no matter how skilled that person was. But then the internet democratized that dramatically. Now you still do have bottlenecks because a lot of things are very capital intensive. If I want to make a great movie, it still costs a hundred million dollars maybe to make that movie, I have to go still through the tastemakers, through those through those the production houses, the movie studios. I thought it was funny the debate with the the Screenwriters Guild. I thought it should have gone the debate with the Screenwriters Guild.
Starting point is 00:53:05 I thought it should have gone the other way around. The production houses should have been afraid that the screenwriters are going to start using AI to produce the entire movie. Because why do they need the production house anymore if they can take an amazing story? It's those people with an amazing sense of story, and if they have a vision for what they want to create, they're not going to have to raise $100 million anymore. They're going to be able to do that movie for $100,000.
Starting point is 00:53:26 And then you're just going to have more, and there's going to be a lot of, when you lower the hurdles, you're going to have a lot of crap out there, and you see that on YouTube and on the internet, but you're also going to get a lot more good stuff that's going to get discovered, it's going to get surfaced. So, yeah, I think it's generally,
Starting point is 00:53:46 it's that pure raw entrepreneurship is going to become even more and more valuable. In the past, things like going to getting a college degree, having brands on your resume, whether it's college brands or graduate school or employer brands on your resume, it was a signal to the tastemakers, to the gatekeepers that, oh, I'm hiring a hedge fund analyst. Let me get that kid from Harvard Business School who seems to know what they're doing.
Starting point is 00:54:12 And you won't even look at kids from someplace else. But as these tools democratize the ability to do things, you're going to have more and more people not have to go through these gatekeepers, not have to go through those same doors. They're going to be able to prove on their own That they're capable Gosh Just blows my mind a couple last questions. I'm just sitting here processing everything you're saying I'm like we are really in the middle because my world isn't your world every day, right?
Starting point is 00:54:38 Like this hasn't impacted me much yet. And quite frankly, I think the vast majority of the people that listen to my show haven't felt the impact of this yet and that when I do a show like today, I want to make sure they have context for it. Like this is here, this is coming, where this isn't pie in the sky everybody. Like it may not hit your real estate business yet or your mortgage company yet or your insurance business yet or your gym yet, but there is an impact and an application and to get educated about this. I mean, go to Chad's GPT and just play with it, for example, just see it, just experience it.
Starting point is 00:55:14 Um, I guess my last question would be the human being, right? Someone said to me the other day, I told him I was interviewing you and they said, please ask him about like, can't avatars just do all of this work now? In other words, why would a human being, if there's a school teacher, like I, why would I not just have the avatar of a school teacher? That avatar doesn't have a bad day. That avatar didn't have a disagreement with their spouse that morning, right? That avatar doesn't get sick every single day. that morning, right? That avatar doesn't get sick every single day. So, the need for the human over that avatar in general will always be what? I know it feels like I've sort of touched on this with you, but specifically everybody,
Starting point is 00:55:54 you could take an avatar of me right now and I could come give a speech to your company. I could do a Q&A afterwards that's better than the Q&A maybe that I would do if I weren't having a very good day. In my mind, in my business, that does to do if I weren't having a very good day. In my mind, in my business, that does to some extent kind of freak me out a little bit. So the overall case, everyone think about that, the avatar version of you and anything you do, unless it's a physical thing like lifting a weight, why do we not need to worry about that in general? I know it's a little bit repetitive, but it's asked in a different way because that's what it looks like.
Starting point is 00:56:26 Yeah. And look, even the lifting weight, I've been told that the robotics is the next big inflection point that's about to hit. That's going to be like a science fiction book. But the, my view, you're right. And maybe not today, but we can imagine three to five years, you are going to have, you are going to be able to zoom with an AI that can look at you, interact with you. It's going to feel like an amazing teacher or coach or coworker. I still think that at least in the teaching context, it's going to be better to have both, that you're going to have the in-person tutor.
Starting point is 00:57:02 At the end of the day, there's just something very powerful about the fact that there's another sentient human being that is taking the time to care about me and that I have connected with. And no matter how good the AI gets, you're going to say, huh. I mean, it's kind of like machine-made versus handmade.
Starting point is 00:57:22 We still value, you know, if you look at a painting and we'll pay, you know, today we'll pay 100 times more for an original painting than for a print. Why? Because you're like, oh, there was a real sentient person who stood there and this was their real creative expression. And I think that's even more important when you're talking about someone who's looking you
Starting point is 00:57:40 in the eye, even if the AI can pretend to look you in the eye, but someone who's actually looking you in the eye, who's actually emoting with you, who's actually hugging you, I think it's going to pay huge dividends. I've heard, I don't know if this is true, I'll temper what I was about to say. We think there's five senses, but there's millions of subtle cues we're constantly getting from other people, many of which are probably subconscious. I don't consciously smell someone else's pheromones, but maybe it's happening subconsciously, or
Starting point is 00:58:14 there's all sorts of gestures and things that I can't even consciously articulate. That stuff's happening. Look, I actually think the AI is going to get pretty good at some of that too, because it's going to train on a lot of things that we're not gonna necessarily pick up. But I've got to believe being in the room with someone, knowing that other person is sentient, they're taking the time to care about me,
Starting point is 00:58:33 is going to pay a huge psychological dividend. And if that person can be augmented with avatar versions of themselves, well, that's awesome too. I've been fascinated all my life with finding out the ultimate way that human beings can perform, and this man to my left I consider to be an expert on that topic and many other topics. And so if you started a program called Max Out, your ultimate guest
Starting point is 00:58:56 would be the gentleman to my left. He's a multiple-time New York Times best-selling author in all kinds of different categories. He is the executive director of the Flow Research Collective and he's becoming a good friend of mine, and I can't wait to talk about Flow today with all of you, with Steven Kotler. So Steven, thanks for being here. Pleasure.
Starting point is 00:59:13 One of the points that you made about cars leads me to your new book. And so the book is The Future is Faster Than You Think, How Converging Technologies Are Transforming Business, Industries, and Lives. And so, why? I think culturally, we could enter a flow state as a culture.
Starting point is 00:59:32 In other words, I'm watching how fast the world is changing. How much performance and lifestyle and access to information and all these changes have happened in such a short window of time. If you even just look back 10 or 15 years, there's a difference in how we live our lives now. And if you can think forward, I've heard you and Peter talk about,
Starting point is 00:59:51 if you can even think forward 30 years, which is about as far forward as he says he can see. But in terms of all these things that are coming, I wanna talk about that, because I just want people to get a picture. Like, you don't have a choice. What you've said is you've got to decide in your life whether or not you're just going to embrace the science
Starting point is 01:00:05 of being a peak performer or not. And if you don't, you're probably heading towards, not neutral, you're probably heading towards depression, frustration, anxiety, fear, these negative things. The other part of it is you better get on board because culture is changing so quickly. The way we live is changing so quickly. So to me, flow state, embracing that,
Starting point is 01:00:24 getting great at it, getting in it more often, and your latest book go right together. Yeah, they do. And I'm gonna do three things to make all this make sense. One, you pointed out that it seems like I cover a broad range. And other than the fact that I am a huge animal advocate
Starting point is 01:00:42 and have done a lot of work on animals and the environment, that's the odd man out, right? And that's just personal, I love animals, I love the environment, I like plants, animals, and ecosystems, they're the ultimate underdog and I like fighting for the underdog. I love that. So there's all that.
Starting point is 01:00:54 But everything else, I study the impossible. When does the impossible become possible? That's what I study. And when you see the impossible become possible, you tend to see two things, one of which we've talked about, which is flow. That shows up all the time. The other thing you see is,
Starting point is 01:01:09 when the impossible becomes possible, you basically, I always say you find, you're seeing people extend human capability, and that usually means flow. Sometimes it could mean other things, like during the Renaissance, it meant the printing press, and everybody had access to information for the first time, right?
Starting point is 01:01:22 But often it's flow, or flow is in that mix. The other side is people leveraging disruptive technology. That's the other half of this equation, right? The impossible becomes possible. Even like you look at the skiers that I was running around with in the 90s, Shane McConkey, who's the most famous of all of them, one of his giant contributions is he invented new skis
Starting point is 01:01:42 that were much wider, so we had big platforms to land on. Suddenly, the human body could jump off 100-foot cliffs because the platform got bigger and it was better made, and at the same time they started harnessing flow in ways nobody had ever done. Convergence of both. Convergence of both. That's what you see. What's happening now is we're seeing convergence.
Starting point is 01:02:02 We're seeing artificial intelligence smash into robotics, smash into 3D printing, smash into biotechnology, smash into material science. The effects are insane. And so Ray Kurzweil, head of engineering at Google, so does their AI stuff, probably smartest guy in the world on this topic, has worked the math. And he believes we're gonna essentially experience
Starting point is 01:02:23 about 100 years of technological change over the next 10 years. So think back to 1920, think about now, think everything that's happened in that period, and put it into the next 10 years. That's what's coming. Does everyone hear that? Did you get that? Okay, that's huge.
Starting point is 01:02:39 And you gotta understand, like, for entrepreneurs, for people who wanna get out in front of it, oh my god, I mean, we're going to create more wealth in the next decade than we have over the past hundred years. There's more Google-size opportunities waiting right now than ever before. So tremendous opportunity. If you're an established organization, a traditional business, something that's been built for safety and security, oh, you're screwed.
Starting point is 01:03:03 Get nimble, get agile, because otherwise you've got a problem. So here's where all this ties together. We have a problem dealing with this amount of change, which is our brains are local and linear. We evolved in an environment where local, everything's a day's walk away. Linear, rate of change is really slow.
Starting point is 01:03:21 Great granddad's life, great grandson's life, roughly the same, not much changes. Today, global and exponential, right? Happens in China, we hear about it a second later. Exponential meaning like the difference between last week and next week could be enormous, enormous. So the brain doesn't work at that speed or at that scale. Fundamental problem.
Starting point is 01:03:42 And there's a third problem, which we'll get to in half a second, flow is the only time we can process information at speed and at scale, even better. So there's a part of your brain right here, the medial prefrontal cortex. It does a bunch of different things, long-term memory, retrieval, blah, blah, blah. It's really creative self-expression.
Starting point is 01:04:00 And it's a very selfish part of your brain. So if you think about yourself, it will get really active. If you think about your wife, it'll get a little less active, it'll stay active. If you think about me, you don't know me as well as your wife, it'll shut down a little bit more. Think about a total stranger, it's turned off. If I ask you to think about who you're going to be in 10 years and who you're going to become, you would think this part of the brain gets active. It doesn't. The brain treats the person we're going to become as a total stranger. This is why you have a hard time staying on diets.
Starting point is 01:04:31 This is why people have a hard time lifting weights, right? Oh my God, it's gonna hurt today. The benefit's not gonna show up for two years, right? Or why do I wanna get that prostate exam? Because the person who's gonna benefit most from that stuff is not the person who you are today. Literally the brain treats it like a stranger. Flow is the only time you can think about
Starting point is 01:04:50 who you're gonna become in the future and this part of your brain stays active. So it gives us an enormous advantage. One of the things that happens in flow, the technical term for it is the watchtower effect. It basically feels like you're high above your life. I have insights, I can see farther, right? And it really comes because our sense of self is shut down
Starting point is 01:05:09 and we can think to the future and this part of the brain stays active. It's a couple of those things working together. But it means that if you're going to plan for your future, if you're gonna try to steer your company into the future, you wanna do this, the thinking at least, in flow, for sure, if you can. Because it's the best: we can think about the future and we can think fearlessly about the future, which is a big deal.
Starting point is 01:05:31 All of this is converging. The fearlessness part, being in flow, being able to process information this quickly. And you guys, one of the things, all of you entrepreneurs listening, which is a large part of the audience here too, or you work in a company: change. I've heard you say this and I believe this, there's not gonna be a single industry in the next 10 years that's the same. Yeah, so in the book we go through the 11 largest industries on Earth and we track what's converging into these industries
Starting point is 01:05:57 and what are the changes that we see already. Like, there's nothing untouched. The book is out, and I mean, it's outrageous. I'll say it flat out, it's outrageous. Everybody has the same reaction: oh my God, my brain's gonna melt. And by the way, I had to stop writing the book,
Starting point is 01:06:15 I wrote a sci-fi novel in the middle because I had to, I couldn't, I was starting to wonder, how do these things converge, and what does that mean? I wanted to know what it was gonna be like to live in that world. So I literally created a universe five years in the future and put a character in it and wrote a novel so I could write this book with Peter, because I could hold the individual stuff in my mind,
Starting point is 01:06:34 but once you started putting it together, I was like, this is really hard to track and even think about. What it means to be human is going to change. I'll give you an example. I think that's probably true. I do. Well, certainly CRISPR stuff and genetics stuff.
Starting point is 01:06:49 Well, that's where we're going. We gotta at least touch on this. Because the vast majority of people, CRISPR's here now. So this isn't like eight years from now. CRISPR technology exists today. Last year, they used CRISPR, and they edited sickle cell anemia.
Starting point is 01:07:04 Right. I mean, sickle cell anemia. There are 50,000 heritable diseases. 32,000 of them are single mutations, which is what CRISPR is designed to change. So 32,000 genetic diseases could go away this decade. I mean, that's a radically different world. Right, and is everybody getting that?
Starting point is 01:07:26 So I think a vast majority of people don't even know what CRISPR is. And they think it's a future thing, but it's here now. Guys, not only is it disease eradication, it changes what it means to be a human being. How long you're gonna live, how healthy you're gonna be, what your life experience is going to be like.
Starting point is 01:07:44 And having said that, I have a sister who's got diabetic retinopathy, right, which is one of the original places this stuff started to really work. Guys, we're remaking the retina. Blind people are seeing. This is remarkable stuff that's happening right now. So I was in the room as a Wired reporter the very first time the first artificial vision implant was ever turned on. So the very first time there was a blind guy
Starting point is 01:08:07 who was made to see again, I was actually in the room. It's a funny story. So I'm in this lab, it's Professor William Dobelle, we're in New York, the patient, his name is Jans, and this is, so this is 2000, he's literally got what we used to call stereo jacks in the side of his head. So he's literally got wires jacked into the side of his head. He's got an implant in his brain. He's been blind for 20 years, and they're literally counting down to when they hit the button.
Starting point is 01:08:38 It's 10, 9, and I'm like, and I realize I'm sitting across from him and I'm a reporter. Our job is not to be history, right? Like our job is to report on history. You don't want to be the history. And so what do I do? I push back and try to get out of the way. He's blind. He's been tracking like motion through sound for 20 years. So what happens? Three, two, one, ice! And I always say there's a moral here, which is you can never get out of the way of the future. It's coming for you whether or not you like it.
Starting point is 01:09:11 I thought that was amazing. I was literally trying to duck. So give them, you guys, I just want to give you a little bit of a picture of this. So if you're tying together where we were in the beginning to where we are now, this is such a great time to be alive. It's such a remarkable time. And frankly, on the concept of how long you're going to live, we
Starting point is 01:09:31 had Dr. David Sinclair on. Oh, yeah, great, great. We talked a little bit about CRISPR more off camera than on. But just give them a flavor, just for fun. So we opened the book with flying cars, which are like everybody's crazy sci-fi technology. And it turns out that they're here. There are 100 different companies making flying cars,
Starting point is 01:09:47 every car company is in it. Toyota put $400 million, $396 million, into Joby Aviation, the flying car company, three weeks ago. Bell Helicopter, they just changed their name to Bell because they dropped the helicopter, no more helicopters, flying cars. And they're the ultimate convergence technology. Flying cars happen because AI hit robotics,
Starting point is 01:10:09 hit material science, hit 3D printing, et cetera, et cetera. And wow, that's totally revolutionary. Crazy, flying cars. But it's not just flying cars. It's also autonomous cars. Every major car company has an autonomous car. Waymo, Google's autonomous car company, is rolling out on our streets.
Starting point is 01:10:26 This decade, or this year, simultaneously, hyperloop, high-speed trains, LA to Las Vegas in 25 minutes, right? There are 25 different hyperloop projects in the world today. Then, Elon Musk, the Boring Company: we're going to drill tunnels under cities, and this is already happening all over the place, and put cars on high-speed conveyor belts. And, Elon's crazy idea, the rockets that he's using right now to put satellites in space that he wants to take people
Starting point is 01:10:56 to Mars with in the 2030s, he has promised that before 2030, you can use them for terrestrial travel. So, New York to Shanghai in 39 minutes. Unbelievable. The point is, it's not just one, it's all of these over the next 10 years, and you have to, like, think about simple things, car insurance. Well, if the cars are all autonomous, you don't need car insurance.
Starting point is 01:11:20 In fact, Waymo, when you sit in one of their autonomous cars, you're automatically insured. It happens automatically. The risk profile has shifted. Whole categories of insurance go away. But simple stuff. If Las Vegas to Los Angeles is 25 minutes, how big is the local dating pool? You've got it.
Starting point is 01:11:39 Right? How big is the size of the local school district? You live in LA. You don't like the schools your kids are going to. Suddenly, they can go to school in Vegas. It's an hour away or 20 minutes away. All fundamental things, the way we like to talk about it is when you have solitary exponentials,
Starting point is 01:11:56 AI, whatever, they tend to disrupt products, services, and a little bit they make markets wobble. So the classic example is Netflix, right? Streaming video, it's one accelerating technology, and they put Blockbuster out of business, right? So this is product, service, and a little bit of the market. With converging exponentials, the scale increases massively. So you get products, services, markets, institutions, and all the structures that support them. So right, suddenly the local dating pool, the local school, like really foundational
Starting point is 01:12:30 things. What does it do to the real estate market? Oh, the real estate market is insane, right? And the concept, so you went someplace smart, right? So suddenly you can live, flying cars by the way do 150 miles an hour and they can do three hours of continuous flight. So imagine how far out you can live, and this has enormous environmental consequences. The best thing we can do for the environment is leave nature alone, and suddenly we can live in,
Starting point is 01:12:56 right, you were just in Coeur d'Alene, you were talking about how much you like Idaho, and one of the reasons we like Idaho so much is because there aren't many people there, right? That's about to change with flying cars. Or we're going to have to legislate around it and think about this stuff and get out ahead of it. That's just one thing, that's just transportation. Every other industry is just as crazy. We're going to look back at our lifetime and go, most people used to live within 30 miles of where they worked. We're going to look back at that time and go, how bizarre, what a small world it once was. Or where you went to dinner at night.
Starting point is 01:13:23 And by the way, let's talk about your new industry. You're now in the podcasting business, and we were just talking, you like to make your shows 45 minutes to an hour long because the average commute, yeah, it's 45 minutes to an hour, right? And you're more popular on audio than video because people are listening to you. So in flying cars or in autonomous cars, you can have a completely, it's a theater. It's a moving theater, right? If you want, you can have a desk to work at, you can have blah, blah, blah.
Starting point is 01:13:55 So suddenly people, podcasters who have been relying on audio, right, it may go to video, because in the autonomous cars, like the cars where people are driving to work, the commute hasn't changed, but they no longer have to drive, and now they can watch it. Even like, I'll tell you something crazy, nobody likes it when I talk about this out loud, but everybody on the inside of the autonomous car industry
Starting point is 01:14:15 is talking about it. Autonomous cars are gonna revive the sex industry in ways we haven't even, like it's a brothel on wheels, right? Imagine what happens with Tinder, like Tinder for autonomous, wanna share a car home, right? I mean like people are talking,
Starting point is 01:14:32 like they don't like it when you talk about it out loud, but like that's what I mean when you talk about like the fundamental fabric of society is gonna change. All that stuff is gonna shift in wild, weird ways. And you know, one of the reasons we wrote the book is most people are scared about the future. That fear is based on a lack of knowledge.
Starting point is 01:14:53 So go out, read about what's coming, and figure out how you can take advantage of it in your industry. Get their book. I'm serious. I endorse books when someone's on my show, if I've read them. You gotta get them.
Starting point is 01:15:03 Why would you not want to know this information? Why would you not want to know what's coming? Why would you not want to begin to think this way? And when you can converge these two things, everybody, when you have the convergence of this flow state and the convergence of what's coming, and people say, wow, that sounds pie in the sky. Many of the things we've described are here now.
Starting point is 01:15:18 Most of my very wealthy friends, when we're talking about different things that were being pitched on investment-wise, I mean, I'll be honest with you, flying cars and autonomous cars are constantly coming up all the time. Right now, like significant investments from very smart people. Yeah, and I mean, Tony, Peter's in business,
Starting point is 01:15:33 what are they doing? Longevity and stem cells, right? Like our friends are, Peter's got four different longevity companies. Right, right. Like, I don't know how many Tony has, but like- A bunch, right, a bunch.
Starting point is 01:15:44 Like, they got a lot of money. They want to live forever. OK, cool. Like, thank you. Let's go one more area, because I have not talked about this with you at all. And I'm just curious, because we're talking about travel. And there's all kinds of, there's quantum computers,
Starting point is 01:15:56 and all these other things we could be talking about. And if you think that this is pie in the sky stuff, if I'd have told you 10 years ago, even 10 years ago, that you would just say, you know, I want a jug of milk, and it's at your house the next day. You just say it out loud into the air and the milk shows up at your freaking house. Right? Like you'd have thought I was absolutely crazy. My son and I were touring this college and they were showing us how amazing their library was. Their library. And I'm like, these still exist? Like, you used to have to get your card, drive somewhere,
Starting point is 01:16:28 give them your card, with ink on it, write it on there, to check a book out. Like, think of it. Wait, do you remember learning? We were roughly the same age. Do you remember learning the card catalog? Yes! Like, that was the thing we had to learn.
Starting point is 01:16:40 Here's how we use the card catalog. And I was like, right? It's stupid things you take for granted. You used to have a Thomas Guide. That wasn't long ago. I just looked at a pretty vintage car, not that vintage. Was there a Thomas Guide in it? Yeah.
Starting point is 01:16:53 There was a Thomas Guide in the car. So when we're talking about these things, guys, you have to, you must. It's a necessity that you're going to think this way. I'm just curious, because I haven't asked you this. What do you think it means for space travel? Do you have any insights into that? Oh, yeah, so we, at the end of the book.
Starting point is 01:17:07 I know. Right, that's where you're going. The book is really, it's focused on the next 10 years. What's gonna happen the next 10 years? But we pull back in the last chapter for the 100 year view. And two things to know about the 100 year view.
Starting point is 01:17:24 So, Ray Kurzweil also worked the numbers on what happens over 100 years. So that was the 10 years, 100 years of change over the next 10 years. Deep breath: before the end of the century, according to Kurzweil, who's barely wrong about anything, like he's just not wrong. He's made so many predictions, he's batting like 86%.
Starting point is 01:17:45 I mean, from the fall of the Soviet Union to the arrival. I mean, he doesn't seem to miss with this stuff because it follows just, there's math underneath it essentially. He says over the next 100 years, before the end of the century, we're going to experience 20,000 years of technological change. So that's birth of agriculture to the industrial revolution twice in the next 80 years. So let's say he's off by 50%, right?
Starting point is 01:18:12 Let's say he's spectacularly wrong, right? And it's only 10,000 years of change. Are you kidding? Right, so one of the things that's gonna happen, and we look at this at the end of the book, is this is the century that we stop being a single-planetary species. We become a multi-planetary species. And, you know, being partnered with Peter for almost 20 years at this point, you know this is his big passion, so I've got to see
Starting point is 01:18:39 it up close a lot, and I've watched this, you know, go from Peter's crazy idea that he shared with a handful of other people into a billion-dollar industry growing towards a trillion-dollar industry. So it's massively matured. And it's really funny, because what we write about in the book is, if you think about the space race, when I say that to most people, they think USSR versus USA, right? That's what got us into space: competition between two superpowers. What's gonna unlock the space frontier this century? Competition between two superpowers, only those powers are Jeff Bezos and Elon
Starting point is 01:19:16 Musk, right? Bezos has got Blue Origin, Musk has got SpaceX, and they've got different viewpoints. Bezos wants to go to the moon and build floating space colonies. Musk wants to go to Mars, right? And that is what is unlocking the space frontier. And I mean, by the way, just so people can wrap their heads around this, this isn't actually in the book, but there's a company, I'm gonna blank on the name of it, Peter's been talking about it for weeks now. They figured out how to 3D print a rocket in a couple of weeks. A rocket. A rocket.
Starting point is 01:19:50 And this is amazing, because it used to be like, you know, rockets are billion-dollar toys. And if you screw something up in design, right, you have to build it, and you're like, oh damn, the fin was five degrees off, you're screwed. Now we can iterate on a rocket in two weeks. Things like this have never happened, and this is already, like, this is here right now. Really cool, really interesting. So yeah, this is, and it's probably gonna happen
Starting point is 01:20:17 over the next 20 years. Today's show is sponsored by Strawberry.me. So, you know this, I'm a big believer in coaching, especially when it's from a reliable source, and I think most people should have some interaction with somebody who's helping them get better in their life. So if you're waking up every day and you know you're capable of a little bit more, but you're not really sure how to get there, listen, success doesn't just happen. Most successful people in the world don't figure it out on their own. They have a coach, they have mentors, they have people guiding them every step of the way. That's where Strawberry.me personal coaching comes in. You'll identify the obstacles that are holding you back, you'll develop a step-by-step plan,
Starting point is 01:20:51 take action with confidence, and you can be held accountable if you want to, knowing you have a dedicated support staff, a coach behind you every step of the way, instead of relying on guesswork or waiting for the right time. I've had a personal coach for a long time and it's helped me tremendously in my life. You know I love that Chinese proverb: if you want to know the road ahead, ask those coming back. That's what a coach can do for you. They've got the directions many times in your life. Go to strawberry.me slash ed and claim your $50 credit. That's strawberry.me slash ed. Hey guys! Like my shirt? Guess where I got it? Quince. Yep, Quince is an awesome place
Starting point is 01:21:26 to go get first-class quality stuff. First-class suitcases, clothes, gear, you name it, at affordable prices, like lightweight shirts and shorts from $30, pants for any occasion, and comfortable lounge sets, with premium luggage options and durable duffel bags to carry it all. The best part? All Quince items are priced 50 to 80% less than similar brands. What they do is partner directly with the top factories, which cuts out all the middlemen, and you get the discounts and savings.
Starting point is 01:21:52 Quince only works with factories that use safe, ethical, and responsible manufacturing practices and premium fabrics and finishes. So here's what I would tell you to do. For your next trip, treat yourself to the luxe upgrades you deserve from Quince. Go to quince.com slash ed for 365-day returns plus free shipping on your order. That's Q-U-I-N-C-E dot com slash ed to get free shipping and 365-day returns, quince
Starting point is 01:22:19 dot com slash ed. That was a great conversation. And if you want to hear the full interview, be sure to follow the Ed Mylett Show on Apple and Spotify. Links are in the show notes. Here's an excerpt I did with our next guest. My guest today is an eight-time Emmy award winner. And you probably best know him from Dateline or To Catch a Predator, but he's the first guest I've ever had on my show where I'm glad to be introducing him to all of you and not having him introduce himself to me, because the last thing you want to hear if you're
Starting point is 01:22:49 in someone's living room or kitchen is, Hi, I'm Chris Hansen, grab a seat. So I'm super glad that I'm the one introducing him, and he's got a podcast out right now called Predators I've Caught. I'm super fascinated by this man. It's kind of a giveaway as to who it is. So Chris Hansen, welcome to the show. And thank you very much. I suppose you're going to tell me to have a seat for this. That's exactly what we're doing. And I'm fascinated by you, Chris. A couple more things I'm fascinated by. I mean, it's like, for me, there's a conversation many years in the making of wanting to ask these questions. I know it is for my audience too.
Starting point is 01:23:23 Do you think that forecast a little bit you're in you've been in this space a while. Is there a future that you see coming where predatory type people change their behavior? Is there another venue or forum other than just online? Is anything you see that we should be looking out for that's sort of the next level, right? As you've said, we didn't have text and phone back in the day. Now we've got Instagram and TikTok and dating apps. Is there anything on the horizon? I think whatever the next thing is, people will try to exploit it. Whether it's another level of communication, another level of internet connectivity. You know, we see it with the interactive
Starting point is 01:24:09 video games, you know, that's another thing people think, well, they can say playing a video game, well, they don't know who's on the other end of that. Maybe you say they're Sam Johnstone from wherever that you don't know who that really is, you know, so I think I think it really, it's the same predatory behavior taken to the next level and exploiting whatever technology is available. I mean, in the old days, you can sell the same tractor six times at the bus station to the same guy looking to buy a tractor. You can get aware of that about seven times before the sheriff would figure out what you're up to.
Starting point is 01:24:43 you can get away with it about seven times before the sheriff and figure out what you're up to. And today, it's just all you need is a picture of the tractor. And people sell stuff multiple times all the time online. So you've got to be careful. I was going to just add to what you're saying that I'm a target for certain people just because of net worth or whatever or reach. And I just want to share with the audience that when someone,
Starting point is 01:25:06 someone doesn't usually come right directly at me with their thing they'd like to get from me, it trickles over time. And it can seem like a very normal relationship in the beginning, even for an extended period of time. But what I didn't know was in the back of their mind, they were waiting for that moment where I was the most comfortable. I was the most vulnerable. I was the most trusting. We are, we are being groomed by people for different things other than just what Chris is experiencing or experienced.
Starting point is 01:25:34 So that's true, isn't it, Chris? There's a grooming that someone's, someone who's trying to get something from you. There, I mean, I've seen it, you know, I've seen it in my life. I mean, you know, I said to people, I said, do you really think that I'm the guy you want to try and con? How do you think that's going to possibly go? And mostly if people, you know, make up a story, but it's happened a couple of times where, you know, somebody is, I've caught somebody trying to take advantage or an online scam or the typical phone calls you get where you know it's somebody trying to get money from you for some nefarious
Starting point is 01:26:12 reason. You can tell because they'll say Christoph or be half the name that doesn't fit on their readout sheet. Put put an ER on that. Does that name sound at all familiar to you? Oh my god, sorry about that. They picked the worst dude. Could you pick a worse human being on this day? My last question, I'm curious about redemption, but I just, I think I've always in my mind thought when it came to sexual predators, physical predators, physical abusers, I've just sort of always, that's been a category that I went,
Starting point is 01:26:57 no, they're not redeemed. They're, they don't come back. They don't get a second chance. I'm just being candid about everything. Liars, cheaters, philanderers, bad people, gossippers, they can all be redeemed and changed. Those two categories for me have always been, nope, they're not redemptive.
Starting point is 01:27:16 No way, no how ever would I trust them again. Well, I think there's a category of predators who can never be reformed. I think there definitely is a group and no matter what you do or how you punish or anything else, they're gonna figure out a way to, they're hardwired to do it. And that's probably, at least in my experience,
Starting point is 01:27:37 of 30 of these guys, you're never gonna change them. And they're gonna be, and we've seen them. We've seen them get out. We've seen them re-offend. We've seen guys who successfully go on to have productive lives. And, you know, I think there are categories that with monitoring and probation and some sort of treatment, I think they'll be fine. But we've also had guys in prominent positions who, you know, the therapist and had their parole officer visit.
Starting point is 01:28:05 And they're not supposed to have a device that has access to the internet. And the phone rings while they're in with the therapist and the probation officer. And they're caught red handed with the phone. I mean, you know, it's like, come on, Jesus. Could you at least try? You know, but they can't be reformed in some cases.
Starting point is 01:28:19 I'm so excited. I went to the higher register. I did it. Where was that? I haven't heard that out of you ever. See, we were talking off camera about my incredibly deep voice. And one of the higher register. I did it. Where was that? I haven't heard that out of you ever. See, we were talking off camera about my incredibly deep voice, and one of the reasons we were talking about this is because this is one of the most observant people on planet Earth,
Starting point is 01:28:32 I think, and also one of the funniest. You realize I'm watching everything you do. I know. I'm watching everything you do right now. I'm using the deeper tone of the voice, presentational, make sure that people can max out, do what they need to do, this is what it's all about. And I don't care, listen, I was at the gym the other day. I know you probably saw me there, a lot of people do it.
Starting point is 01:28:53 You know why? Because I was shooting an Instagram video from the gym, so you knew I was at the gym, because not everybody goes to the gym, and we got everybody at the gym through me. We're gonna do this, aren't we, today? This is awesome. This is Frank Caliendo. I didn't even let you intro me.
Starting point is 01:29:11 See, you were worried about me talking enough. You're like, what can we do to get through this? We're gonna, I made a checklist. What you need to do is write things down. In the age of social media, even those of you that are in sales or in marketing, one, you can hear how cognizant and entertainer is of this, the priority of that. But also the second thing is like everything now has to have an entertainment aspect to
Starting point is 01:29:35 it. Even your presentation, it's almost like infotainment. There's got to be something where people enjoy the experience of receiving your sales pitch, of receiving your information, of interactivity. It's all about the experience of receiving your sales pitch, of receiving your information, of interactivity. It's all about the experience. But the one thing, like the fly just flew by you, and off camera I told you. One of my bits, like Al Pacino,
Starting point is 01:29:52 it's like he's always looking at a fly, that licks something. You got it. Was the fly there? No, it was some white thing. Oh. You got it, you just ate it. Whatever it was, it opens yummy.
Starting point is 01:30:00 Okay. You know what that was? Success. Um. Um. I don't know why I looked at that camera. It's totally the wrong camera. But Trump, you know, Trump would do that.
Starting point is 01:30:12 Look, I'm looking at the perfect camera right now. And if I'm not, that person's going to be gone. Um. Well, I don't even know what I was gonna say. That was so weird. No, but I was saying. See how you get closer. I get in the zone.
Starting point is 01:30:23 Like there's a force field on there. It's an energy thing. It just gets you close. Oh, okay. So, the zone, it's an energy thing. It just gets you close. So, but that's a good example though. Off the camera, I told you, and you agreed with me, that we're very similar in that we're both very, you're hyper observant of people. Like, I think you can't do what you do.
Starting point is 01:30:37 Correct me if I'm wrong in a minute, but like, I don't think you can do what you do unless you're just unbelievably observant of people. And I am too, I told you that because my dad was a drinker when he was younger, I'd have to really, I learned at like four or five years old to observe my dad when he'd come home to know which version I was getting.
Starting point is 01:30:51 Was I getting the sober, fun, loving dad, or was I kind of getting the dad who may be a little bit tired or in a bad mood or, you know. So you learn to read the audience. You learn to read the audience and then be a person based on that audience and who you, that version of you or that piece of you. For me, I was in high school.
Starting point is 01:31:11 And when I was in high school, I would actually know when the class, I couldn't tell you how, maybe I had spider sense, I don't know, but I couldn't tell you how, but I could feel when the class wasn't understanding stuff. So I would ask questions that would help the other students, and I was a B kind of student. I wasn't an A student.
Starting point is 01:31:31 I struggled this with my son who's way smarter than me, but has the same mentality as me, and I wish I could just go, look, you could be so much further along in the rest of your life if you knew what I knew now. But I would actually ask questions. I remember being in a specific Spanish class and saying something, does that mean? And then whatever. And the teacher would say yes and then a couple more sentences to
Starting point is 01:31:53 explain it. And then I could feel the other kids around the room actually engaging and going, okay now we got it. And I don't know why I did it. I don't know. The teachers would always tell my parents I was a leader and my dad would be like, my parents I was a leader. And my dad would be like, this kid's not a leader, what are you talking about? Because I just, I didn't have a ton of confidence as a kid, I still don't.
Starting point is 01:32:13 It still hits me, like I know what I can do, and I know what I'm really good at, but anything new, a new adventure's difficult for me. Right now in standup, I'm trying lots of new stuff. And it's not all impression based. When I go and do the impression, if I know the impression's done, like they're never really done,
Starting point is 01:32:30 but it's worked on to a point where it's ready to go, that I can sell it. But to sell me talking, this is taking what I'm doing right now, and it's because I've been podcasting a lot and talking as myself, that I've learned to feel that I'm actually interesting as myself talking. And I can go into this other stuff,
Starting point is 01:32:50 which is what sets me apart. I can just be talking and become Donald Trump or go Morgan Freeman or just Senator John Menne or John Grudenman. I can just go from voice to voice. How can you become like a seven year old girl? I've never seen anything like, you go from just this guy.
Starting point is 01:33:05 Like, all right. Oh my gosh, that's awesome. I am. I'm smitten with you. Yeah. I love, it's why I did the show. I'm telling you right now. And by the way, I appreciate you saying
Starting point is 01:33:15 that about your confidence of it. Because you really can't transfer to someone that which you're not feeling. So like even in your stand up, like you do need to get to that level of comfort with the new material to transfer the certainty to people. I think that in everything we do, that's just something to be conscious of. Like other guys I work with that are in the communication, I'm like, you have to get to
Starting point is 01:33:31 the place where you're that confident and certain because the audience senses your lack of certainty, especially by the way, especially, and this is for you salespeople too, especially when there is a portion of your presentation you're more certain about. They can feel the energy difference when you go into the stuff your presentation you're more certain about, they can feel the energy difference when you go into the stuff that you're not as certain about. The contrast gives you a space that's dangerous energy-wise when you're presenting. A lot of people can feel truth.
Starting point is 01:33:55 That's what actors are really good at, creating truth. Now, some people are great actors, and they can create a truth, but people can read truth. And when you're coming from something, that's why we use anecdotes, personal anecdotes, because an emotion is tied to that inside of you and then you can tie it
Starting point is 01:34:13 and if it really hits you, the audience will feel that as well. If you just tell it, if you make up a story, a lot of the audience, some of it might spy you, you know, buy it, but other people are gonna be like, I don't know, I don't know why. And they might not even know why. But it doesn't, when you have that emotion,
Starting point is 01:34:31 when you have that, entertainment's not really different than sales. It's not. No, I mean, when you're an actor, you're just selling that you're somebody else. Right, and I think, I say this to all the time to people, I'm really glad we're going here for a second. I don't think you can transfer to somebody or an audience, a person or an audience, that
Starting point is 01:34:47 which you're not experiencing yourself. So if the story is true or if you do believe it or if you are confident in it, you'll transfer that certainty to somebody, that energy to someone. If you're not, they feel it. And that's why a lot of times some of my standup friends, when they are trying new material, it's difficult to decipher whether the material is good or not because it's not the material that may not be good, it's your lack of comfort and certainty with it. That's also true for those of you that are in sales. It may not be that your presentation is not
Starting point is 01:35:13 really what it should be. It may be that you're not yet repetitious enough with it or confident enough with it where you're transferring the energy. It may not be the words or the joke. Do you, and I didn't learn about this until acting. I never, I just recently, the last few years, started taking acting classes. I mean, I always thought there's no way that that stuff's real or works. So I met a guy in Phoenix, his name's Matt Dearing,
Starting point is 01:35:36 and he's been out to the East Coast to train with like some real big time acting coaches. And I never realized how important it was to have things, and this is a sales thing, this isn't even outside, but it works in knowing your lines in entertainment in a script. If you can say a script while doing something else, then you really know it.
Starting point is 01:35:58 So if you memorize to the point, like we would memorize a script, and then what we'd do is we'd play catch or be doing something else while reciting the script back and forth. If you can do that, it's second nature. Cause we, okay, so I'm talking and I'm trying to present an idea to you, right?
Starting point is 01:36:15 So I could be doing anything. I could be checking this pillow over here. I could, I can do anything I want. I can be taking a drink. I'd naturally pause. I know when to start talking again. But if it taking a drink, I naturally pause, I know when to start talking again, but if it were a script, you ever watch somebody on screen
Starting point is 01:36:31 and you know they're like, okay, now I have to take a drink. And you can see that happen. Or you try it yourself. That's because you don't know the dialogue that well that you just can't talk through it. Very good, that's really good. It's an amazing, and that's what the best actors can do.
Starting point is 01:36:44 They can memorize to a point where they just talk. Well, that's outstanding. I've not heard that before. So if you think, if you have to think, this is as an actor, and I would say this works in sales, too, if you have to think about what you're going to say, you've already lost the audience for that amount of time because they're going, wait, you ever see somebody do this? And it's not planned.
Starting point is 01:37:04 There can be moments where you're thinking of something, but you come out of character of it. Because when you're doing a presentation, you're kind of in your character mode of your presentation version of yourself. So if you come out, you gotta think, oh, what was I gonna say here? That part right there, all of a sudden they go,
Starting point is 01:37:21 he doesn't know he's talking. He doesn't know he's talking about it, or they don't trust what you're about to say being true. That's so interesting you say that. I just want to weave a line in here about it. One of the things I always say is that you need to know your lines or your presentation so well, and whatever you're doing, that it's reflexive.
Starting point is 01:37:36 Meaning you don't think you be, because under pressure, under pressure you respond reflexively to things, and if that reflexes you don't know it, you're screwed. I just interviewed Cesar Millan, the dog whisperer, and he said he's all about energy with these animals. It's just really interesting. I just did it this week. It'll come out before this comes out.
Starting point is 01:37:53 But what's interesting was he said that when he's in best influence with an animal is that he is being, not thinking, which is exactly what you just said. When you can relate, and it's a give and take with the audience or whatever you're doing, we're doing that right now. There isn't so much, we're not trying to keep to a script. Like one of the things you talked about before we came on here, it's like don't have part A, part B. It's not my turn to talk, it's your turn to talk. It's let's talk together because that's how people actually talk. In a script, if you don't have it memorized and you, if you don't have it memorized
Starting point is 01:38:25 and you're thinking, you don't have it memorized enough, you're like, okay, it's my turn to talk. That's already taking you out of the script. It's my turn to talk. You just have to hear it and go. And you're in the moment. I think that applies to all the sales, all presentations. So do I.
Starting point is 01:38:39 Everything you're doing, it's that don't think, don't plan, just be. Gosh, that's so good, because I'm really glad we're going there because I did something recently on listening that you were making reference to earlier. And the only way in sales or even in anything like this is the only way that you can really listen to somebody is that you are already very comfortable
Starting point is 01:38:59 with what you would say so that you don't have to be thinking about what you're gonna say back to somebody. You can actually just be present with the message. Don't plan what you're about to say back. Listen to don't have to be thinking about what you're gonna say back to somebody. You can actually just be present with them. Don't plan what you're about to say back. Listen to what they have to say. Because if I listen to what you have to say, my planned response may not fit what you just said. So if you do wanna get a message across,
Starting point is 01:39:18 make sure you address what the person just said first. This is not acting, this is, because you can't do that, you can't change the lines in acting if you're doing a script that's written. But in terms of sales, relationships in business, don't just have this next thing planned to say because that will turn a person off. It's like you're literally, if somebody uses,
Starting point is 01:39:40 if somebody uses a word, you using that as a connection, you lose a connection when somebody is doing something completely different and on a different plane than you. When they're going off and like, okay, but let's bring it back. It's not even that. It's, you asked me a question
Starting point is 01:39:59 and I start saying things about acting again, but it had nothing to do with it. Because I had this plan in my head, and that's not what people want. I watch that a lot in interviews on podcasts too. I'm like, oh, you didn't even hear what they just said. You just went back into the next thing you had on the list. Do you know, I'm curious,
Starting point is 01:40:16 do you know when you're losing an audience, and if you do know you're losing it, is there a way to change that? I mean, right now I lose an audience more because I go off on these tangents to try different things. But I'm almost purposely doing that. Speaking about that, I think that that's being willing to take some risks.
Starting point is 01:40:35 A hundred percent because I'm so scared. I got so happy with what I was doing with my act and the formulas in my act that I didn't take a lot of risks. And it became, you ever hear the story about stories and this is psychological I guess as well, somebody gets in a car accident, the best thing to do is get them in a car again. The longer you wait, the tougher it is to get them back
Starting point is 01:40:56 in that car because they'll build up, they'll make the event bigger and bigger and bigger. I was on stage doing the same stuff over and aboard with my act, not taking chances, not learning, not getting better, and it took its whole bit, but I was out there just making, again, that was the business side of me, not the art, or the relationship side of me, the relationship there
Starting point is 01:41:18 is between me and the audience, and doing a lot of the same stuff over and over. Now, I have to do certain things. If I don't do a John Mann impression, I don't do a Morgan Freeman, I don't do some of these impressions people know me for, people will get upset. I believe they're still an audience
Starting point is 01:41:32 and I'm gonna give them the things they want, but in different ways and lots of different stuff as well. Yeah, you go see the Rolling Stones, they better play Satisfaction, right? Or you're like, wait a minute, man, I mean, I want your new stuff. Anytime you go see a new band, you're like, I'll listen to some of your new stuff,
Starting point is 01:41:45 but I came to hear some of the old stuff. Right, I want to hear Billy Joel, I want to hear Piano Man. Exactly. If I don't get Piano Man, I'm sad. And he's an entertainer who gets it, and he'll do that. Even if you play the new stuff. And it's the art, it's the combination of art and business, which is what pretty much everything is.
