The Rich Roll Podcast - Everything Is A Story: Journalist Nick Bilton Thinks AI Might End Humanity & How Stories Could Save Us

Episode Date: April 13, 2026

Nick Bilton is a Special Correspondent for Vanity Fair, a New York Times bestselling author, and screenwriter. This conversation explores the power of story — how tech titans like Jobs, Dorsey, and Musk wield narrative as a weapon, and why AI may be the first technology capable of wiping us off the face of the planet. It also happens to come from someone currently writing the book and screenplay for Martin Scorsese's upcoming film starring Dwayne Johnson. He also pulls back the curtain on Silicon Valley's reality distortion field and how completely it can play you. Nick is a rare mind. This one is not to be missed. Enjoy!

Show notes + MORE
Watch on YouTube
Newsletter Sign-Up

Today's Sponsors:
Caraway Home: Save up to $190 on cookware sets + an additional 10% off with code RICHROLL 👉🏼 https://www.Carawayhome.com/RICHROLL
Rivian: Electric vehicles that keep the world adventurous forever 👉🏼 https://www.rivian.com
Squarespace: Use code RichRoll to save 10% off your first order of a website or domain 👉🏼 https://www.squarespace.com/RichRoll
WHOOP: The all-new WHOOP 5.0 is here! Get your first month FREE 👉🏼 https://www.join.whoop.com/Roll
Shokz: Visit SHOKZ.com and use code RICHROLL to receive an exclusive offer on your purchase 👉🏼 https://beopen.shokz.com/RichRoll-OpenFitPro

Check out all of the amazing discounts from our Sponsors 👉🏼 https://www.richroll.com/sponsors
Find out more about Voicing Change Media at https://www.voicingchange.media and follow us @voicingchange

Transcript
Starting point is 00:00:00 This episode is brought to you by Caraway. Now, if you're like me, you care deeply about what you put into your body. But the longer that I've been on this health journey, the more I've realized just how many toxic things end up in our home environment, not just microplastics and food additives, but in all kinds of consumer products, cosmetics, even our cookware. And this is where Caraway comes in. Caraway makes beautifully designed and elegant non-toxic cookware, free from PTFE, PFOA, and other questionable chemicals. And honestly, once you make the switch, it's pretty hard to go back. I've been using their cookware at home, and what stands out is how easy everything feels. Food releases cleanly.
Starting point is 00:00:47 The cleanup is fast. And they come in a bunch of cool colors to match your kitchen aesthetic. Caraway has over 100,000 five-star reviews for a reason, and now it's your turn. Their cook set is a favorite. It can save you up to $190 versus buying the items individually. Plus, if you visit carawayhome.com slash richroll, you can take an additional 10% off your next purchase. This deal is exclusive for our listeners, so visit carawayhome.com slash richroll or use code richroll at checkout. Caraway, non-toxic cookware made modern.
Starting point is 00:01:22 I'm not worried about AI destroying humanity. I'm worried about Sam Altman running an AI company that he will lead to destroy humanity. If this technology goes wrong, it can go quite wrong. Today we're speaking with Nick Bilton. Former New York Times columnist and Vanity Fair special correspondent. I was a reporter in Silicon Valley covering tech. And I do believe that the tech elite are evil. The fear that we're all going to be killed by AI is actually part of their fundraising.
Starting point is 00:01:54 You've got Sam and Elon and all these people out there being like, we're going to die. We need more money. And the bucket loads of cash come in. And it's all nonsense. They've crafted a certain public image tailored to what people seem to like. I also think that there's another thing. The biggest lesson I've learned in my life is that... Nick, it's so great to meet you.
Starting point is 00:02:17 Thank you for doing this. I'm so excited to talk to you. Thank you for having me. This is very, very exciting. You're somebody who I've been reading for so many years, you know, at least going back 10 or 11 years with your reporting in the New York Times. And you've covered Silicon Valley and the tech titans. And there is an obsession that we have with these billionaires, the technocratic class, these people who are, you know, lording over the devices and the apps that we're all using every single day. And these are people that you have spent time with who are also, for the most part, masterful storytellers and mythmakers.
Starting point is 00:03:00 What can you share about what you know about people like Steve Jobs or Jeff Bezos or Zuck or these people that is a common thread with respect to their myth-making superpowers that, you know, we'd all be kind of better off for understanding? Well, I do think that there is a through line. They're all obsessed with their own self-image, their legacy, and they are all obsessed with telling a story. And you can see that through one very simple metric, and that is how many people are on the communications teams of these companies and hundreds and hundreds and hundreds. Ironically, Elon fired all of his communications. folks because he could do it better, but in his mind. But they are always telling a story. And look, I mean, like, Elon's a perfect example. Let's take the boring company for a little, little storytelling lesson here, right? So he was driving to work one day. It was a bunch of traffic. He tweets, this is years ago. He tweets, there should be tunnels under the ground where you can, you know, you can drive so that there's no traffic in L.A. The next thing you know, he gets to the office at SpaceX, and he's like, the tweet's getting a million responses.
Starting point is 00:04:25 And he's like, let's start the boring company and we're going to build these tunnels. I went to SpaceX for a meeting a few months later. And there was a dirt patch across the street with one of these big tunnel boring machines. And I said to someone I was meeting with, I was like, what is that? And he's like, oh, it's just, it's like a little show Elon does to show that he's going to build these tunnels underground. It's never really going to happen, you know? And yeah, they've done some, but this was years ago, right? They never built, there are no tunnels under L.A.
Starting point is 00:04:57 I drove on the freeway to get here. There's, there is that sort of small network of tunnels under Vegas, right? Did he do that? Yeah, but it doesn't really seem to alleviate the traffic problem. It's sort of a novelty. It's also incredibly claustrophobic. Yeah. And people have panic attacks at them all the time.
Starting point is 00:05:12 But the point is, that's a great story, you know. And when you love it. look at these guys, they do this over and over and over again. And I think that they, and then they start to believe their own story. You know, I always, I got to spend, as a reporter covering tech, I spent time with all of them for very long periods of time. And there was always this, you know, you could see this thing happen. Like I knew Zuck and Jack Dorsey and all those guys and Kevin's sister, all of them when they were just starting out. And then there's this kind of shock of like, whoa, how'd that happen?
Starting point is 00:05:54 I'm a billionaire. And then there's this like, I'm a billionaire and I'm the one that makes the decisions and screw you. And it's and they just believe the they believe the story that gets told about them or the story they tell them about themselves. I mean, I wrote the book on Twitter and one of the greatest quotes someone gave me was the greatest brand, sorry, the greatest product that Jack Dorsey ever made was Jack Dorsey. And I think that sums up a lot of these people in Silicon Valley.
Starting point is 00:06:22 They create this product that is themselves, often taken from Steve Jobs. And his reality distortion field. His reality distortion field. And I have a funny story about that. And then they believe the story. And then sometimes the story breaks them, which is what happened to Bezos. So tell that job story because then I have a follow-up related to Jack and these other guys. So when I was first thing out starting out as a reporter at the times, I get a, I was writing a story. It was about the iPhone and something related to it. That's not important. And I was writing a story about it. And I called Apple PR. And I said, hey, I'm writing the story. And the PR woman says, give me.
Starting point is 00:07:16 me a second and then puts me on hold and says Steve's going to call you in a little bit. So the guy who ran the PR department was this guy named Steve Dowling. So I thought it was Steve Dowling who was going to call me. And instead, she was like, no, it's Steve Jobs. And I was like, Steve Jobs is going to call me? And, you know, I'm like a young reporter at the times. I'm like, is this how it works? So I hung up the phone and he didn't call. And I went out, I went out for dinner that night. And I get this weird 4-1-5, whatever it was, phone call. And I answered, had a couple of drinks.
Starting point is 00:07:55 And it was like, hey, it's Steve Jobs. And I was like, oh, shit. And so I like, go out, start him, trying to take notes on my phone. He was on the phone with me for 45 minutes an hour. He wouldn't get off. Wow. And he was telling me that my story was wrong and this, that, and the other, and so on. And I remember, I went in the next day and I wrote the story, the way I had been told by him.
Starting point is 00:08:14 And I published it. And then afterwards, I was like, wait, I wasn't wrong. You just got, you got snuckled. Completely played. And John Markov, who had been a tech reporter for many, many years at the time, said, he came over to me and he said one word. He said, not his three words. He said the Steve Jobs reality distortion feels, so four words.
Starting point is 00:08:34 And he said, I was like, what do you mean? He goes, he's what he does. He's like, he can change the way you feel about something. And it happened. I interacted with him a lot. And it would happen every time. And I got good at realizing, oh, the reality of its fortune feel is coming on. And then he stopped calling me because I wouldn't listen to him anymore.
Starting point is 00:08:53 But he, you know, he was a master at making you believe the thing that he wanted you to believe. And he and I think that people, you know, they looked at him a lot of these tech tight. like that's who I want to be when I grow up. And people, and what's fascinating from a storytelling perspective is they chose which part of him they wanted to tell is their story. So the assholes picked the asshole and made themselves more justifiable. The product people picked the product, you know, and I think, you know, I think he was largely more influential in the way these people think about themselves than anyone in even history, quite frankly. Yeah, these people who become billionaires and it has this warping effect on their
Starting point is 00:09:48 perspective on the world when they can literally, you know, basically create seismic shifts in society by dint of a simple decision and understanding that story is more important than truth and being able to wield a story that is in service to whatever that person wants people to believe, can literally change everybody's perspective on that person, dating all the way back to like Jack Dorsey's origin story with Twitter. I had a version of that, that kind of reality distortion experience. I had Jack on the podcast. This is very early on. I went to his house. I basically spent like the better part of a day with him. And I left that experience just enamored with this guy. And I was so charmed by his charisma and his warmth. And I thought, if anybody should be running Twitter,
Starting point is 00:10:39 it's this guy. This is a guy who's, like, created a world in which he carves out time to think deeply about these difficult decisions. And I just couldn't imagine, like, I couldn't understand why people were having such issues with him. Because I had been, you know, I had, like, gone into his environment and, you know, like, sort of marinated in the Jack Dorsey experience. experience and now looking at kind of like where Jack is now and these other people, you know, I sort of reflect back on that and was like, was I played? Was that genuine? I still don't know. I like to see the best in people and believe the best in people. But I can't imagine what it's like to be a billionaire who is wielding such unbelievable influence on the world. And I think what happens
Starting point is 00:11:29 is with that wealth and that power comes this sort of sense that, well, you were so successful here that you must be right about all these other things, like this galaxy brain kind of mentality that takes over that then spills into perspectives on politics and culture and all these areas that have nothing to do with being a successful entrepreneur necessarily. Well, you just summed up everyone in Silicon Valley. Like every single person who's successful who's a billionaire that I've ever met and I've met all, they all think, oh, I'm the billionaire who is this successful, who, you know, clearly knows what they're doing in this arena.
Starting point is 00:12:13 So I know what I'm doing in every single one. And I could cite a million instances where people, you know, I'm an, I'm the world's expert on real estate. I'm the world's expert on COVID-19. And it's like, and they, and then, you know, people, Like Elon, like, you know, Sam Harris is a good friend of mine, and he got into a debate with Elon about COVID-19 and Sam proved to be right. And Elon stopped talking to him. And it's, that's the mentality. If you're not in my yes man circle, then you're out. And so it becomes this reinforcing. Yeah, self-reinforcing cycle. And, you know, all the way to like, hey, is democracy really working? Like, maybe we should, you know, like go back to some kind of monarchistic society. Like when you see this dark enlightenment movement and these very influential people like
Starting point is 00:13:06 Naval Ravacant and like, you know, they kind of these like, you know, gurus of Silicon Valley who have such outsized influence as a result of becoming successful in technology that is deeply concerning to me because they do wield so much influence. And if, you know, these people say something online, like people pay attention and listen and, you know, kind of now here we are, you know, being lured into the web of, you know, kind of these, these, you know, crypto state recessionists, you know, where everything is in question and these people know better than any expert in their respective field.
Starting point is 00:13:48 Yeah, and I think that, you know, like Jack Dorsey is a perfect example because, you know, Jack, if you, as I researched with the book, he changed the story of how Twitter was founded hundreds of times, quite literally. Like it was like at first it was, he came up with it, you know, on a slide in a park with Bizstone. And then it was, oh, actually, I came up with it when I was a kid in my bedroom watching police scanners. Oh, actually, it was in Oakland when I was a babysitter with blue hair.
Starting point is 00:14:25 and it all changed. And the reality is he came up with one slither of the idea, as did other people specifically Noah Glass, as the book talks about. And Noah was written out of history. And the same thing is true with Square. Jack didn't come up with Square. A friend of his did, and he took the idea and so on and so forth. And we just saw it recently, you know, Square just laid off half of their workforce, almost
Starting point is 00:14:50 half of their workforce. And Jack cited it as, as A. Well, actually, if you go and look and you actually dig a little further than, oh, almighty tech guru is way ahead of the curve. The set, his employee base has doubled in size in the last few years for no reason other than it's not run well. And so the narrative was, let's get rid of half of the employees and say, we're doing because of AI. It's literally back to what it was a few years ago. It was just management, poor management decisions. And the stock went up.
Starting point is 00:15:25 because another example, look at the Tesla stock. Like Elon can talk about how he is going to have a million taxis, you know, in two years, and there's like 12 of them. Yeah. It doesn't matter if it's true or not. It never bears out just like the boring company story. Like he can just say, oh, we're going to have full self-driving by this day and he can move the stock price immediately. And every time you look through the rearview mirror six months.
Starting point is 00:15:55 later on things that he says. Like, these things never end up bearing out. But it doesn't matter. And this is mirrored in the Trump administration. Like, it's just rhetoric. Yeah. And just move on. And there never seems to be any kind of reconciliation or accounting for these statements. What I think is so. Even like self, like a normal person would think, yeah, I say these crazy things and they never happen. Maybe I should reflect a little bit on my relationship with truth and veracity. And like, what does that say about me? people don't do that. It doesn't matter. They don't care. The thing that I think is so shocking to me, and I do not understand it. I don't know if you have an answer. I don't understand. We are in a world where for the first time in human history, you can have a conversation with machines that can tell
Starting point is 00:16:46 you what is right and what is wrong. And for the most part, with the exception of hallucinations and so on and so forth, they can actually detail it. So when you go and ask, you know, Claude or Gemini or Chatt, whatever you talk to about the history of, of Squares, employees and the number and so on and so forth and revenues and blah, blah, blah, it will show you. But no one asks these questions. If you go and ask, is it realistic that Tesla is going to have a million self-driving cars in this period of time, or humanoid robots in your house and so on and so forth? Like, we, But no one wants to do that. They just say, oh, that's it, quick.
Starting point is 00:17:25 And the stock goes flying through the roof. And everyone's like, all hail Elon. And I think it's like, and Mark Zuckerberg, too, with social media and it's just to me, what I don't understand is why don't people ask questions? Why do they just take it as gospel? I think it has something to do with the power of storytelling. Like take Zuck, these people who are so image conscious, they've crafted. a certain public image tailored to what people seem to like that then becomes, you know,
Starting point is 00:17:59 like enamoring to people. Like for some reason, and maybe this is related to like monarchism, like, oh, a leader that I can look up to and I believe in and will fuel my cognitive dissonance. It doesn't matter what's true. Like, I like that guy. And for some reason, my affinity for this person makes me feel like I'm in proximity to them. I also think, I think you're completely right. I also think that there's another thing.
Starting point is 00:18:28 And if I were to say the biggest lesson I've learned in my life as a professional over the last 30 years, is that people are just people. You know, they all get headaches and they all have anxiety and they all want to be loved. And they all want to believe that they're good people doing the right thing. and acting, you know, acting on the half of the betterment of society. And they want to believe that. And I think that what ends, what happens is because of media, everyone, people become more and more famous and so on and so forth. And we, we look at these people and we're like, like, it's why people get so nervous when
Starting point is 00:19:09 they meet celebrities or they meet, you know, leaders. They're just people. Like, it's, I never get nervous when I go and interview someone or when, or because I'm like, you're just a person. You probably woke up this morning and you didn't sleep well last night because your dog was sick and your kids. All these things that we all experienced, but they make it look like it doesn't happen. And I think that so they create this bullshit narrative around who they are and what they are and that they're going to live forever or that because I'm a billionaire. I'm an expert on everything.
Starting point is 00:19:41 And it's all nonsense. And I think once we get to a point where people realize it's nonsense, maybe that's when it changes. but I don't know if people want to believe that it's nonsense. Yeah, I don't see that happening. I think that they want it. They believe that these are the gods. Right. And now we have AI.
Starting point is 00:19:56 So take open AI, for example, the storytelling around that and the cognitive dissonance with, you know, the titans that are at the helm of these respective AI companies who are obviously spinning a yarn and doing some very advanced storytelling around how we should think about this as they, you know, kind of accumulate power. and, you know, are participating in this race towards AGI, et cetera. Well, what's fascinating about the storytelling around AI is two things. One is the fear that we're all going to be killed by AI is actually true. Like, I genuinely worry about it.
Starting point is 00:20:34 But it's also part of their fundraising. And so, you know, you've got Sam and Elon and all these people out there being like, we're going to die. We need more money to make sure we don't. And or, you know, and the bucket loads of cash come in, it's all for them. It's about, it's a fundraising mechanism. And the other thing is, someone said to me, I did a storage of Vanity Fair a couple years ago about AI and creativity and how it was going to replace jobs and so on and so forth. And one person said to me, a very clever thinker on this stuff said, I'm not worried about AI destroying humanity.
Starting point is 00:21:11 I'm worried about Sam Altman running an AI company that he will. lead to destroy humanity or someone else. And the reason for that is because they are so obsessed with being first in the story of, you know, of the first person to create AGI that they put all the other things aside. And the goal is it's about them. It's not about the AI. It's about them as the leader of the AI company. And it's what's crazy is, you know, Rodin Hatching Twitter, this was a white box on a screen that you could type 140 characters into. Look what it did to the world. It changed.
Starting point is 00:21:50 It's the reason Donald Trump is in office. He ran for office many, many, many times before. It's the reason that we have all of the culture we live in today. And that was just a white box with 140 characters. Now there's this new box that we can type into that is way more powerful. And we're all just doing it without thinking about what happened. the last time and all these people don't care because they want to be famous and mention the creator of the last invention.
Starting point is 00:22:21 There's lip service to safety considerations, et cetera, but as far as I can tell, there doesn't seem to be a lot of evidence to suggest that any real efforts or deep work is going into ensuring that these things have guardrails on them and that they are in the public interest. It's just this race forward based on fear and lack. Like, we have to do it before China, et cetera. It's important. I'm the one who's in charge. You should give your money to me.
Starting point is 00:22:52 You should trust me. We have no reason to trust any of these people. And all the evidence suggests that this is going to damage us in ways that we can't even possibly begin to imagine. And yet, it's so convenient and helpful in the short term and entertaining in an advanced way in comparison to Twitter and 140 characters allowed us to kind of be lulled into this world of social media that, you know, I think it's pretty clear to everybody right now has deranged us in ways we couldn't have imagined when, you know, you wrote hatching Twitter. It's deranged us in so many ways. And it's, you see it with, um, uh, you see it with podcasts,
Starting point is 00:23:35 funnily enough, you know, like, Tucker. Because the incentive structure. The incentive structure, like Tucker Carlson, like the, the guests have gotten, Crazier and crazier, Megan Kelly, the anger about, you know, fighting with people that she was once close with or Ben Shapiro doing the same thing. It's like they're all, they all just come at each other. And then, and then the, you know, the Pod Save America guys, they like go even deeper on the left because it's, the incentive structure is the more insane and intense and scary and and the more I pull back the covers on the real thing, the more views I get and the more crazy we get as society and it just, it just is this spiral. I think about this all the time.
Starting point is 00:24:19 You know, yeah, podcasting has become all about like they're lying to you or the number one expert who's going to tell you the thing no one wants you to know. And unless you're platforming, just, you know, kind of conspiracy addled people or the person with the hottest, craziest take, the extremism in, you know, across the board on all sides of the spectrum, you have no chance at garnering eyeballs and getting attention. And the incentives, you know, the incentive structure is such that like if you, if you want that, then this is what you need to do and that's the dollars follow. And as somebody who's been doing this show for a very long time, you know, from the very early days into now, it's like, where do we even sit with this anymore?
Starting point is 00:25:04 You know, like, there's a reason why I do this show. There's a purpose and a meaning behind it that I care about. And I see these other people and, and how, like these other podcasts that have gotten really huge. And they're influencing the next generation of podcasters. And it has nothing to do with like asking yourself, like, why am I doing this in the first place? What am I hoping to accomplish? It's just like, here's how you get big. Here's how you get attention.
Starting point is 00:25:29 It's like attention for what aim, for what purpose. Couldn't agree with you more. And it's like, you know, I thought, I always thought that the debate about whether Hitler was a good person or a bad person wasn't a debate. It's like, bad guy, you know. This is where we're at. And, you know, from 140 characters to this and people like, well, maybe we should have that conversation. Yeah, Joe Rogan saying, oh, that's interesting about Hitler. It's like, it's, you're like, what are you talking about?
Starting point is 00:25:56 It's like where, how are we, how are these things even a debate? Like, how are they even a conversation? Like there are way more important things that we should be talking about that are real. And it all comes down to the incentive structure. And this, to me, is where, you know, I do believe that the tech elite are evil, quite honestly. I think that they know exactly what these things do. you know, Facebook knows there's like so much that has leaked about things that they know that are bad for kids and for society. And we, and the same with Instagram and Twitter and all these social platforms. And yet and TikTok and they do nothing to try to stop it. Nothing. And to me, look, I don't know if there is a God in the universe or there's something. But if there is, they're not going through those pearly gates. the things that they have done with intent and the unintentional but then done nothing about it,
Starting point is 00:27:07 you know, look, I mean, look at Snap, for example. There was a period of time that Snap was, you know, the whole point was to, so people could have disappearing messages and they wouldn't. It was a great intention, right? We want to make it so that you're not posting things on the internet that can come back and haunt you later. And then, fast forward to a year ago. the number one cause of death among teens in the United States is fentanyl.
Starting point is 00:27:31 You know one of the top places they buy it? Snap. And so and there had to be a congressional hearing for them to even make any changes. Like these things, they're unintended consequences always, but our responsibility is to fix those. We never know what they are with technology. And at the same time, there are consequences we know exist. One example is kids should not be using social media like this. It is just so unhealthy.
Starting point is 00:27:57 I don't have it on my phone because I can't stop looking at it. And I'm a grown adult who's written about it for 20-something years. And these companies that hide all that data, I just think it's completely and utterly evil. As some of you know, I am in a very different season of training than I've ever been in before. I'm rebuilding slowly, intentionally after this spinal fusion surgery that I underwent this past May. And I'm learning what it means to be patient with my fitness and how to prioritize sustainability over intensity. And I got to say that whoop, specifically my new whoop 4.0 wearable, has been this just enormously helpful companion in this process. It's a screenless, wearable health and fitness coach that gives me personalized insights into my sleep, into my recovery, my strain, and my overall health, helping me to really,
Starting point is 00:28:54 understand what my body is actually ready for on any given day. And that awareness is what is helping me really stay focused and consistent, which is essentially everything right now. I do have some meaningful goals ahead. I am very intentional about getting back to pain-free running and hopefully lining up for the New York City Marathon to celebrate my 60th birthday in the fall. And WOOP is helping me make the best decisions that are moving me the most expert expeditiously forward toward those moments with greater results and intention.
Starting point is 00:29:30 So I would suggest that you check it out. Go to join.wop.com slash roll for one month free of whoop. Here is the dilemma. When you choose headphones, you usually have to decide. Do you want to be fully immersed in what you're listening to? Or do you need to stay aware of what's going on around you? Well, most earbuds force you into one camp or the other. But Shocks has figured out how to bridge that gap with a new OpenFit Pro. It's their first open-ear headphone with open-ear noise reduction. What does that mean? That means you can actually focus on your podcast or your music without being completely sealed off from the world.
Starting point is 00:30:13 If you're running or riding a bike, you get that situational awareness that actually matters for safety. They're super comfortable. They've got incredible battery life up to 50 hours with the case. And crucially, the sound is just superior because it's optimized for Dolby Atmos and powered by this tech called Shocks Super Boost that provides really dynamic distortion-free audio. And for even more options, all Shocks headphones are worth checking out. Visit Shox.com and use code RichRole to receive an exclusive offer on your purchase. You're somebody who, when you put a book out, I read it right away. I think people would be surprised that I actually don't read that much unless it's a book written by somebody who's coming on the podcast.
Starting point is 00:31:05 Like that monopolizes most of my reading and it kind of crowds out the opportunity for pleasure reading. But both of your books I read like immediately when they came out and you have this unbelievable talent for like novelizing nonfiction and and, you know, creating these these books that that read like three. The book on Twitter and the Russell Albrecht book about the Silk Road. We're just amazing works. And you're somebody who has a lot of irons in the fire. You do all of it. Like you make documentaries, you produce, you write, you've done scripted television, you do long-form journalism, written for Vanity Fair.
Starting point is 00:31:46 You're even in the process of writing the book and the screenplay for this Martin Scorsese movie with the rock about the Hawaiian mafia. This has to be perhaps like the most high profile project that you've ever worked on. Yeah, I mean, I definitely, I have ADHD, if you haven't noticed that. But I, and I've kind of used it as a superpower to just do all different kinds of, there's a theme, which is storytelling, of course. And then there's obviously some subtext to it, which is somewhat technology, but not necessarily. technology is in devices because, you know, the mafia project you mentioned is set in the 1970s. But, but yeah, I mean, I just, I love to tell stories and I just, I love to take on a new
Starting point is 00:32:36 challenge. And I don't say that like trying to be hyperbolic or anyway. But if I've never done something before, to me, that is the most exciting thing because I'm going to figure out how to do it and I'm going to do it differently because I'm not doing the traditional route. And so, yeah, I've done journalism and podcasts and movies and docs and script. TV, all of it. And it's, it's just my ADHD at work. I mean, I have ADHD. I'm, I can only do one thing at a time. Yeah, like, how does that operate for you? It's funny. My wife came in the other day into my office and I had two laptops open and I was like working on them. She's like, what are you doing? And I was like, and she took a picture and sent it to a bunch of
Starting point is 00:33:17 friends. But I was like, I can work on two projects at once. And she was just like, you're out of your mind. Yeah, that's interesting. Mine works in the opposite direction. I can do very well if I'm just put blinders on and focus on one thing. And my wife is somebody who's doing lots of things at the same time and I'm just always marveling at that. Like I don't understand how that works. Can you get procrastinate? Um, I, I do, but I can lock into a flow state pretty easily if I'm super into the project that I'm working on and I can lose hours and hours and hours. And I really don't want any interference. Like, I just want to be, like, I want the world to be, to disappear and just to be immersed in whatever I'm working on. But I can't, like, gear shift very easily. Yeah, I gear shift.
Starting point is 00:34:03 I don't procrastinate. Not a, not a procrastinating. Yeah, you don't seem like somebody who has bone in my body. No, I do not have. Is it blocked something that ever, how do you think about that? No, I mean, well, I think in the age of AI, writer's block doesn't exist anymore. It's, it's gone. There's no longer a blank page because you've always got something to help you. But, But I, yeah, I just, if I find my, if I find the, the gear starting to kind of get a little bored and I'm, you know, I just am like, okay, next project and move that over and work on that. And one of the things I learned interestingly, I think it's, look, I don't know what we learn and what we have innate and what we find and so on. But I remember at the times when I was a technology columnist, I would be on deadlines sometimes and I'd have to like run home or I'd have to run into the office.
Starting point is 00:34:56 And I started writing columns on my phone on the subway or something like that. And then I would finish them. And you just kind of, I just learned to just, I could do, I could write anywhere. You know, it just doesn't matter. I'm just kind of zoned in and pick it up on something else. And I just kind of parlayed that into all the different kinds of writing I do on all the different projects. You didn't start out with the idea of becoming a writer, did you? I mean, your origin story into the New York Times is super interesting.
Starting point is 00:35:28 I think I'm probably one of the least likely people to have ended up as a columnist of the New York Times when I was there. I, like, there was a world I was going to end up in jail, quite honestly. I mean, it's literally like I grew up in England. My parents got divorced and I moved to Florida and immediately got in with like a lot of bad kids. And my dad was off doing his own thing. And it just was there was a world that was going a completely different direction. And I had a moment where I'd run away from home. I was a teenager.
Starting point is 00:36:06 And I was working at Jack's Burgers in the wall in Florida. And I feel like this was literally the defining moment. moment of my life, quite honestly, was I walked outside of Jack's Burgers. I was taking the trash out and there was a homeless guy going to the trash and waiting for the next trash back to and I literally was like, wow, that could be me if I don't pull my shit together. And you're all of 17 or something like that? 16 years old, 16, 17 years old. And I walked back inside and I was like, you know what? I'm done. I'm going a different direction. And that was it. And I like stopped talking to all the kids, most of who are in jail or dead now that I knew in Florida. And I was like,
Starting point is 00:36:42 to figure it out and my GPA was a 1.9 in high school you have to have a 2.0 to graduate and i was like how do i get i literally was like i've got four months to to get from a uh uh a 1.9 to a 2.0 i got to a 2.1 so you know by the time you graduated i mean what was going on at home what kind of chaos was happening that you were you know having such a difficult time it was just total chaos i i i you know parents were divorced they got married too young My dad was off dating and, you know, didn't want to deal with me and my sisters. And, and we, we were a lot. I was a lot.
Starting point is 00:37:21 I was getting in trouble a little time. I got arrested like nine times. It was like, it was, you know, not for like anything crazy, like stealing and drinking and fighting and, you know, complete nonsense. But still, like. Was drugs and alcohol part of that or just truancy and just. I was never. I was lucky that I never got into drugs and I never have.
Starting point is 00:37:41 and I just saw what they could do. But I drank a little bit. You know, it wasn't, it wasn't, it was more just, I just probably was looking for attention or direction or something. And, you know, there was a culture shock of moving from England to Florida. And then you had all these, I remember all these kids that I became friends with and they were similar. And like they lived in New York, their parents got divorced or whatever. And they got in trouble in New York. And their parents were like, we're moving to Florida.
Starting point is 00:38:16 And so you get like this pirate ship that forms. And I got on the pirate ship. But you have this moment of reckoning. I have this moment of reckoning, which 30 years later is still, oh, whatever, however many years it is, still literally fresh in my mind like it happened yesterday. And I was like, I've got to figure this out. Otherwise, I'm screwed. And I mean, there are kids that there are men now, but who I was very close with. I think, like, eight of them are dead.
Starting point is 00:38:50 One of them's on death row for murder. A bunch of them, you know, just one got, one was arrested for bank robbery with his mom. Like these were the people I was friends with. It's a crazy sliding door story. Totally. I mean, because your success is. so insane and, you know, in comparison to where you could have been. I mean, that really is, you know, this, this sort of inciting incident, like a real transformation as a result of
Starting point is 00:39:19 just you coming to it yourself, like not because you got locked out. It wasn't anything all that dramatic, right? It was just like a mindset switch. Well, I think what's fascinating about it is two things is one is, I think we all have an imposter syndrome. And I've, and I've always been of the mentality of, you know, the people who do have that are the ones that work harder and that appreciate it more and so on, the ones that feel like, oh, I deserve to be here. They often don't work harder. They don't push. They're just sort of like, this is, this is where I'm meant to be. So I've always, every time I've ended up in all these different roles as a director on a documentary for HBO, writing a movie from Martin Scorsese, as a calmness for the New York Times,
Starting point is 00:40:02 writing stories that are breaking, you know, changing laws or leading congressional hearings, things like that. Every time I'm like, whoa, like, how did I end up here? And, and, and it's, the mentality I always had was someone's going to tap me on the shoulder and be like, I'm going to go. And so I've got to do all this stuff that I get, I want to do until that happens. And it's always been this kind of driving force. But I also, the other thing is, like, when I think about that homeless guy, like, that homeless guy, it's probably dead, had no idea that he impacted someone else's life the way he did. And it always makes me think that every single solitary thing we do, we just don't realize, but it's all filtering out. It has meaning. Yeah, it has meaning. It has
Starting point is 00:40:48 purpose. And, you know, whether it's up to us or somebody else, it does. So you get this 2.0. I get 2.1. 2.1. What do you do with that? So I got into, the only school I could get into was an art school. It was a school of visual arts in Savannah Georgia. They had a campus. And I didn't, I mean, I literally couldn't write very well. I was, because I hadn't really studied in school. And I, so I made a comic book, uh, of my life story. And that was my college application. And they felt it was creative enough. And that was something I was, I was always very creative. Um, I was lucky to have that. And so it got me in to art school. Um, and, um, eventually I was spent a year in Georgia and then I transferred to the New York campus. And, um, and, um,
Starting point is 00:41:36 And I kept the ADHD clicked in. So I kept switching majors. And so I was like, I wanted to be a fine artist and learn how to paint photo realistically. And that took like, it's like six months. And I was like, oh, am I just going to do this for the rest of my life? And then I found graphic design. I was like, oh, my God, this is amazing. And then I started reading a lot.
Starting point is 00:41:57 Like, I fell in love with books. And my mom always read. I always remember when I was a kid, she was always, if she was like blowdrying her hair, she'd have a book on her lap. And I just started, I was a voracious reader, but I still never wanted to write. I ended up as an art director. I designed the very first Britney Spears doll. Like I just bounced around to all these different things and eventually became an art
Starting point is 00:42:20 director at the New York Press and then the New York Times. And the goal that I had, I'd read all these war photography books and these war books. And I was like, I'm going to be a war photographer. That's it. So that was my goal to get to the New York Times. So I get to the New York Times and there's two things that happen. The first thing is I end up in the business section as the art director, but because I knew tech so well and I'd always understood tech, it was just this thing that I could understand. I was always suggesting story ideas and like arguing with David Pogue that his thesis was wrong or whatever.
Starting point is 00:42:53 And so I also became good friends with a lot of the editors there because I would help them fix their iPhones and their iPads and stuff because they didn't know how to. And at the same time, I'm at the Times. And my goal is like, I have this portfolio and I want to get it to the photo editor, this woman Michelle McNally. And then I'm going to go off and pursue my dream of being a war photographer. I was interning for the printer for James Noctaway. Do you know who James Noctaway is? Is the most unbelievable war photographer of all time. his photos are just breathtaking.
Starting point is 00:43:32 And he had all these printers, and I interned for one of them. And I, you know, this was it. This was the goal. Yeah. So one day I finally get Michelle McNally to go out for lunch with me. And I sit there and I give her my portfolio and she opens it up. And it was this big orange book. And she flips through.
Starting point is 00:43:49 She doesn't say a word. And I'm watching. And she closes the book. And she goes, you're a good photographer. And she's like, you know, you'd probably be a good war photographer. She goes, but I'm not hiring you to be a war photographer. And I was like, why not? And she goes, because you're not fucked up enough.
Starting point is 00:44:04 And I was like, what do you mean? She goes, all of these guys and these women that do this, she's like, they're adrenaline junkies. Most of them are alcoholics or drug addicts. They or they need the adrenaline. They can't live in normal society. And she's like, you know, they live on, some of them have beautiful houses and they sleep on their floor.
Starting point is 00:44:22 You know, it's like, because that's where they're comfortable. And after that, I was like, okay, well, what's next? and that was when I became a writer. What's interesting about that is you could have then lobbied her by, you know, making your case for how chaotic your upbringing was and how fucked up you truly are. No, I'm fucked up. You're like, you don't know. I got a, you know, I had a one point, whatever, and I, you know, got arrested nine times as a kid.
Starting point is 00:44:48 I'm plenty fucked up. Yeah, I just, I think that I, I'm the kind of person who, I go off my intuition and my intuition told me maybe she's right maybe she's right yeah and then you kind of exploit this white space because this is what is this the early 2000s when yeah this is the online aspect of journalism is brand new yeah so it's early 2000s and I um we had the New York Times we had a research lab um that was up on the 27th floor I believe and the goal of it was like five, six people and the goal was to build prototypes for what the future of journalism might look like to then inform the newsroom and so on. And so it was 2005 or 2006.
Starting point is 00:45:40 And no, sorry, it was a little, it was a little late. It was about 2009. And the, we had an idea that there was going to be something like the iPad coming out. So we built like, we took a screen apart and made it look like a touchscreen that you could interact with the news on and so on and so forth. And I started doing a lot of public speaking about the future media and things. And again, with imposter syndrome, being like, why did they invite me? And I remember I got a job offer to go work at Google in the Google News Group. And I went out for lunch with the editor of the business section at the time, so I'd become close with. And I told him, oh, I'm going to leave. And he said, you know, it's a shame.
Starting point is 00:46:25 We can't keep people like you. And he's like, you know, all of the people at the paper, he's like, they're all brilliant and they're geniuses, but none of them are interested in tech. All they want to do is write for the print paper. And I said, what are you guys doing with that tech blog, the Bits blog, by the way? And he said, well, no one wants to write for it. We get one post a month. And I said, I don't even know why I said it.
Starting point is 00:46:48 It was literally like a puppeteer came and made my mouth work. And I was like, I would do it. and he said, really? Would you, like, would you really be interested in doing it?
Starting point is 00:46:58 And I was like, sure. And he goes, why don't you send me a proposal of what you would do? So I sent it to him thinking it was just
Starting point is 00:47:04 going to go to him. He sent it to the entire tech department. And I kind of pissed them off because I'd said how bad their coverage was on the blog. And that deal, they said,
Starting point is 00:47:13 let's, let's try it for a month. And that's how it all started. I was 33 years old. The first time I'd written a professional word and it was of the New York Times. It's so unusual.
Starting point is 00:47:23 usual. Like what an unlikely story for the New York Times to basically greenlight this guy who actually was not a writer to be a writer for the most prestigious publication in the world simply because nobody else wanted to do it and it was sort of treated like this bastard stepchild, which is all the more insane when you reflect on how tech obsessed we are. Like people just can't get enough of tech journalism. I think it was that, you know, I remember I remember this really funny moment when I first became a reporter. And you go to this at the times, you go to this, it's like an onboarding. Even if you've been an employee in this department and you go to this one, you still go through it. But when you first start, so I was now starting officially as a
Starting point is 00:48:09 reporter. And I, you go and you go to someone on the Mastead's house for like a dinner and and you meet all the people that have been there, their entire careers and lives and someone and you get to ask some questions. But everyone goes around the room and they kind of introduce themselves at one of them. And, and it's, you know, I'm Bob, so-and-so, and I graduated Magna Cum Laude from Harvard. And it got to me. It's like, what do I say? I got kicked out of art school.
Starting point is 00:48:39 And it's, it's, I think what it was, there was no scenario five years in the future or two years in the past that they would have ever hired me, but they were in a situation where they couldn't hire people that wanted to, you know, those people were going and working on starting their own blogs. And the other people were, I only want to be on the front page of the New York Times. I don't care about the internet. And so I just found this slither of a doorway that I could squeeze through by accident. And then it turned out I was good at it. I knew how to do it for some reason. And that was it. So, when, When people come to you or young people come to you and ask for career advice or how do I get into this, like, how do you think about, you know, how you were able to, you know, get those third doors open and, you know, create opportunities for yourself that's translatable for the younger generation who's thinking about, you know, a career path similar to yours.
Starting point is 00:49:39 Well, I think, I think the third, there's two things. One is, I would say, you have to, you have to be somewhat. fearless and willing to make mistakes. Like, I don't, I truly do not, this is like, honestly, my superpower. I do not give a shit what people think of me. Like, I care. I want, I'm a nice person. I want them to know that I'm a good person. But like, if they don't like my writing, I don't care. If they don't like that story, I don't care. Like, I'm just my- How do you square that with the imposter syndrome? The imposter syndrome is I'm not supposed to be here. They, I don't care is, is, is, a different part of it. Like,
Starting point is 00:50:20 the imposter syndrome is someone's going to tap me on the shoulder and say, it's time for you to leave. You're in the wrong building. And the, the I don't care is, there'll be another call next week. I'll find another building or, yeah,
Starting point is 00:50:33 or it's more of like, you know, I watch people who want to be writers and they have one idea. And that's it. And it's, and they think it's like, that's the idea. That is never going to happen.
Starting point is 00:50:47 You know, like, You have to, everyone I know who's a successful writer in Hollywood has 50 things they're working on at once. You know, 50 different ideas. And one of them, if they're very, very, very lucky, will get made. And you have, and I think that, and you also, as a journalist, you can't obsess about, or even an author that this is, it's, it's perfect. There's no such thing as perfect. There's 90% done and you can spend the rest of your life on the next 10%. And for me, I'm okay with 90%. Like, okay, let's get it out and we'll do the next one.
Starting point is 00:51:24 We'll do the next one. We'll do the next one. And my goal is to just get better and learn more and put these things out out there in the world. And I think that they're two very different things. Does that make sense? Yeah, I understand. So not being so precious about your work, holding it loosely, having a healthy relationship with expectations around like what's going to happen with it or how people receive it.
Starting point is 00:51:46 And I suspect that that indoctrination had a lot to do with the fact that in your early columnist days, like, you just had to be churned. You were churning out like so much, so many articles. You just had to do it and move on and like not get too caught up and, you know, making it absolutely, you know, turning these things into jewel boxes. Yeah, that's exactly right. I remember there was an editor who told me at one point when I became a columnist. he said there is a piece of there's a column of newsprint every Thursday that is going to have your name on it and the 1,200 words you write and if you don't file it, you won't have a job and that was it.
Starting point is 00:52:29 And so I had to literally ensure, and it wasn't just that, it was writing stories and so on and so forth. And so I did the math at some point. It was well over a million and a half words I'd written. Wow. And I barely remember any of it. Like it's, you know, I remember a few stories here and there or some themes and things like that, but they're just, you just got to write it and you publish it and you move
Starting point is 00:52:53 on to the next one. And sometimes you get beat up for it and sometimes people love it and you just, and that's just it. And I think that the advice I always give to people is two things is one is you're going to get a better education in writing from reading than you are from someone explaining the structure of, you know, Hemingway's first opening page and his repetition of words and all these things like that. You're going to get a better education from reading and understanding. And that's literally how I've learned how to write in every single form. And, you know, people like Corrin McCarthy
Starting point is 00:53:32 said the same thing. Like the writing is reading and iterating on what people have done before. you. And so my books are, they read like novels, even though they're narrative nonfiction, because I love novel. That's all I read is novels. And so I want to tell a story. And then they also kind of read a little like movies because I love reading screenplays and writing screenplays. And so, but that's all, I didn't go to school for writing. You know, I literally got kicked out of art school. Like, you know, it's. But you have a visual mind. Like, when you're, I mean, when you're reading your books, it's like, oh, you just see the movie, which made me think, like, how come these books haven't been turned into movies yet? They must be, they must have gone
Starting point is 00:54:15 through. They've gone through lots of, 20, you know, cycles of development at this point. But how come they're not up on the screen yet? Well, it's just, I think, you know, it's a longer conversation about Hollywood and how broken that industry is. Um, but, um, but yeah, I, I, I, I actually just learned, I have a two, two boys that are nine and 10 and one of them has, uh, really severe dyslexia. And my wife was helping him with some of the phonetics of reading. I can't do phonetics. I learned that because I couldn't, when she was saying things, I was like, that's how you do it. And what I realized is I also have some form of dyslexia. And what I, I don't imagine words in my head. I can't picture a word, but I can picture an image and then I can
Starting point is 00:55:05 describe the image. So that's just the way my brain works. And so that's how I write. So I imagine the movie and then I tell you what's happening in the movie or I imagine the scene. It's all visual and then it's put into language. As a writer, your talent is really storytelling, like this reverence for how to tell a story well, which is reflected in the books that you write, but also is mirrored in these tech moguls that you've done these deep dive profiles on and their relationship with storytelling. So I want to spend a few minutes talking about how you think about storytelling and it's importance and how we should all sort of be thinking a little bit more in depth about storytelling and how it operates in our own lives. Well, I think everything we do is a story, right? I'm telling you your story.
Starting point is 00:55:57 You're telling me a story based on what you're wearing, what I drive, where I live, the way I talk to people, my everything we do is a story. And there are different ways to approach it in different mediums you tell the story differently. Like one of the things I find fascinating about nonfiction versus fiction, even if it's a short story or a news article, is like in a news article you have what's called The Lead, which is your way into the story. Then you have the nut graph,
Starting point is 00:56:27 which is telling you, that's the second paragraph, which tells you what the whole story is about. And then at the end, you have what's called the kicker, which is the best part of the story, right? It's the part that you leave with.
Starting point is 00:56:37 In fiction, it's completely the way around. You start with the best part of your story, and then you kind of go through it, and at the very end, you kind of tell us what it's about and so on. And I think that with all different kinds of storytelling, you have to approach it differently. One of the things that I've found the most challenging as a writer is screenwriting.
Starting point is 00:57:00 I think it's the hardest form of writing there is. There's no more difficult form because you can't use exposition. You can't. Every scene has to ask a question that another scene answers then ask the question and so on and so forth. Every character is trying to get in the other's way. One character, every voice has to sound distinct and different. There's all these rules and you're showing, you can never tell. You can never tell someone's interior and what they're thinking.
Starting point is 00:57:30 And so I find it all very fascinating how the different forms of storytelling work for our brains to be able to kind of understand it. And that to me is one of the most fun parts of moving between all these different projects. Yeah, in screenwriting, every line of dialogue, every setup, every kind of. Every kind of slug line has to reveal something about character and advance the story and illustrate the themes. And you have to do it with such incredible economy. Like it has to be distilled down to its very essence. Yeah.
Starting point is 00:58:08 So the Godfather is one of the best examples because the theme is about family. The characters are all very, very different. And what's so incredible about the godfather And why it's cited as one of the best movies is you've got the godfather, right? And then you've got his four kids. And each kid is a different facet of the godfather's personality, which are all, you know, one's a goofball, the other is holding it together. The other things he's cocky, he's in charge. And there's the love and so on and so forth.
Starting point is 00:58:43 And each character drives the forward, the story. So it's all those things that are happening. and if you have one single line of dialogue that isn't right, your entire feeling about these characters changes. And everyone has to have agency and there's obstacles. It's a really, really challenging form of writing. And I think it's why there are so few good movies in Hollywood, quite honestly. Fewer and fewer.
Starting point is 00:59:13 Fewer and fewer. And I also think the other thing I would say is, I don't think there are a lot of good writers out there, great writers out there. And that's not a diss on society or people or whatever. I just think that there are, it is such a difficult thing to do really, really, really well, that there are only certain brains that can do it. Cornic McCarthy, for example. You couldn't teach what that guy does.
Starting point is 00:59:42 You cannot teach it. And he didn't even, when you do read the interviews and the view that he did, He always said, like, I don't know where it comes from. It's like, and I think this is some of the greatest artists say this. Like Chris Martin had this great line in a documentary where he said, this song came to me from wherever songs come from. And I, and Carmen McCarthy was like, there's something in my subconscious that like, that told me to do this.
Starting point is 01:00:07 And I think it's the people who are the greatest artists, I think, are the ones that are most in tune with that. And it doesn't, it doesn't mean. the people that aren't shouldn't right, they should do the things they want to do. But I think, like, the greats or there's so few of them. Getting out of their own way, opening this channel to the subconscious. And, you know, and kind of sitting in this space of allowing it for it to show up and flow from hand to page. Yeah. And Rick Rubin talks about this, about how, about that specifically. I think it's, I believe that, you know, I still don't.
Starting point is 01:00:47 don't know if I believe that there's a point to all this or we're just some little accident or simulation or whatever, but I do believe that there is something in the universe that that makes these things, these stories happen and whether it's a collective consciousness and it, you know, some people have a little pinprick doorway, but I do truly believe that that is really where a lot of this art comes from. Well, despite the fact that movies seem to be, not so good these days. We're not going to ever reach a point where we lose our appetite for great storytelling. But it is an interesting cultural moment in that, you know, we're kind of in this period of time in which we've lost our reverence for the great novel. Like, this is something
Starting point is 01:01:40 I talked to, I had James Frye on here. We talked about this. Do you know Bruce Wagner? I had him on, and writes these incredible transgressive novels that are really just fantastic. And it's just like nobody's reading these books. You know, we're not in that era that, you know, you and I were probably around the same age. Like when, you know, these people were like rock stars, you know, and everybody couldn't wait for their next book. And that doesn't seem to be the case anymore. I think there's a few things that have happened as far as books. There, funnily enough, the book is.
Starting point is 01:02:14 industry has not shrunk to the degree that people believe it has. It's actually grown in some years. And the reason is, there's a couple of reasons. One is that audiobooks have opened up a whole new, a whole new genre of reader. And then, and most of those, you know, it's like the werewolves and the, and the vampires and all that stuff. You get these kind of subcultures that have risen up as a result of the ability to write books like this. And the other thing that's happened is, um, I mean, I think one of the worst things that's happened is actually book talk and TikTok as a result of, because I don't think that they're driving a lot of the sales to some of the worst writing out there. And people will call me an asshole for saying this and they have before. But I just, it really saddens me when.
Starting point is 01:03:04 Meaning like just sort of low rent genre fiction. Look, we all love a good, a good crime novel or a romance novel, whatever. but there are also things, I believe with every ounce of my being that the reason that we are supposed to write these stories is to make people think. We are as a storytellers, our job is to hold a mirror to society through a story. You know, Ayn Rand said this. She was like, Arn Rand, she said that to her novel is a way to make society think about things.
Starting point is 01:03:37 I think she pushed people to try to think in a certain way. But to me, that is the whole point of all of what we're doing: to elucidate some truth about human nature or the world, to help us make sense of why we're here. And it's why we consume these stories. It's why we want the emotion. You know, whenever you're pitching a story in Hollywood, people are always like, what's the character's emotional drive? And at first you hear that and you're like, what are you talking about? It's set in space, it's great. And then you realize it could be set anywhere. It doesn't matter. It's about the character and how we relate to the characters.
Starting point is 01:04:12 And I think that, which is why there's never, you know, there's never a great movie about billionaires, because no one can relate to being a billionaire. Or at least most of us can't. And I think that what's both been great is there are more readers today who do consume in different forms. But at the same time, the sad part is that the greats are not
Starting point is 01:04:47 read like they once were. And there isn't that sort of monoculture moment where, you know, some genius just, you know, drops some work that lands like a thud and rocks everybody. Correct. But I still think that there is something perennial about writing books that has withstood the kind of gestalt in which everything that we consume was sort of uploaded in the last 24 hours. Like, there's a staying power. You know, if you write a great book, it can make an impact in a way that, you know, other forms of media still can't. I completely agree. I just, I wish that people would put down their phones a little bit more and stop scrolling, doom scrolling, and just read a book.
Starting point is 01:05:32 I mean, I'm a voracious reader. And at night, I put my phone aside and I pull out my Kindle and, like, you know, try to go through a book a week at least if I can. And I read a lot of older stuff, mostly from the 1950s, 60s, 70s, a lot of old sci-fi, because I just feel like it was the height of it. And I find it so much more rewarding than even most TV these days. This episode is sponsored by Rivian. When I think back on some of my fondest memories from
Starting point is 01:06:08 childhood, 100% of them happen outdoors, on mountains, in lakes and oceans, getting muddy in the local creek, riding my bike around the neighborhood, basic good stuff that leaves me thinking a lot about what kind of world we're leaving behind for the next generation. And this, in a nutshell, is what Rivian is all about. They're an all-electric vehicle company founded on a simple idea, keep the world adventurous forever. I've been around RJ, the CEO, and his kids, and it's so clear to me that this is his animating purpose, but he's not just thinking about them. He's making decisions based upon what our kids' kids' kids will inherit, which I love. And that philosophy is just deeply embedded in everything Rivian builds. These are zero tailpipe
Starting point is 01:06:57 emission vehicles without sacrificing power or performance. The interiors use thoughtful, sustainable materials that feel premium and intentional. And the first 10,000 miles are powered by 100% renewable energy with a growing charging network doing the same. It's not about choosing between exploring the world and protecting it. Rivian is like a passport to both, meaning that when I'm driving the vehicle Rivian loaned to me, I'm not just driving through the world I love. I'm driving for it, which is a pretty special feeling I want everyone to experience. One of the things I hear constantly from people in this community is, I have an idea, but I don't know where to start, to which I say, whatever you imagine is holding you back exists only in your imagination.
Starting point is 01:07:48 Let's say you have an idea to start a coaching practice, perhaps a creative project, a business, or a course, whatever it is. A great way to turn imagination into reality is by giving it a home online. And that is where Squarespace comes in. Squarespace is the all-in-one website platform designed to help you stand out and succeed on the worldwide web we call the internet. You can claim your domain, build a beautiful site, and run your business all in one place. It's actually insane how easy it is, especially when I think about my past experiences, working with designers for months at great cost. With Squarespace, you can start with their AI design partner, Blueprint, or choose from a library of award-winning super stylish templates and customize everything with simple drag-and-drop
Starting point is 01:08:36 editing. No design skills, no coding required. And if you're offering services, consultations, coaching, events, Squarespace has built-in tools for scheduling appointments, sending invoices, and collecting payments. It's everything you need to turn an idea into something real. Head to Squarespace.com slash Richroll for a free trial, and when you're ready to launch, use offer code Richroll to save 10% off your first purchase of a website or domain. We have this fascination with these tech titans. We can't get enough of interviews with them. And there is a weird, we hope we're going to learn something. It's like this, oh, maybe I could be that person someday, or what would it be like to be a billionaire? Like this lurid, you know, kind of
Starting point is 01:09:27 fascination with how these people live their lives that lures us into kind of a romantic relationship with them, while also, like, that's the cognitive dissonance piece. Like, we know that Sam Altman is most likely steering us off a cliff. Yes. And yet, we're like, what's Sam Altman doing? What's he up to? Is he a genius? What's happening?
Starting point is 01:09:49 And then we're using OpenAI every single day. It's like, yeah. It's insane. And the history of humanity is to just barrel forward and break things and we'll deal with the repercussions later. But the repercussions in this context are so exponentially beyond anything that our species has ever faced. And yet we're still really not course correcting
Starting point is 01:10:10 for this? Well, I think it's the first technology in human history that can wipe out human history. I don't believe that even nukes could have done that. There's a world, maybe potentially, but, you know, all of the studies that I've read, all the research I've read, State Department reports, so on and so forth, you know, you only need 150 people to survive for society to flourish. Society can come back from that. And, you know, the predictions were always, you know, we still have a billion people on the planet. Maybe Antarctica looks like Los Angeles, but the planet still survives most of that. And the same with chemical weapons.
Starting point is 01:10:49 Like chemical weapons, they tried it in World War I. And the reason it didn't work out was because of the wind, because the wind would blow the chemical weapons back in the Germans' face. And so that didn't work. AI is, in my opinion, the first technology that could literally wipe us off the face of the planet. And I think that where we've been very lucky until now is whenever something goes wrong, it is not catastrophic. You know, hundreds of thousands of people died in Hiroshima and Nagasaki. And then we were like, wow, nuclear bombs are really dangerous. You try to make sure
Starting point is 01:11:30 that doesn't happen again. And the same with other technologies, and so on and so forth. The question is, with AI, will it be too late once we realize, oh, that was a bad idea? What's your suspicion on that? My suspicion is that if you look at technology in terms of warfare, every new technology we build is one that can be used for both good and bad, right? And in the beginning, in the early days, when we were living in caves and stuff like that, when we realized we could kill something with a rock, we could kill one person, maybe a few, and then you were probably going to get killed with someone else's rock. You get to this point where guns come along and you can kill more people, you know, maybe a few with the ones that you stick the bullet in, in the olden days. Then you get to machine guns, you can now kill hundreds, and so on and so forth. Each new technology allows us to kill more and more.
Starting point is 01:12:36 So now we get to nukes. We're at hundreds of thousands. I believe AI is in the billions if used correctly by someone nefarious. And so the question is, what is that number? And who is the person that ends up using it? Because what's happening with technology, and this is what I mean when I say people don't think small enough, is that fewer and fewer people can create the thing that can destroy the world.
Starting point is 01:13:04 And the thing that can destroy the world is even smaller than the thing from before. And so my prediction is that we will have an instance where it is used in a catastrophic way. And the question is, does it kill 100,000 people or does it kill 5 billion people? And after that, there will be safeguards put in place, likely. But the question is how many people die in the process. And what is your sense of how it would kill people? Like, is it just, you know, creating a meltdown at the local nuclear facility? Or like, what is the means by which people are going to die as a result of it?
Starting point is 01:13:50 Well, if you asked me to put my screenwriter hat on for a second, like, yeah, that's the classic, like, you know, the power gets shut off. There's a State Department report that says if the power gets shut off in America, 95% of society is dead within a year. A lot more people, you know, hundreds of thousands, are dead in the first few weeks. Like, you cut your finger, there's no medicine that's coming to you. The water gets dirty. You die of, you know, dysentery.
Starting point is 01:14:18 There's all these different things that happen. Literally just turning off the power grid. Yeah, there's the just turning off the power grid. Yeah, literally, nothing else. Super easy. The thing that people, there's that great saying that there are nine hot meals between anarchy and, sorry, between society and anarchy. I think we're down to about four hot meals at this point.
Starting point is 01:14:41 I don't think we'd make it to nine. We're just on a trigger. You know, after COVID, everybody's ready to pounce. I don't think it would take that many days. And I think like. And we've learned that there will be no comity among men. Like, you know, we will just immediately pivot to antagonism. Completely.
Starting point is 01:14:59 One thousand percent. And we never think it all through. Like, I, you know, I was working on a movie about, like, an apocalyptic movie right before COVID. And so I was doing lots of research into all the things that could go wrong. And I found out about COVID in China before it was like a mainstream thing. And so I was like, oh, my God, we have to, I was like, this is going to be really bad. And we have to get food.
Starting point is 01:15:22 And I stocked the basement. My wife, my sister, everyone thought I was out of my mind. Like pre-NBA suspending its games. Way, way, way, way. Like two months before. Wow. And it was just luck. It wasn't like I was some foresightful genius.
Starting point is 01:15:38 It was just literally like, oh, my God, that could be really bad. And I'd been living in my own head about all the things that could go wrong in society. But you know what I didn't do? I didn't get toilet paper. I didn't get an extra can opener. I didn't get masks because I didn't know. So you can't even plan for the worst-case scenario. or extra bottled water and whatnot.
Starting point is 01:15:57 I had a lot of chips and pasta. But I think that the other thing is like there are things that an AI could do today that could kill billions of people. Like, for example, have you heard the stories about the bank robberies where they fake the bank's senior manager's voice and they call to do a transfer? Have you heard about these? I haven't heard about this. There's versions of this.
Starting point is 01:16:25 There was one, I think it was somewhere in Europe, in Italy or Sweden or something like that. But there was a transfer that was done because they faked the boss's voice using AI and told them to transfer $50 million or whatever it was to another account. So imagine that you have an AI that says, okay, we are going to poison the food supply or poison the water. And they call, oh, can you ship this to the, or whatever, or they have them change some numbers or something like that. There are so many scenarios where you could, just literally using a social engineering hack from an AI, do things.
Starting point is 01:17:10 There's all the drone warfare stuff. There's a million things that could go wrong. And we can't think of them all. And so how do we plan for them? Yeah, I have no idea. I have no idea. But on the subject of AI, screenwriting, and pandemics, the kind of inciting incident for me reaching out to you, it took us a while to get our schedules in line for you to come here, was on the heels of me listening to Scott Z. Burns's Audible, I guess it was a podcast, like a limited series, sort of like an audiobook or an audio documentary, in which Scott Z. Burns, legendary screenwriter, responds to this seeming desire out in Hollywood and in the world to come up with a sequel to the movie Contagion. The movie Contagion sort of had this second life during COVID because there was so much fidelity in that movie to kind of what we all experienced.
Starting point is 01:18:14 I remember watching it at the very beginning of the pandemic and I've now watched it again since then and I was like, this movie is incredible. It's a great documentary. It's just unbelievable. Yeah. And everybody wants a sequel to it. Scott's like, yeah, but I just, I can't see any compelling reason to create a sequel to this. I can't think of a premise that would be worthy of the time and investment that it would take to create a movie. And he goes on this journey thinking like, well, what if I use AI to come up with a valid premise? And it's this really engrossing kind of deep dive into what AI can do in terms
Starting point is 01:18:31 of creative storytelling and, you know, Steven Soderbergh is part of it. And there's one point where Scott reaches out to you. And you're sort of the AI optimist in this equation. So explain, like, your role and how you use AI as a writer. Well, I will say that I consider
Starting point is 01:19:11 myself a technology realist. I can see the good and the bad and all of it. And I think every technology does have good and bad. You know, I think that some technologies have more bad than good. Social media, for example, is one of those. Cars are great,
Starting point is 01:19:30 but 1.2 million people die every year, still to this day, driving cars. And, you know, nuclear bombs, nuclear power, you can just go through all of them and there's always a good and a bad for it. So I do see the good and the bad. I think as far as AI goes,
Starting point is 01:19:47 you know, massive job loss, the potential end of humanity. It's like we're actually having this conversation. But just put that aside for a second. However, I also think it allows us to tell better stories quicker, to think about new ways of telling stories and enables people to tell more stories. I believe Hollywood is about to enter its MP3 moment. And its MP3 moment is where, you know, anyone with a computer or an iPad or whatever it is will be able to make a movie that looks like Mission Impossible in their bedroom.
Starting point is 01:20:30 There will be good parts of that, that we can all tell those stories. There'll be bad parts of that. And to me, you know, I had a writer's room at Netflix for a show I was doing right when ChatGPT came out, and after the room was done, I was like, oh, could I make my own writer's room? And I've done that and I do that for my own projects. And it's not like I've replaced people's jobs, because I wouldn't be able to hire my own writer's room for my own projects. But it is, I think there's incredible benefits to it. I mean, I tried to count the other day. I mean, I think I use it like 5,000 times a day.
Starting point is 01:21:09 I have agents that are helping me write screenplays and fact-checking books and all these different things at the same time. And I needed a website. I was doing some work on a script, and whenever you use Claude and things like that, the quotes are always straight quotes, and they should be curly quotes. I couldn't find a website that would curl them and didn't have ads. So I just used Replit and I made my own website, curlmyquotes.com.
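The straight-to-curly conversion he built a site for is a small enough job to sketch in a few lines of Python. The function name and the quote-direction rule below are my own assumptions, not how his site actually works:

```python
import re

def curl_quotes(text: str) -> str:
    """Convert straight quotes to curly ("smart") quotes.

    Heuristic: a quote that follows start-of-text, whitespace, or an
    opening bracket is an opening quote; every other quote closes.
    """
    text = re.sub(r'(^|[\s(\[{])"', r'\1“', text)  # opening doubles
    text = text.replace('"', '”')                  # remaining doubles close
    text = re.sub(r"(^|[\s(\[{])'", r'\1‘', text)  # opening singles
    text = text.replace("'", '’')                  # apostrophes / closing singles
    return text

print(curl_quotes('"Hello," she said. It\'s "fine."'))
# → “Hello,” she said. It’s “fine.”
```

The heuristic covers most prose, though edge cases like the leading apostrophe in 'tis would be curled the wrong way.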
Starting point is 01:21:38 It's like, I couldn't have done that before. And, like, I think that it enables you to do more in less time and tell better stories. So in the audio documentary, which is called, What Could Possibly Go Wrong? I think it was actually What Could Go Wrong. You enter the picture and you kind of counsel Scott on how he can create his own writer's room, essentially, by giving birth to a bespoke series of AI bots, each of which has its own personality, its own backstory, its own degree of expertise. And so Scott creates this writer's room.
Starting point is 01:22:18 There's a virologist and there's a conspiracy theorist. And then these people, and he's got his studio head and his agent, and these people are all in communication with each other. And there's like a whole ecosystem that gets created. And this is something that you've done, you alluded to in what you just shared, which is absolutely fascinating. Like, I'd never thought of AI in that context. Like, I use it as a research tool, but as somebody who's writing a book right now, like, I'm very reluctant to share, like, my actual prose with an AI. Like, I don't want any kind of, like, I want to write my book.
Starting point is 01:22:53 I want it to be a human-created book. I want to take advantage of AI as a research tool, but I don't ever want to be accused of, like, AI having its fingerprints on anything that, like, I've written myself. But I don't think it's, so I'm not going on to Claude and saying, write this book for me, because it couldn't do it. It just, it couldn't. But I do go along to Claude. So one example of the way I use AI is I just finished this book with Dwayne Johnson on The Company, the mafia in Hawaii. And we have 5.5 million words of research for the book. Newspaper articles from the 1940s all the way to the 1980s that were all taken from slides, like microfiche and so on and so forth.
Starting point is 01:23:41 I have a researcher whose name is Nick. We call him Nick 2.0. And we went to the National Archives. We got all the court documents, boxes that had never been opened in 50 years, thousands and thousands and tens of thousands of pages of court documents. When I wrote American Kingpin, there was no AI. So what I would do is I had an Excel spreadsheet that had all the dates and the times and everything, but I would have to remember what to search for. Oh, like, oh, he was wearing orange. So I'd search for the color. And then I would piece it all together, right?
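The pre-AI workflow he describes, a spreadsheet of dates plus searching the research for a remembered detail like "orange," is essentially a keyword-in-context search. A minimal Python sketch (the function name, context width, and sample text are mine, not from his actual tooling):

```python
import re

def keyword_in_context(corpus: str, term: str, width: int = 40) -> list[str]:
    """Return every occurrence of `term` with `width` characters of
    surrounding context, so scattered mentions can be pieced together."""
    snippets = []
    for m in re.finditer(re.escape(term), corpus, flags=re.IGNORECASE):
        start = max(0, m.start() - width)
        end = min(len(corpus), m.end() + width)
        snippets.append(corpus[start:end].replace("\n", " "))
    return snippets

research = "The suspect wore an orange jumpsuit. Days later, orange peels turned up at the dock."
for snippet in keyword_in_context(research, "orange", width=15):
    print("…" + snippet + "…")
```

An agent doing this over 5.5 million words would add indexing and semantic matching, but the core operation is the same: find a detail, read its surroundings, stitch the timeline together.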
Starting point is 01:24:14 What I did with this book, and what I do with the screenplays I'm working on and so on that are all based on some sort of reality, is I use Cursor, which most people actually use for programming. And I use it for both programming and writing, and I have my own agents that are specifically designed for what I do. And then I can have them take the microfiche, turn it into text, and so on and so forth. And then I can ask a question. Like, oh, what were some of the interesting things that happened during the trial? Tell me the story about the murder of Monty and Fuzzy. And so it gets all those things. And then I can go read those parts of the transcripts, but I don't have to go through 5.5 million words of research. And so the other thing I do is while I'm writing,
Starting point is 01:24:52 And then I can go read those parts of the transcripts, but I don't have to go through 5.5 million words of research. And so the other thing I do is while I'm writing, So do you know what a TK is? Uh-uh. So in journalism... Oh, like you'll get to it later. You move on.
Starting point is 01:25:07 Yeah. So in journalism, whenever you're writing on deadline and you need to fill something in later, you write TK. So two letters next to each other. And it's the greatest thing I've ever learned in my life, because there's no word in the English language that has a T and a K next to it. So you can search, at the end, right before you go to press, for TK. And if it's like a fact about, like, the number of floors of a building or the guy's, you know, job title or whatever, you fill those in. So usually I would go and I would do find and replace and do the research and so on. So what I do now is, as I'm writing, I'll be like, the person, you know, this guy's walking
Starting point is 01:25:40 down the street on TK Street and he runs into TK guy, and so on. And then I write it and it's all in my style. And then I give it to one of my agents and I say, go fill in the TKs from all the research. And it just goes and does it. And it gives me all the things. So that to me has saved me a week's worth of work. So I'm still doing the writing, but it's filling in the little details. Or, like, I'll remember from reading the dialogue between two characters and I'll write it, and then I'll say, go check it and make sure it's right and fix it if I made a mistake. And so it's doing that. With a screenplay, you know, I'm writing the screenplay, but I'll say, and this is one way a lot of people I know use it, like, this paragraph is too much exposition. Give me 10 versions of how to take the exposition out.
Starting point is 01:26:32 And then it gives it to me. And then I rewrite it again, and that's, it's just thinking. It's not writing the book. It's not writing the screenplay. It's just helping me come up with things. And then at the end of it, one thing I did with the recent book I did, I uploaded the whole book and I said, are there any characters that need closing out? Are there any moments that don't flow? Blah, blah, give me a full critique on it. And it gave me a bunch of notes and I was like, great. And I went back and fixed them.
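The TK sweep he describes, marking every missing fact with "TK" while drafting and then searching for it before press, can be sketched in a few lines of Python. The function name, context width, and sample draft are my own, and his agents go further by actually filling the gaps from the research rather than just listing them:

```python
import re

def find_tks(draft: str, context: int = 30) -> list[tuple[int, str]]:
    """Locate every standalone 'TK' placeholder in a draft and return
    (offset, surrounding snippet) pairs, so each gap can be researched
    and filled before the piece goes to press."""
    hits = []
    for m in re.finditer(r"\bTK\b", draft):
        start = max(0, m.start() - context)
        end = min(len(draft), m.end() + context)
        hits.append((m.start(), draft[start:end].replace("\n", " ")))
    return hits

draft = "He walked down TK Street and ran into the club's TK-year-old manager."
for offset, snippet in find_tks(draft):
    print(offset, "…" + snippet + "…")
```

The `\bTK\b` word boundary is what makes the convention work: it matches the bare placeholder (including hyphenated uses like "TK-year-old") without firing inside ordinary words.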
Starting point is 01:26:59 Can you create bots for the characters, like the Dwayne Johnson character, and then test: is this something this person would say, based upon all of this 5,000 pages of research that you have fed it? Well, so there's a couple of things I did that are like that. So one thing I did was I had an agent, and they're super easy to make. Like, they're really not hard. You can just go to Claude and say, tell me how to make an agent that does X, or skills or whatever. It's not that hard. But one thing I did that was really fascinating recently was I was writing about these
Starting point is 01:27:41 mobsters specifically. And there was some parts I just didn't understand like from a mentality standpoint. So I had an agent that created the characters and then I could interview the characters. They're dead. Yeah. So I can't like. It's fucking wild. So I'm like, what's it like to bury someone alive?
Starting point is 01:28:02 Like, what does that feel like? And like, it's looking at, like, it's got the information I have, the interviews I've done, and so on and so forth. And then it's also looking at research online and everything. And so I'm literally having a conversation with a dead person. Or, what's it like, another example: part of the book and the movie takes place during World War II. So there's this amazing opening of the
Starting point is 01:28:27 book where, I'm not going to give it away, but it's a very visual moment during Pearl Harbor. So I wasn't in Pearl Harbor, right? I could go read a few books and da-da-da-da-da. I said, I want you to be a person who is there in Pearl Harbor on the day that the bombs dropped. I'm going to ask you questions. Like, what does it smell like? What does it look like? What are you hearing? Where are the planes coming from? All these questions. And it gives it to me. And then I go write based on that quote-unquote interview with the robot. Yeah, yeah, that's wild. It isn't a binary thing.
Starting point is 01:29:01 I mean, that's incredibly helpful and powerful. And in the Scott Z. Burns audio documentary, it's like, the premise that AI comes up with for this Contagion 2 is a fucking banger. Yeah, it's great. I would watch this movie. And he was not coming up with that on his own. But I think that the misnomer is this. It's that AI will be able to write the next Cormac McCarthy book. I do believe that is going to happen.
Starting point is 01:29:32 You do. I do believe that. I think that we are a long, which is terrifying. I think we're a long ways away from it. But I do believe we're going to, maybe not a long way, maybe a couple of years. I don't know. And I haven't wrapped my head around what that means. I do have some thoughts and we can talk about it. But for the rest of,
Starting point is 01:29:49 for now, you still need a human to write those stories. You still need a human to direct the AI, because these LLMs have been created on the entirety of everything written in the past, you know, X number of hundreds of years. And I mean this in the kindest way possible. Most people can't tell a fucking story and most people can't write, and they're really bad at it. And so it is being trained on everything, including that. And most of it is that. And so it's not looking at the best writers.
Starting point is 01:30:32 It's looking at all of it. And so. Treating them equally. And it's treating them equally. In fact, it's giving more weight to the worst ones because there's more of them. And so you still need a human to say, that's a terrible idea. That's cheesy. There's a thing that all of the AIs do when you ask them to write a
Starting point is 01:30:52 scene in a screenplay. So one of the classic things in a screenplay is you come into the scene late, you leave the scene early, which means you never walk in the door and you never walk out the door. You come in mid-argument. It can be four lines of dialogue. When you ask an AI to write a scene, even if you tell it to come in late and leave early, it always tells you the thing that you already know. It doesn't know how to do any of that stuff yet.
Starting point is 01:31:19 And I think it's a while away until it does. I mean, I take comfort in that, but as we progress towards an AI that can write like Cormac McCarthy, or in an indistinguishable fashion, the question is, like, what is the human role in the midst of this? And, you know, there was a bunch of hullabaloo around these clips that were shared on social media where it was like a fight scene with Brad Pitt. And it looked pretty realistic, right? But you still look at it and you're like, yeah, I don't give a shit, because, like, I know this is fake and these aren't the real people. And, you know, as human beings, part of the attraction to that type of storytelling is the human element,
Starting point is 01:32:01 is knowing that there was, you know, somebody behind that with a creative inspiration to create that. The question is, does that matter when we are looking at something that's indistinguishable or not? I don't think it does. No, I don't think it does. You think that's sort of a
Starting point is 01:32:32 romantic idea that humans are hanging on to. Because there is, I do feel like right now at least, and obviously we're in early days, you know, authenticity is at a premium. Like, we're sort of tired of the AI slop. And when you see something that you know is real, or a person that you can trust, like, that has value. Let me ask you a question. So you like Cormac McCarthy too, right? We'll use him as an example. If I came along and I said, I found his unpublished novel. It's amazing. You got to read it. And you read it and you're like, this is amazing. And then afterwards I said it was written by an AI. Would you feel differently? Yeah, I don't know. Like, you would still have enjoyed the novel. Right. But then I would be disappointed, I think, upon hearing that news in the
Starting point is 01:33:09 aftermath. Yeah, you'd be disappointed in the aftermath. But my point is that you will appreciate the art equally the same, whether it's written by an AI or the real Cormac McCarthy. And I think that when you look at that fight scene that everyone was sharing, it still had a little bit of AI in there. Eventually it won't. And it just won't. And I think that once we get to that point, if it's a good story, people won't give a shit if it's made by a person or not. And they won't give a shit if it's a real actor or if it's a fake actor. I think that there will be some people in society that will say, like, I can imagine books in the bookstore that say written by a human.
Starting point is 01:33:52 Like, I'm not condoning any of this in any way, shape, or form. Like, if I could build a time machine and go back to 1960 when people wrote on typewriters, I would be happy to. But I just see that this is the future. It does seem inevitable to me, though, that we're in this recursive loop of degradation, because these AI tools are only as smart as what we feed them. And as the internet is increasingly populated with AI-generated material, it's like we're making facsimiles of facsimiles, and we're not seeding it with the best of what humanity has to offer. And we're seeding it with less human creativity and inspiration. So Cormac McCarthy is a one of one, and he comes along and writes in a way no one else does. And that elevates the human spirit, but we need those people to come and kind of refresh how we think about literature, or choose your art form or whatever the specialty is. And if those people don't, and they're not feeding the AI, then we're just training it on
Starting point is 01:34:55 what exists, and it becomes this lowest common denominator thing. And that can't help but kind of degrade it. We just careen towards idiocracy. I think we've careened towards idiocracy. We're past that point. I'm not condoning any of this. I want to say that. To me, I said this on a podcast last year and all these people were mad at me, but I'll say it again, I don't care.
Starting point is 01:35:20 I think Colleen Hoover is a terrible writer, right? And she had, at one point, seven books on the New York Times bestseller list because of TikTok and so on and so forth. And I don't hold it against her. I'm not saying I'm Cormac McCarthy in any way, shape, or form. I'm just, like, I can list a million other terrible writers. But what bums me out is there are so many incredible books by incredible writers that make you think. And that's not the stuff that's at the top of the New York Times bestseller list. But it's because we go to this lowest common denominator now.
Starting point is 01:35:59 And my hope, and this is literally just my hope, is that what AI can do is help guide us back to something that is not slop. And I think if it's going to be slop, well, we don't need AI slop. We've got human slop, right? It's not going to change. It's whatever. But there is a scenario, and this is my hope, is that the scenario allows us to make better stories that push us to think more and so on and so forth. The question is, is Anthropic going to say, oh, we're only going to use the top 1% of writers, and then we're going to train the AIs on that, and then they're going to be more creative and so on and so forth.
Starting point is 01:36:46 Or is that what's going to happen? I don't know. I do believe that no matter what, the most creative people will always have a job telling stories, whether they are coming up with an idea and telling an AI to write the book or whatever it is. I truly do believe that. I think it's going to be like the art world, where you have 50 people that make a living, and the rest of them, you know, paint flowers in their bedroom, and that's it.
Starting point is 01:37:13 But I do believe that that's the case. And to say that we're not going to consume AI content because it's garbage — well, we consume a lot of garbage today. Storytelling is not going away. But the thing that worries me the most, in this shorter-term window of rapid AI advancement, is the incentive structure behind good storytelling, because we are already in a post-truth world. And a great story using AI, with visuals that are indistinguishable from reality, has the capacity to manipulate the masses into, you know,
Starting point is 01:37:56 name your idea, right? So this is something that is easily weaponized. Like, we don't know what's real anymore. We don't know whether that person we recognize, who's saying that thing, actually said it. And even if we're told it's fake, the research shows that we still kind of believe it, even if we're told it's AI. And what is this doing to the human mind, and to the capacity of the human animal to maintain coherent societies? Like, I just see, like... Yes. Yeah, this is how we're going to destroy ourselves, I think, in the short run.
Starting point is 01:38:37 I completely agree. And that's the biggest worry. I think that, you know, in the attack on Iran recently, Iran was using just basic AI tools, like ChatGPT or Gemini, to make fake videos. I don't know which ones they were using, but they were using these basic AI tools to make fake videos of them bombing Israel, destroying it. And then they were putting that on television for everyone in Iran, to make people believe they were winning the war. It's like, you know, there are news clips. There was a news clip — I'm on a text thread with a bunch of screenwriters,
Starting point is 01:39:15 and one of the guys was like, oh my God, did you see Iran is agreeing to all of the U.S. terms? And it was a CNN clip of Jake Tapper. And the only reason I could tell it was AI was because Jake's hair looked too good. But it was an AI clip. And it's, like, enough to convince your very smart screenwriting friend. Yeah. Yeah. And so... It's just terrifying to me.
Starting point is 01:39:35 It's terrifying. And I think, but technologies cometh, and technologies then cometh to take on the other technology. Yeah, I know, but this is not Napster. It's not, but the only way that we survive this is if, alongside the technology that is used to create it, there are other ones that are used to fight the bad ones. Sure, but that is analogous to the detection of performance-enhancing drugs. Like, the advancement is always ahead of the detection system and the correction system. A thousand percent. And I'm not being an optimist in this.
Starting point is 01:40:13 I just know that's where we'll end up. Again, it's the worry of how much bad happens before we figure out how to solve it. And what is your sense of the timeline here? Well, the problem is — what's fascinating is, you know, when I was a kid, I loved computers so much. I used to go on a weekend to the local corner shop in England and get the new coding magazine and write, you know, BASIC code and stuff. And I've always been obsessed with tech, and there's just something about it. I mean, I'm fascinated
Starting point is 01:40:47 by technology from a human standpoint. It is the only thing, I think, that really truly separates us from other creatures. Other creatures make art and music and so on and so forth, but technology is the one thing that we do that really truly separates us. And yet it is inevitable that it will be the downfall of us, and we can't stop ourselves. And I'm fascinated by that. The other thing I'm fascinated by is that we are obsessed with technology because we want to be able to do things quicker and easier and advance more. And yet, when the technology destroys us, the thing that we immediately go back to is
Starting point is 01:41:30 these cavemen who will beat each other up for food and so on and so forth. So there's this crazy dichotomy in the way we as humans work, and I'm fascinated by it. And, you know, if right now there was some nuclear attack and the power went out, you and I would be out in the street with baseball bats once we hadn't had a meal in four days. We literally go to that lowest common denominator as people. We go to our animal instinct. So I'm fascinated by that. And as far as the AI question goes, I do believe there's a scenario where it could be used for good. But I also know that there are a lot of scenarios where it will be used for bad.
Starting point is 01:42:17 And the thing that's crazy is, I've been covering... That's already true. It's already true. It's going to get worse, though. There are remarkable advancements happening, like disease prevention and cures, like just incredible shit. Yes. But there will be bad. Lots of bad. And we haven't really seen the real bad yet. We just haven't. It hasn't come about. There are little video clips and this, that, and the other. There are people who get scammed and so on. But we haven't seen the real bad. It's coming. It's just inevitable. And what's been so insane to witness for me, as someone who has been using technology their whole life and has been writing about it for more than two decades, is how quickly it's happened. And I had this moment recently. So one of the things I do when I do pitches is I create these AI images that I walk people
Starting point is 01:43:06 through as I'm pitching the story. I find it a really great way to tell a story. And I use Midjourney for it. I started using it about a year ago for this. And I went to the bottom of my Midjourney feed, and the first images were so horrifically bad compared to what we can do today. And that's in, like, a year. And so I have never seen a
Starting point is 01:43:29 technology grow as fast as this. It's astounding, and it's exponential. It's going to continue as it gets smarter. And so the question is... Well, it becomes self-improving. It becomes self-improving. On a daily basis, you know, every single day, there's this competition between DeepSeek and Gemini and Anthropic and OpenAI and so on, to quickly get more models and more tokens and more, you know. And the question is: what goes wrong, and when, and how do we prepare for it? And there's no answer to those questions. Yeah. And it is an interesting quirk of the human animal that we know all of this,
Starting point is 01:44:12 and yet we're like, well, we're just barreling forward anyway, because that's what we do. That's the fascinating part. That sort of informs the argument that maybe human beings are just intended to be the sex organs of this new form of life, and that's ultimately the endgame of our entire purpose, and it doesn't really matter whether we survive or not. You know, it's interesting. I heard this story years and years ago from a friend who was at the dinner where Elon Musk
Starting point is 01:44:40 and Larry Page got into the fight about AI. Do you know the story? I don't think so. They were at Larry's house. Elon was sleeping on his couch. And they were talking about AI, way before any of us were talking about it. And Larry Page had said, allegedly, to Elon that, you know, robotics are the future of humanity. Like, that's the next iteration of humanity, of evolution.
Starting point is 01:45:11 And Elon got really mad. He was like, what do you mean? He's like, well, that's it. It's just evolution. And Elon and him started arguing. And, no, Larry Page accused Elon of being speciesist. And I remember at first hearing that and being like, what a psycho Larry Page is.
Starting point is 01:45:29 And then you kind of see it all, and you're like, well, maybe that's it. I don't know. And here we are, and Elon is creating the robots. Basically, like, they stopped creating the Model S and the Model X so that they could allocate resources towards their robotics department. Yeah. And it's like...
Starting point is 01:45:51 We know the future. We know what it looks like. But we're like, yeah, it's going to be bad, but here we go. Yeah. And what I find so fascinating, I'm sure you've had this conversation, is that exact moment. What you just did? You laughed. So when I talk to people, at a dinner party or something, me and my wife, and it'll come up,
Starting point is 01:46:08 I'll tell them all my theories of what it is. And then there's a silence, and then there's a laughter. You know when someone almost gets hit by a car and they laugh right afterwards? Like, there's a silence and this laughter. And then people will be like, so, what do you guys want for dessert? And it's like, there's a powerlessness. There's a total powerlessness that we have. And so it comes out as grinding our teeth at night, or however it comes out.
Starting point is 01:46:34 But it's so fucking fascinating that we know it's going to go wrong. And I know it's going to go wrong. And after this, I'm going to go home, and I'm going to sit at my laptop, and I'm going to talk to an agent. And it's like... Yeah. And that's just a weird cognitive dissonance kind of thing. Because what's the other version? I go live in the woods and wait for someone else to do it?
Starting point is 01:46:57 You're balancing these polarities of, like, existential dread while also being kind of amazed that we're alive at this period of time, where we're giving birth to this thing. And it's like you're watching a movie. What's going to happen next? Like, it's exciting. But how old are your kids? Nine and ten.
Starting point is 01:47:15 Nine and ten. Okay. So my kids are older. But your kids being younger, that's an even more pressing case. Like, how do you think about the world that they're going to inherit? And what do you say to young people who are wondering, where should I place my attention? What should I be doing right now to prepare for this thing that none of us can really imagine — what it's going to look like, and what the skill sets are that we're going to need?
Starting point is 01:47:42 I truly don't know the answer to that. I truly don't. If I were me, okay, and I was just coming out of art school at this point, I would have dropped out of art school, probably, to be quite frank. But I truly do believe that the most important thing that humans do is tell stories. And I would continue to try to figure out how to tell stories, good stories, that have an impact on society that is positive, quite frankly. I know this may sound like bullshit, but it's truly what I believe. So to me, if I were coming out today from art school, or trying to get a job, or whatever
Starting point is 01:48:31 it was that I was doing or going to be a writer, my number one goal would be how do I tell stories that try to make sure society doesn't end up in the worst catastrophic place, whether that's I wouldn't have done this but like becoming an influencer that's trying to talk about the good and the bad and the didla da la it's like
Starting point is 01:48:51 it's the only thing that we can do to control it is to tell stories about what it could be and to make people think about it and so my answer is that you know and it's why I'm still doing it now because I believe like
Starting point is 01:49:09 I believe that you know the the the The mafia book that I'm writing with Dwayne, it's sure, it's about the mafia, but it's about colonialism. It's about right and wrong and what happens to people. It's like it's about much bigger things. And I think that we as a species,
Starting point is 01:49:30 that's how we learn, is through story. And the only way to try to save the species is to tell better stories. I think that chuckle that we have when we're confronted with this reality is a reflection of our discomfort with uncertainty. And the truth of the matter is that the world has always been tremendously uncertain. And the human mind likes to fabricate rules and create structures that foster the illusion of certainty and predictability,
Starting point is 01:50:03 but nothing really is certain. It's just that with AI, the uncertainty factor is through the roof, all of a sudden. And we're just fundamentally wired to be uncomfortable with that. Yeah. But in terms of like the anxiety levels of the adults who are trying to make sense of this, perhaps there's something to be said for just being in a relationship with your relationship with uncertainty, you know, and trying to find some peace in that and understanding that, you know, it's been uncertain all along. I totally agree. I mean, it's why people are drawn to were drawn to religion because they're not as much anymore because that created boundaries around uncertainty, right? It said there are rules and there's a reason and you have to trust in God and
Starting point is 01:50:51 Jesus and whatever. The other thing that this religion you believe in says. And it's as a society, as societies, we've pulled away from that at a time when we probably need it more than ever. And I think it's a really astute point to say like you have to be okay. It's like that saying, in in AA, like about being okay with not being okay? Yeah, there is a, there is a, somebody who's been in AA for a long time. Like, it's all about powerlessness. Like, we really don't have control over all the things that we think we do. Like, we can control our behavior and our kind of reaction or response to things that happen.
Starting point is 01:51:33 And given an acceptance of that powerlessness, how do we find peace, happiness, meaning, etc. So I read this Nietzsche quote that was it was actually really interesting. So here it is. So what if someday or night a demon were to steal after you into your loneliest loneliness and say to you this life as you now live it and have lived it, you will have to live once more and innumerable times more? Would you not throw yourself down and gnash your teeth and curse the demon who spoke thus? or have you once experienced a tremendous moment when you would have answered him, you are a God and never have I heard anything more divine? And the point that he's trying to make is you know that you are living the life that you're
Starting point is 01:52:20 meant to live if you would be okay living it over innumerable times. And so that to me is like I am so fucking lucky that I ended up. telling stories because it's what I'm meant to do. I don't know why. I don't know. It's literally, it's, it gives me a calmness. It makes me, it's why I love writing books. Like most people hate writing books. I fucking love it. There's nothing I enjoy more than the challenge of telling a story in 110,000 words. And, and I think if you told me I have to write books every day for the rest of the, like, the civilization or whatever it is, like I'd be the happiest person ever. And I think that to me that's really what it's all about. And it's if you if you know that you're doing
Starting point is 01:53:12 that you that you're comfortable with the fact that you are living this life that you're meant to live, then that's what it's all about. That's the meaning. And that's man's search for meaning. That is it. And it could be like maybe you love branding, you love running, you love climbing towers, like you love podcasting, whatever it is. It's about finding that thing. And I think that to me and that to me is what it's all about until the AI kills us all. I think that's beautifully put. I would only add to that that perhaps amidst this uncertainty or being in the eye of this AI hurricane, that maybe we can appreciate the moment a little bit more.
Starting point is 01:53:51 Like if this is truly, everything's transient, but now it just is like, you know, it's on a shorter time frame, right? Like, okay, well, if this is going to go away or things are going to look very different in an unpredictable fashion, like in a very short period of time, let's try to really appreciate the lives that we have right now. And like yourself, I feel extremely lucky to be able to do what I do and sit across people like yourself and learn from them. No, I completely agree.
Starting point is 01:54:20 And I think one of the biggest, somebody said this to me 15 years ago, when the 17 years ago, so when the iPhone first came out, They had a very prescient point, which was my biggest worry is not the screen and this and that. It's that we will never have moments where we just sit anymore. And boy, were they right. And I think, like, one of the things, and it's hard, it is so hard. But one of the things that I really have been trying to do is, like, just put it away. Even if it's for five minutes.
Starting point is 01:54:53 Like, people used to sit on their porch and just think, you know? Yeah. And now, like, you're like, you're like, you're a. a urinal and you're like, oh, I can, I can read an email, you know? And it's like, it's, it's, I think this is honestly one of the worst inventions in humanity. It is literally, like, I wonder if Steve Jobs were still alive if he'd be like, oh, I did a great thing or, or like, I did a terrible thing. And, and I think it really is a, it's, it's just, you got a, you have to fight against the technology, even though we all need to use it to be in the society we live in. I think that's a good place to
Starting point is 01:55:29 End it. But I would ask you this final question. How is it collaborating with The Rock on this project? He's just a sweet guy. He's like really thoughtful. Always thinking about other people. He, he's funny. He doesn't send text. He sends voice notes. And so you get these long voice notes from Dwayne. And it's like, it's just, they're fun. You know, he's like telling his thoughts and his, you know. That tells me that he appreciates the fact that he holds a certain. stature, like if you get a voice memo for him, you're going to save that thing. It's his voice. He's speaking to you. Well, I think it's also, it's supposed to like just texting. He also, he like works out like three hours a day or something. So he's just, I think he's doing, he's doing, he's in the gym and sending voice meows out. Rather than like he's like, you know,
Starting point is 01:56:15 lifting 400 pounds with one arm and then sending you a voice. But he's great. He's been an amazing collaborator. And he really got in there. We were like, we did a lot of the interviews together, like with the, uh-huh, the former mob bosses and. And Scorsese and Scorsese has been, he's, he's, You know, it's, that was a surreal moment. I was with him in New York recently and we were breaking the story and batting around ideas and talking about good fellas and this, that, and the other. And he's 83, but like he's got more energy than me. You know, it's, it's interesting.
Starting point is 01:56:46 It's like, you know, they're, that's an, it's an example of, like, the people that are doing the thing that they should be doing. Yeah. And they're really good at it. And they're, you know, and that, that's. It's been really fun to be a part of that. In watching this Scorsesey documentary and there's that part where they're, you know, they're talking to his childhood friends and they're talking about how he's literally
Starting point is 01:57:11 storyboarding movies as this little kid and everyone's like, what are you doing? Like, clearly this guy is doing exactly what he's supposed to be doing. Like, he is living the fully expressed version of who he was always meant to be. I think that's what it's all about. Like, it's about finding that thing and it doesn't matter what. what it is. I mean, you don't want to be a serial killer if you're meant to be, but like, but like it's about finding your place on, on the stage in this play. Like, and, and that's it.
Starting point is 01:57:42 Because we don't know. We will never know. And AI is never going to tell us why we're here. Like, we're never going to figure that out. So we just have to go with the thing that we, we are comfortable with in that moment. And that's usually the thing that we want to be doing. And that's what it's all about. I think that's a good place to end it, man. Yeah. Beautiful way to conclude this conversation. Thanks, Nick.
Starting point is 01:58:06 Thank you for having me. It's a lot of fun. Yeah, very cool. The movie, obviously, is going to be a couple years before this thing is out. It's going to be a couple years. Yeah, the book is coming out next year. Yeah, so I have a podcast I'm doing with Dick Costello and Paul Kodrovsky. Dick was the former CEO of Twitter.
Starting point is 01:58:22 It's called The Nick Dick and Poll Show. we talk about all sorts of stuff like this every week. Nice. All right, man. We'll come back and share some more when the book's coming out. Yeah, I'd love to. All right. Thanks, Nick.
Starting point is 01:58:32 Thank you. Peace.