Passion Struck with John R. Miles - Douglas Rushkoff on Survival of the Richest: Don’t Believe Their Mindset EP 192

Episode Date: September 22, 2022

Professor and author Douglas Rushkoff (@douglasrushkoff) joins me on Passion Struck with John R. Miles to discuss how human behavior has become controlled by technology, as the "Mindset" of the obscenely wealthy is pushing us towards a world where they scale money infinitely, resulting in a fractured system with enormous wealth and income inequality. Douglas Rushkoff is a professor of media theory and digital economics at Queens/CUNY. Named one of the world's ten most influential intellectuals by MIT, he has been exploring, celebrating, and bemoaning tech development since the early 1990s. He hosts the Team Human podcast and has authored 20 books.

--► Purchase Survival of the Richest: https://amzn.to/3S3YMTi

What I Discuss with Douglas Rushkoff
Today's episode is a bit different from the ones we typically do on the show. Douglas and I will talk today about what the super-rich have termed "the event." Their euphemism for environmental collapse, social unrest, nuclear explosion, solar storm, unstoppable virus, or malicious computer hack that takes everything down. According to Douglas, tech billionaires are preparing for this event by configuring their doomsday bunkers. But today's podcast isn't only about this, it is about the impact of the future of technology. It's about the future of humanity and the need to unite intentionally to solve the world's issues.

-► Get the full show notes for all resources from today's episode: https://passionstruck.com/douglas-rushkoff-survival-of-the-richest/
--► Prefer to watch this interview: https://youtu.be/0xUKGmJxbWE
--► Subscribe to Our YouTube Channel Here: https://www.youtube.com/c/JohnRMiles
--► Subscribe to the Passion Struck Podcast: https://podcasts.apple.com/us/podcast/passion-struck-with-john-r-miles/id1553279283

Thank you, Dry Farm Wines and Policygenius, For Your Support
Dry Farm Wines have No Chemical Additives for Aroma, Color, Flavor, or Texture Enhancement. Dry Farm Wines - The Only Natural Wine Club That Goes Above and Beyond Industry Standards. For Passion Struck listeners: Dry Farm Wines offers an extra bottle in your first box for a penny (because it's alcohol, it can't be free). See all the details and collect your wine at https://www.dryfarmwines.com/passionstruck/.
Policygenius provides free quotes tailored to your needs with support from licensed agents, helping you get insurance coverage fast so you can get on with life. Save 50% or more on life insurance at https://www.policygenius.com/

Where to Follow Douglas Rushkoff
Website: https://rushkoff.com/
Instagram: https://www.instagram.com/douglasrushkoff/
Twitter: https://twitter.com/rushkoff
Team Human Podcast: https://www.teamhuman.fm/
Medium: https://medium.com/@rushkoff
Facebook: https://facebook.com/rushkoff/
LinkedIn: https://www.linkedin.com/in/rushkoff/

--
John R. Miles is the CEO, and Founder of PASSION STRUCK®, the first of its kind company, focused on impacting real change by teaching people how to live Intentionally. He is on a mission to help people live a no-regrets life that exalts their victories and lets them know they matter in the world. For over two decades, he built his own career applying his research of passion-struck leadership, first becoming a Fortune 50 CIO and then a multi-industry CEO. He is the executive producer and host of the top-ranked Passion Struck Podcast, selected as one of the Top 50 most inspirational podcasts in 2022.
Learn more about John: https://johnrmiles.com/  ===== FOLLOW JOHN ON THE SOCIALS ===== * Twitter: https://twitter.com/Milesjohnr * Facebook: https://www.facebook.com/johnrmiles.c0m * Medium: https://medium.com/@JohnRMiles​ * Instagram: https://www.instagram.com/john_r_miles * LinkedIn: https://www.linkedin.com/in/milesjohn/ * Blog: https://johnrmiles.com/blog/ * Instagram: https://www.instagram.com/passion_struck_podcast * Gear: https://www.zazzle.com/store/passion_sruck_podcast  

Transcript
Starting point is 00:00:00 Coming up next on the passion struck podcast. I go to this crazy resort in the middle of the desert. They bring me out in a golf cart to this other part of the facility and I'm waiting in my green room for them to come with a microphone. They clip on you and you go out and do your talk. Instead of bringing me out, they bring these five guys into my green room and they sit around this little round table and they start peppering me with these questions about the digital future. And finally, they get to the question, Alaska or New Zealand. Where should they put their bunker for the coming apocalypse? And the question that ended up taking up the majority of the hour was, how do I maintain
Starting point is 00:00:39 control of my security staff once my money is worthless? Welcome to PassionStruck. Hi, I'm your host, John R. Miles. And on the show, we decipher the secrets, tips, and guidance of the world's most inspiring people and turn their wisdom into practical advice for you and those around you. Our mission is to help you unlock the power of intentionality
Starting point is 00:01:02 so that you can become the best version of yourself. If you're new to the show, I offer advice and answer listener questions on Fridays. We have long-form interviews the rest of the week with guests ranging from astronauts to authors, CEOs, creators, innovators, scientists, military leaders, visionaries, and athletes. Now, let's go out there and become PassionStruck. Hello everyone, and welcome back to episode 192 of PassionStruck, recently ranked by FeedSpot as one of the top 40 most inspirational
Starting point is 00:01:35 podcasts of 2022. And thank you to each and every one of you who come back weekly to listen and learn how to live better, be better, and impact the world. If you're new to the show, thank you so much for being here, or if you would like to introduce this to a friend or family member, we now have episode starter packs, both on Spotify and the Passion Struck website. These are collections of our fans' favorite episodes that we organize into convenient topics
Starting point is 00:01:58 to give any new listener a great way to get acquainted with everything we do here on the show. Just go to passionstruck.com slash starter packs to get started. In case you missed my episode from earlier in the week, it was with Rachel Hollis, a three-time New York Times best-selling author and host of the highly popular Rachel Hollis podcast, which has over 100 million downloads. And last week, in case you missed it, I had on Dr. Scott Barry Kaufman and Jordyn Feingold, and we discussed their newly released book, Choose Growth, and explored how to transcend things
Starting point is 00:02:33 like trauma, self-doubt, and worry. I also had on former monk and Hindu priest Dandapani, and we explored his new book, The Power of Unwavering Focus, and its impact on living an intentional life. In case you missed my solo episode from last week, it was exploring the concept of free will and whether it actually exists or not. And I wanted to say thank you so much for your consistent ratings and reviews. They go such a long way in helping expand the popularity of the show,
Starting point is 00:02:59 as well as increasing our following. We would so appreciate it if, when you love any of these episodes that I just discussed, or today's, you give us a five-star rating. And I know our guests love to hear comments from the fans. Now let's talk about today's episode, which is a bit different from the ones that we typically do here on the show. We are going to talk today about the super rich and what they have termed "the event," their euphemism for the environmental collapse, social unrest, nuclear explosion, solar storm, unstoppable virus, or malicious computer attack
Starting point is 00:03:31 that takes everything down. The super rich, according to our guest today, are making preparations for this event in the form of configuring their doomsday bunkers. But today's podcast isn't only about doom and gloom. It's about the impact of the future of technology, it's about the future of humanity, and the need for all of us to come together with intentionality so that we can solve the world's issues. Now let's talk about
Starting point is 00:03:56 today's guest. Douglas Rushkoff is a professor of media theory and digital economics at Queens/CUNY. He was named one of the world's 10 most influential intellectuals by MIT. He hosts the Team Human podcast. He has written many award-winning books, including Team Human, and today we're going to discuss his new book, Survival of the Richest. Thank you for choosing Passion Struck and choosing me to be your host and guide on your journey to creating an intentional life. Now, let that journey begin. I am so excited today to welcome Douglas Rushkoff to the Passion Struck podcast.
Starting point is 00:04:36 Welcome Douglas. Hey, I'm really glad to be here. Well, before we get into anything else, I just wanted to say congratulations on the launch of your new book. I'm going to put it up here. We'll put it in the YouTube video and show notes, of course, but I know how big a deal it is to go through the process of launching one of these. So congratulations. Oh, thanks.
Starting point is 00:04:56 Well, we're going to talk about that a lot today. We are also going to touch on another one of your books, and I'll put that up there as well called Team Human. But before we get into that, I always like the audience to get to know the guests a little bit. So I thought first, I'd ask this question. I saw that you study human autonomy in the digital age. What exactly does that mean? Well, it's interesting. I mean, when I was growing up, I could never decide if I wanted to be a theater director
Starting point is 00:05:24 or a doctor, oddly enough. And I was always interested, because, you know, George Bernard Shaw, the great player, right, was a doctor, Anton Chekhov, was a doctor. And for me, the thing that always made the two things tied together was both theater and medicine is asking, what makes a person alive? What is a living human? And what is a human in action? What's the difference between a live being and a person alive? What is a living human? And what is a human in action? What's the difference between a live thing and a dead thing?
Starting point is 00:05:48 And while I was really interested in theater, I got even more interested in digital technology in the late 80s and early 90s. Because here were these almost thinking machines, these technologies that would allow us to connect and do things and express really the human will in all of these strange new ways. But by the mid-1990s, I felt like business was taking these technologies and reversing them. So instead of looking at how technologies are going to express what it means to be human, these technologies were trying to get humans to
Starting point is 00:06:25 do stuff. Sticky websites, manipulative algorithms, we were using tech on people. So at that point, I became really concerned with how much are people going to be able to think and act of their own volition in a world where everyone is trying to program them. So that's why I kind of decided what my goal is to really study and if anything promote human autonomy, human will, human ingenuity and creativity in an age where it seems like they want to auto tune that out of our vocabulary.
Starting point is 00:07:00 Well, I think it's a noble cause. And as I think about the technology that's hitting us, almost all of it has driven, I think, unintended consequences, or maybe it's intended consequences and we'll get into this as this podcast unfolds. But either way, it's driving more individuality than we've ever had. And I really believe that this as a backdrop, because it's not as if this happened overnight. It's been going on for three plus decades now. But what I think it's bringing a rise to is the chronic states of loneliness, helplessness, hopelessness, that billions are feeling right now. I was shocked when I read this report
Starting point is 00:07:46 that showed over the last 20 years period, they measured loneliness across the planet and 33% of the globe, 113 countries responded, said that they felt lonely, 50% of resilience. So it's just rampant. It's interesting. You could say on one level, it's for the last three decades, since digital technology is amplified it.
Starting point is 00:08:09 But you could also say it's the last three centuries, since the invention of this fictional thing we call the individual. But it's really not we live in community. We live with each other. Being human is a team sport. It's a collective activity. And back in the Renaissance,
Starting point is 00:08:25 we kind of retrieved this ancient Greek notion of the Vitruvian man, the individual. And it was great for individual rights and one man, one vote, and everybody matters. But the idea that we're each in our own little trapped in our own perspective and fighting for our own interests over everyone else's, and then that dovetails with capitalism, with technology, and with the depression that so many people are feeling today. I completely agree with you, and it's one of the whole reasons I started this podcast was to try to do my part, find a way to help people get back to human connection, get back to living this life that we're supposed to be living that we're not.
Starting point is 00:09:06 I wanted to ask you a follow-on question. You currently teach at the City University of New York, but for a long time, you taught at NYU. What were some of the courses that you taught and was there one that stuck out more than the others? Yeah, I mean, the main thing I was doing at NYU, I started something called the Narrative Lab in their interactive telecommunications program, which is sort of their internetty school, but a kind of creative internetty school.
Starting point is 00:09:35 And what I was always interested in, really from the time I was very little, was the standard story structure that we always use in pretty much everything. The one that Aristotle identified. Crisis climax relief. We go up the incline plane of tension, the character makes a series of choices, they reach a climax, and then the thing is solved. So whether you're watching Door the Explorer, or even a Christopher Nolan movie, you get to the end and wooo, made it. And I felt like what it was doing was addicting people to endings that we only felt satisfied if the story ended. And I was thinking, well, we live in a world
Starting point is 00:10:13 where I don't want the story to end. I want the story to keep going. And we have these new technologies that unlike books and movies, these digital technologies, which are more game-like and hypertextual and people moving from place to place, what would it be like to develop narratives, satisfying narratives that don't end, that don't complete, but encourage people to keep going. What does this story like that encourages you to go outside
Starting point is 00:10:43 and meet more people and do more things and make more connections rather than ache for finality? Because I was really concerned as I looked at a lot of the religions we have that were really reaching for an end, reaching for an apocalypse or an Armageddon, all of our movies reaching toward endgame and big bang things. And so many businesses that people were starting
Starting point is 00:11:06 not to run businesses, but to exit. Everyone says, I had a great exit. Well, why did you exit? It's a great business. Why did you need to have an exit? Well, I think all of what you just said is a great lead in to the core topic we're gonna talk about today
Starting point is 00:11:23 and how this is going to unfold throughout this podcast. So I wanted to start out by referring to something that was in Teen Humans. And I believe in social impact theory, which means history repeats itself and then feeds itself. And I believe we go through these different periods and we're entering one now. But let's just talk about an example of this from your book, Team Humans, in Statements 76, you discuss how in Japan, our ancestors placed stone tablets in the ground
Starting point is 00:11:56 saying don't build anything below here. Can you talk about what happened when we did? Yeah, it's a lesson I'm trying to make about respecting our ancestors, the tech bros that I write about in my new book in particular. They everyone wants to feel like they're inventing new IP, that they're the smartest things ever, that they know best and we can just start over from square one. No lineage, no heritage, no credit, no footnotes or citations. It's just me. I was born. I'm
Starting point is 00:12:26 a genius. Here I go. The young, heubristic architects and nuclear power plant designers in Japan, they saw these old ancient stones that said, do not build below this point. I mean, why would the ancients have put markers there? Because there was a very, very rare flooding situation that would happen. But it happened enough every couple of centuries that they realized we better tell people because our houses got torn down and our great, great, great, great grandparents houses got torn down. Let's put some markers here so future generations know don't go don't try it. It really is. Nature is big. These people come the new ones and they say, oh these were ancients, building little huts. They don't know. They're primitive cave people. Who knows? We're going to build our nuclear power plant
Starting point is 00:13:20 right down below those stones. We'll leave the stones there. Ha. And then of course, what happens? Giant tsunami comes, wipes out the Fukushima plant that nuclear devastation of that is still being dealt with. I mean, just because we don't hear about it on the news every day doesn't mean they're not still digging around in there trying to solve that problem, which is spewing radiation. You got radiated fish in Seattle from that still measurable. It's the hubris. So for me, and I'm rising of you know, Ari Wallach's book about us being good ancestors to the future, it's also us respecting the ancestors of the past that they did have something to tell us. We're not so inventive, so modern, so technological that the laws of nature no longer apply to us.
Starting point is 00:14:03 Yeah, I agree. There's a lot we can learn from long path. I wanted to use what we just talked about to introduce a billionaire because we're going to be talking about the super rich and elite today. But I want to talk about Ray Dalio, worth 19 billion now. And he's been on the book tour and podcast scene recently doing conversations about how the world is on the brink and dangerously close to international conflict in the areas of trade technology, geopolitics and economics. And he even goes out on a limb and says there's a 40% chance of global turmoil.
Starting point is 00:14:40 What are your thoughts on his predictions? Yes, we live in a very brittle situation, whether it's our very long supply change, the which we get our food and our stuff, the economic uncertainty that's created by huge divisions of wealth, and a taxing the climate the way we do. But what I would challenge any Ray Dallio or anybody else who's got billions of dollars to do is to look at the impact of their money where their money is sitting right now. Even the Gates Foundation, they're doing wonderful things are trying to from the top down, but where is the massive money of the Gates Foundation
Starting point is 00:15:25 invested? It's in that same S&P 500 index fund that Ray Dalio's money is in and all his funds are in, and that capital, sitting where it is in mining companies and oil companies and other companies that really are trying to return shareholder value with little or no regard to the planet and its people is doing more damage than we can do in the best speaking tour in the world. In the end, the kind of environmentalism and economic and civilizational stewardship that he is calling for begins at home. Yeah, I bring him up only because he also is a study of the social impact theory and he's seeing
Starting point is 00:16:07 these patterns reemerge, especially with potential conflict between China and the United States, primarily because China plays this very, very long game. They think in terms of decades and centuries, and we think of it as tomorrow, and when the two don't align, you've got conflict. Well, I wanted to just mention that because you had a defining moment. I think we all have many of them in our lives, but yours came in a really interesting form. You were given this opportunity to speak with this very high valued price put on it. So you probably were thinking, I can't turn this one down, but you end up landing having to drive
Starting point is 00:16:53 three hours into the desert. And then what ends up happening? Well, I go to this crazy resort in the middle of the desert and wait till they come from me in a little golf cart with a little Patagonia branded those kind of guys. They bring me out in a golf cart to this other part of the facility and I'm waiting in my green room for them to come. The microphone, they clip on you and you go out and do your talk.
Starting point is 00:17:17 Instead of bringing me out, they bring these five guys into my green room and they sit around this little round table and they start peppering me with these questions about the digital future. And this I'm kind of used to because most of the kinds of people who hire me to speak, they're not really looking for my philosophy. They're not looking for what we're talking about. They're looking for what do I bet on to make the most money this year. So they first they start with a theory, a merbit coin, which cryptocurrency is going to go up faster, virtual reality or augmented reality,
Starting point is 00:17:48 which is going to dominate Web 3. And then finally they get to the question, Alaska or New Zealand. Where should they put their bunker for the coming apocalypse? And they said, interesting like Ray, they had said that they had had people do research and they had concluded that there was a 20% chance of a world-ending catastrophe in their lifetime. So they were taking 20% of their money and putting it in alternative survival strategy, 20 brain space programs and C-steading and uploading their brains to silicon chips and bunkers. And the question that ended up taking up the majority of the hour was, how do I maintain control of my security staff once my money is worthless? So they're playing these walking dead scenarios. They realize, okay, they're hiring.
Starting point is 00:18:42 They already have Navy seals, former Navy seals hired to come guard their various facilities, to fly in on their SWAT helicopter things, whatever they have, to be there, to guard them. But then they're thinking, well, if civilization really collapses, why is that Navy seal just not going to take the whole facility from me and run it himself? Especially if he's in charge of the Navy SEALs. And it was so funny. They started to go through things like, well, will robots be around by them that could do discipline or could we put shock collars or implants that everyone gets for one reason, but then we use it for discipline also. When push came to shove, I finally said, you know, the way to prevent your security staff
Starting point is 00:19:20 from killing you in the future is to be really nice to them today. And I joked, I said, maybe you should pay for your head of securities, but my daughter's bought mitzvah today, because then it'll be hard for him to shoot you later. And they laughed it off. But what I meant was, if you start treating other people well now, maybe you won't have to go to a bunker. Maybe you will have to defend yourself from the rest of humanity. What's interesting, your meeting happened
Starting point is 00:19:49 before Colin O'Brady's, but if you're not familiar with who Colin O'Brady is, he's an adventure athlete. He's set all these world records, but he was asked to speak similar to you. Instead of going to the desert, he was flown to New York and thought he was giving this big talk.
Starting point is 00:20:05 Ends up getting ushered in into this residence where again, he's surrounded by five or six and he thought the whole conversation was to hear his exploits and it was really a philosophical discussion of how do you find and pursue your Everest in life. So a little bit different, but a little bit more positive, a little bit more positive, but eerily similar. The thing that is interesting is I think that these bunker fantasies of these technology billionaires, that is their Everest in life. That the global disasters and climate change are not the reasons why they're doing it. They're the excuses for them to do something they've wanted to do all along. In the new book, I tell this story about hanging out with Timothy Leary back in the day.
Starting point is 00:20:53 Timothy Leary, the great counterculture figure from the 60s. I knew him in the 80s and 90s and he was reading one of the first books about the coming technologies. It was called The Media Lab by Stuart Brand about MIT's new media lab that was going to be about the digital future. And he reads this book and he's writing it, and I'm thinking he's loving the idea of a digital future and what we're going to do. Soon as he's done, he goes, blah!
Starting point is 00:21:15 And he throws the book across the room like it's this poison. I go, what, Tim? What happened? And he goes, first, less than 3% of the names in the index are women. That shows you something. And it's interesting. It really does. It goes, first, less than 3% of the names in the index are women. That shows you something. And it's interesting.
Starting point is 00:21:26 It really does. It goes, these are boys. And then he says, second, these guys want to recreate the womb. They want to use all this technology to make up for the fact that their mommies couldn't anticipate their every need. Now they want to go and cocoon in a virtual reality bubble where robots and computer programs bring them everything, but that they want to separate from humanity.
Starting point is 00:21:48 And that's the problem. And that's the way that they build technologies for us, not with this idea of separation and loneliness as a bad thing. For them, it's a good thing. They want to be alone and above and apart. They want to be above us, kind of like of like gods above humanity rather than just with us. Yeah, I think this is something, yeah, I think this is something that you call radical self-reliance. Am I getting that right? You could. Yeah, they call it, it's an interesting term. They call
Starting point is 00:22:18 it self-sovereignty. That's the phrase they use. And it's an interesting word. I keep thinking as I think back on these guys, they keep wanting to live outside themselves. Do we want to do what we used to call going meta? Remember going meta, it's like there's theater, then there's meta theater, the frame around the frame. Everyone wants to level up. So you get Peter Teele, the guy who started PayPal, the multi-billionaire, he runs Palantir now.
Starting point is 00:22:44 He wrote a book and it's called Zero to One. And the idea is that you and your business shouldn't be competing with everyone else. You should be one order of magnitude higher than everybody else. Stuart Brand told all of us, we are as gods and may as well get good at it. One level above normal people. Mark Zuckerberg, when Facebook starts failing, what does he do? Meta. He's going to be one level above. And as you talk to these people, listen to them, you realize,
Starting point is 00:23:11 or Ray Kurzweil wants to put his brain, one level above humanity in a main friend computer somewhere, what they don't get, what your podcast is about, is passion is about getting down and in, you know, passion is incarnate. It's in the body with other people. It's being not extra human, but truly human, feeling your heart, feeling your blood, feeling the sweat. This is where the core human experience resides, not on a silicon wafer. Yeah, I agree with you. Absolutely.
Starting point is 00:23:44 In the book, you talk about this in the words, quote unquote, the mindset. How prevalent do you think this mindset is? I mean, it's pretty prevalent. In some ways, it's contagious. The mindset, most simply, is this kind of a tech billionaire belief that with enough money and technology, they can escape the catastrophes that they're creating with money and technology. It's the idea that you can somehow build a car that can escape its own exhaust.
Starting point is 00:24:14 It just doesn't work. I couldn't believe I just read this, a great article by Cory Doctoreau about the Epson printer company. How they make these printers that seize up after a certain number of pages automatically, they lock so that you have to buy a new printer. And they have some excuse that there's a certain part that might wear out and they're worried for you. So they just lock the computer.
Starting point is 00:24:34 Don't let you replace the part. And I'm thinking the guy at the company who makes that decision, he's aware of climate change. He knows what's going on. Yet he is able to say, I think I can make more money selling extra printers to people than the damage is going to cause me. I'm going to have enough money to somehow escape what I'm doing to the rest of the world. And that mindset in some ways trickles down to all of us, particularly at times like COVID, when we're like, maybe
Starting point is 00:25:03 I'm just going to stay home, get an Amazon doorbell with a little camera, get fresh direct, and then grub hub, and door dash, and Amazon Prime, and I'm gonna kind of hold it out. But yeah, but the mindset is really, it's a hubris that somehow you can escape the externalities of your own kind of misdeeds. Yeah, I would just on to that is then if they have this mindset, then why are these super elite who are envied for being so powerful?
Starting point is 00:25:34 Why do they appear to be so afraid? We'll be right back to my interview with Douglas Rushkopf. I'm often asked, why get life insurance? We pay hundreds of dollars per year to protect our homes, our cars, and even our phones. But too many of us aren't taking steps to protect our family's finances. Things like mortgage payments, private student loans, and other types of debts that don't just disappear if something happens to you. A life insurance policy can provide your loved ones with financial cushion they can use to cover those costs and it can provide you with peace of mind that
Starting point is 00:26:09 even in a worst case scenario they'll be protected. And policy genius is an insurance marketplace that makes it easy to compare quotes from top companies like AIG and potential in one place to find your lowest price on life insurance. You could save 50% or more on life insurance by comparing quotes with policy genius. Option start at just $17 per month for $500,000 of coverage. Just click the link in the description or head to policygenius.com
Starting point is 00:26:37 to get personalized quotes in minutes and find the right policy for your needs. Head to policygenius.com to get your free life insurance quotes and see how much money that you could save. Now, back to my interview with Douglas Rushkopf. Because I think they've realized that it's going to be trickier, trickier than they thought. Used to just be, they do all this bad stuff, they make addictive programs on their iPads, and then they're sure to send their own children to Rudolph Steiner schools where they're not going to touch the technologies
Starting point is 00:27:08 that they happen to be making during. I met a lot of those guys. I don't let my kid use Snapchat. The ones who are programming it for my kids don't let their kids touch it. They're up in the hills of California with a goat share and organic food and Rudolph Steiner education and no technology and a Faraday cage around their house.
Starting point is 00:27:24 So they're like, wait a minute. I think once the forest fires came, food and Rudolph Steiner education and no technology and a Faraday cage around their house. It's like, wait a minute. I think once the forest fires came, once and nothing against right or left, but most of these guys are left. And once Trump was elected, I think they started to think, uh-oh, things are getting a little out of control. And the climate might be such a systemic problem that I can't build in a state in Malibu that's going to be capable of resisting forest fire.
Starting point is 00:27:50 I might actually have to do something about this. So half of them think, okay, I'm a tech bro, I know what to do and they come up with a software stack to save humanity. All we have to do is clear-cup these forests and build my thing. I've got gain B, plan two, kind of a sim city video game version of remaking society as I think. Build tunnels under the city and throw sulfuric particles into the sky. And don't worry, I'm going to do it. There's those guys, the great reset sort of people who have way too much faith in their technology's ability to do things without unintended consequences.
Starting point is 00:28:28 And then my billionaires who were more like, how am I going to get off this sphere before it collapses? Well, as I was reading your book, and I'm not sure if you're familiar with this book, I'm going to reference, but it reminded me of the 1957 novel Atlas Shrugged. How would you compare the character John Gault, who's in that to the people that you're describing? Well, it's interesting. In John Gault's industrial age,
Starting point is 00:29:01 you could always externalize the damage of what you were doing to someone else. Whether it's the conquistadors coming to South America and looking at the natives, like they're just trees, so we can clear cut the people as easily as the forest. Whether you're using enslaved labor in Africa or China, it's always somewhere else. And you can justify what you're doing to the others somehow. And never truly pay, I mean, maybe in afterlife or in your conscience or in your soul, you pay the price, but never actually physically pay the price. There was always enough room, enough territory, enough space that you could keep going. Go
Starting point is 00:29:42 west, go further, get more. These guys don't have anywhere to escape to anymore. They're kind of at the limits of capitalism. The other thing that's different is these guys, these days, they have what's called exponential growth. In John Gold's day, you had linear growth. You had Dale Carnegie growth, Rockefeller kinds of growth. So you could amass a certain amount of wealth, but the division of wealth wasn't quite as extreme. It's like right now we have not just capitalism. We have capitalism on digital steroids.
Starting point is 00:30:15 We used to say the stock market was the tail wagging the dog of real businesses. Now it's the derivatives market and the derivative of the derivative of the derivatives market. It's wagging the tail of the stock market, that's wagging the tail of the marketplace, that's wagging the tail of reality. The derivatives exchange, which is just for people who don't know, it's a way of kind of going meta on the stock market. Instead of buying shares of a company, you buy kind of theoretical shares a year in the future, six months in the future so that you can compress all that time and leverage more time a lot faster.
Starting point is 00:30:47 The derivatives exchange got so big that in 2013, the derivatives exchange purchased the New York Stock Exchange. I mean, think about that. The New York Stock Exchange, which is an abstraction of the marketplace, which is an abstraction of real people's goods and services and needs was consumed by its own abstraction. So, the John Gault thing is still there, but it's so much more leveraged. He has so much more impact, not just on other people, but on the environment in which he himself lives.
Starting point is 00:31:19 And that's what these guys are realizing. They need to escape from the environment that they made. Well, we're going to get a lot more into that. One of the things I wanted to dive into before we get there is that I believe we have entered into this current order where we have a few super rich and relatively a huge amount of poor. It's an imbalance in our history that I don't think we've ever seen as a species before, this income inequality that we have. And what do you think it really tells us about our current system? And I just want to preface this with you and I, we're talking about Seth Godin before we came on the podcast
Starting point is 00:32:06 and he talks about climate change is really a systems issue. I think we have a systems problem, not only about that, but in a much bigger way that's causing this inequality. Oh, yeah. We are not up against the hard limits of our global environment. We are up against the hard limits of a digital balance sheet. We have an economic system that was, for better or for worse, it was invented by monarchs in the 12th and 13th century. Really, at that time to prevent the rise of the middle class, people were trading with each other. They were starting to get wealthy. The former peasants were becoming trades people. They were going to the market and they had local monies and all these different
Starting point is 00:32:53 little industries. The problem was the aristocracy was getting poor as the people were getting wealthy. So they invented really two things. One was the charter monopoly or what we now think of as the corporation. Charter monopolies has without a charter, you're not allowed to work in that industry. So a little Joe, the shoemaker can't make shoes. He's got to go work for his majesty's royal shoe company as a wage laborer. So now he's no longer an entrepreneur or a business person. Now he's an employee selling his time. Second thing they came up with and this is the big one is central currency. Central currency was the only money you're allowed to use. So if you want to trade with someone else, you've got a borrow central currency from the treasury at interest and pay it back. And that was
Starting point is 00:33:36 a way for wealthy people to get wealthy simply by having money. So they were the exclusive purveyors of the cash that we were allowed to use. And that's really where we are today. We are living in an economic system where in order for it to work, it has to grow. Money is created at interest, so more money has to be paid back. That's why the politicians always say the GDP has to grow. It's not because we need more stuff. It's not because we need to get more food out of the ground or we need more models of our iPhone. It's because that number has to grow. It's not because we need more stuff. It's not because we need to get more food out of the ground or we need more models of our iPhone. It's because that number has to grow. So it's as if is a great quote by Korsypsky. The map has replaced the territory.
Starting point is 00:34:14 The balance sheet, the spreadsheet matters more than reality. And what Seth is showing us in his new book is this doesn't add up anymore. There is a real world that can no longer serve this abstracted economy. The tech bros are taking the other approach. They're saying, oh no, let's make the economy even more abstract. If it's not working with regular money, let's make digital money. Let's make Bitcoin and Ethereum and all those things. They're going meta on it. We're set the saying, no, no, no. Let's stop worrying about these abstract things and get back to reality. If you go the tech bro way, if you go into the Bitcoin way, what do you end up with?
Starting point is 00:34:55 It's insane. We are literally burning the planet as a way of showing our faith in a digital currency. You know, if you looked at it from space, you would see, oh, look at those humans. They're burning the equivalent of the nation of Argentina in order to prove that they love this sort of digital money. They're literally taking their planet, their energy, and turning it into digital symbols that do nothing for anybody. Yeah. And unless these industries start changing the way that they're doing things radically, we're
Starting point is 00:35:33 going to keep earning these fossil fuels. And there's not an alternative outside of nuclear, which even if we had nuclear, you know, no physics expert, but I've studied enough of it to understand we don't have the infrastructure to support everyone that's going to need it in the future. So I mean, it would be nice. And I love AOC and Green New Deal and all these plans and ideas. And even Elon Musk has a lot of them, but we can't transition that quickly to an entirely renewable energy system. It's not like everyone get rid of your cars and buy Teslas.
Starting point is 00:36:08 Everyone get rid of your generators and get the solar panel. If we tried to transition in just a few years from what we have now to a renewable energy grid, we would need to extract so many minerals, so much rare earth metals and lithium and batteries and things from the earth. We would destroy the planet in that 10 years really. And I say sadly, but I don't mean it sadly,
Starting point is 00:36:28 because I think we get a better life. We've got to turn down the dial a bit. We've got to use less energy as people and as companies. And it's not that hard to do. We don't need more stuff right now in my block. Every single one of the 10 of us on my block has a minimum viable product lawnmower from Home Depot or Lowe's. What if as a block, we had one high quality lawnmower that we shared. What's the problem with that? The only problem
Starting point is 00:36:59 with that, we have to share. Well, that's great. We get to know each other. None of us is using a lawnmower all day. I'm sure one lawnmower can serve 10 houses. It pretty easily. The real problem with it is when I suggest this is, well, what about the lawnmower company? What about them? Say, well, what if they have to make less lawnmowers? It's gonna be okay.
Starting point is 00:37:20 And then maybe we'll work less days per week. You know, it's like we don't need as much stuff. The only reason we need to buy and sell as much stuff as we do is to support the balance sheets of companies, not the needs of actual human beings. Well, and I saw that firsthand when I was in Fortune 50 world where your shareholders are driving in 50 world where your shareholders are driving a completely different metric than what you should really be concerned about. I mean, if you're constantly looking at quarterly profits, you're going to do that at the expense of everything else, where we should be looking at a lot of other important things as we're looking at these companies and how we value them.
Starting point is 00:38:05 And I think that's a big component of how things need to change. And that's why a lot of times I like private companies, more than public companies. Yeah, I'm interested in talking with you about that actually. I mean, because you've been in those rooms where it happens, I get these keynote talks. I did a keynote talk for a Fortune 50 company, it was a food and beverage company. And right before I went on, the CEO was there 50 company, it was a food and beverage company. And right before I went on, the CEO was there, and it's in front of all these shareholders. And the CEO is shouting, what was it? 5.2, 5.2, which was their percent growth target for
Starting point is 00:38:38 that year. And I'm thinking, I got up there and I said, geez, you're one of the biggest companies in the world making such revenue. If your only objective is that you have to grow in order to be okay, if you're already one of the 20 biggest companies in the world and you have to grow, aren't we going to run at a room? Well, I was at a tech company and while I was there, at the time they had five different presidents running different size businesses, kind of consumer products, small-medium business products, large-enterprise public products, professional services products, software products. And what shocked me was that the way that they were leading these individuals is by having
Starting point is 00:39:23 them do this enormous competition. And it was a huge competition. If you were on top of this list and they would get this huge trophy every quarter, you would make probably at the end of the year, $40, $50, $60 million. If you came in last place, you got fired. So there's this enormous pressure that's put on these people to act in ways that completely go to lining their pocketbooks, but doesn't go to really rewarding what the customer really wants, what the planet really needs, et cetera.
Starting point is 00:39:55 So. And you can company that way in the long term. That's what I don't know if you've ever had Alfie Cohn on this show. A brilliant business theorist, education theorist, and he distinguishes between intrinsic rewards and extrinsic rewards. And he talks about the word compensation. When you're compensating someone for something, that means the thing they're doing they don't like, because they're getting compensated for it. But you should have his intrinsic rewards. So, rather than giving someone extra money for doing well, you give them more responsibility in the
Starting point is 00:40:24 company. You give them more autonomy. There's back to my word. You let them get to the inner skunk work. So let them go to the innovation lab. You reward them with a more central place in the company culture, rather than with cash or vacation, which is nothing to do with the actual work. I want to use this as a way to segue because the company I was with who was doing a ton of mergers and acquisitions, which is the title of your second chapter. What I found working in these large companies and then I moved to a smaller one.
Starting point is 00:40:56 So let's say I was with a 60 billion company, I moved to a billion dollar company, which is still. Yeah, I'm in pot. Yeah. But this company at the time had the most sophisticated network of placing digital ads in front of consumers. And one of the things that I found extremely worrisome
Starting point is 00:41:17 is we did this deal with another company that allowed us to actually see inside your home. And this was 9, 10 years ago. And we could tell from your device that you were in the house, we could tell from this other company that you were watching TV. And we could marry the two so that as you were viewing what you were viewing on the TV, we could send you ads almost instantaneously to do it. And I use that story as this unintended consequences
Starting point is 00:41:51 of what technology is doing. And it's completely changed, I think, human culture in ways that we never thought it would. You talk about this in chapter two, but you also bring it up in your book, Teen Human, where you write that the internet fostered a revolution, but not a renaissance. And I wanted to ask, why is that? Honestly, I would say it's because in the mid 90s, we as a society decided that the internet was less important for the way it could unleash the collective human imagination,
Starting point is 00:42:25 then for the way it could save the NASDAQ stock exchange. So the whole bias, energy of dynamic, of digital technology changed. When I was interested in the internet, it was about, oh my gosh, what are we going to do together on this thing? Then, wired magazine came along and kind of recast it, reframed the internet as this business opportunity. But once you're betting, once it's a business, you're no longer looking for people to be creative and unpredictable. You want people to be predictable so you can make a bet and get the outcome that you're looking for.
Starting point is 00:43:01 So, we reversed the polarity of these technologies. We created websites, we called them sticky websites in order to get more eyeball hours from human beings. And even think of that phrase eyeball hours. That means we're using these machines to operate the people to get eyeball hours from them on our screen. That was the essential shift, that instead of people programming technology, technology is programming people. And we created these feedback loops. I mean, it sounds like magic,
Starting point is 00:43:35 but it's not. We develop algorithms until the algorithms get as much attention and money from that person as possible by any means necessary. Go. And then so we've got thousands of little algorithms out there looking for how to manipulate us with behavioral finance. I'm sure you've had people on who do behavioral finance and looking for what they call exploits. And it's funny when I was a hacker in the 80s, an exploit was a hole that you found in the machine. I remember a kid found an exploit, we called it,
Starting point is 00:44:06 in the computer system that ran the thermostat at the shopping mall, and he hacked into it and turned off the air conditioning in the mall. That's called an exploit, you find the hole. Now what they talk about is how do you find exploits in humans, in the human psyche, that you can leverage to get people to pay more attention to whatever it is you want or to click on whatever it is you want.
Starting point is 00:44:29 So once we have a multi trillion dollar technology industry there to operate humans, you've got a very different set of outcomes. I think it's really scary because there are some people who believe that our evolution is technology that we are going to be controlled by computers. And I found it very unnerving a few years ago. I wrote a piece on what AI was doing and how some of the foremost scientists in the world who were creating it were completely unnerved because it's not acting in the way that it was intended. It's learning differently than humans do. And I guess one of the easiest ways to explain this is if you look at a Tesla
Starting point is 00:45:11 and it's driving, the system learns what is the action that I can take that has the greatest probability of protecting the person in the car, which could be a decision that is completely different than what the human would do in it, because they might purposely put the car, which could be a decision that is completely different than what the human would do in it, because they might purposely put the car into an accident because they're seeing what might unfold in front of it. But I think that this is something we have to really watch for because if not what's going to be upon us, I think, is what was portrayed in the matrix. Yeah. I mean, the reality is, you know, computers or AI's, they will do what we tell them to.
Starting point is 00:45:49 And it's really hard to hold in your head all the considerations involved. So you can tell a computer, get my daughter into a good college and the computer can find out, oh, you know, applicants who have only one leg get into college because college is saying, oh, isn't that remarkable? That person did so well even with one leg. So the computer cuts off your kid's leg. You know what I mean? That's the kind of logic that we're facing. They're going to do whatever, whatever we say in all of the possibilities
Starting point is 00:46:20 that wouldn't have occurred to us in the moment aren't going to occur to the machine because they can't take everything into account that a human being will. And interestingly enough, I mean, these are the funniest stories in the new book, actually, is that a lot of the technologists I spoke with, the one thing they're afraid of is artificial intelligence.
Starting point is 00:46:40 They believe that they've gone meta on us, that they're above us, that they're... We humans are just controllable nonsense. And they'll always can hire their Navy SEALs or get ray guns or something to get rid of us. They're not so scared of us. They'll leave the planet do whatever. What they're scared of is AI, because they think that AI can go meta on them, that AI can control them the way that they mean to control us.
Starting point is 00:47:05 And one guy came up to me at another kind of private retreat of nicer, but also elite tech dudes. He came up to me, he was a big social media CEO guy. And he said, are you really comfortable with all that stuff you've been writing about AI lately? I'm like, what do you mean? Because you're being really openly critical of AI. There's a, yeah, what's the problem?
Starting point is 00:47:27 He goes, well, what about when AI takes over and they see what you've written about them? What do you think they're going to do to you? And I just like, well, what do you mean? And then he said, I don't write anything. I don't even tweet about AI because I don't want them to know. And I said, well, if AI is that smart don't even tweet about AI because I don't want them to know. And I said, well, if AI is that smart, aren't they going to be able to look at your posting pattern, know that you've been avoiding it and be able to infer how you feel about AI?
Starting point is 00:47:54 Because that's what AI's do. And this job drops. Like he hadn't even considered the techniques that he's using on all of us that they're all using to throw us into categories and to statistical buckets. That of course, there's a population of people doing exactly what he's using on all of us, they're all using to throw us into categories and statistical buckets. And of course, there's a population of people doing exactly what he's doing and the AIs are going to be able to figure out how he feels. And that was the moment I realized they are simultaneously so much smarter than me, right? These guys would do better on their SATs and no more about computers and math.
Starting point is 00:48:22 And I mean, they're're brilliant but they're so stupid at the same time laughably uneducated at the very same moment. I think it's because they drop out of school too soon. They get noticed by a venture capitalist when they're 19. They've got some good idea in their dorm room. They quit college to go be rich and they don't bother to learn history and economics. And most of all ethics, ethics, the kinds of stuff you talk about, the actual, the human to human, what makes us alive, what makes life worth living, that they're like cut off from that.
Starting point is 00:48:53 That's why you look at Mark Zuckerberg's metaverse, all the, these little creatures running around in there, these humans, they're cut off at the waist. They're torsos with heads. What does that mean about their sense of grounding, about their sense of what does it mean to be a human being on planet earth? It's like they don't connect with that. Yeah, I mean, it just scares me. I mean, one, he did the metaverse, kind of as politicians do when they get in trouble to divert the attention to something new.
Starting point is 00:49:21 But to me, it's like Ready Player One, the movie, where you're going to have the society that you have the shell of a body and your mind is constantly in this metaverse instead of being physically present. And when you start losing that human connection, as the grant study showed at Harvard, people become extremely unhappy and they lose joy and fulfillment in their life because we need to give love and feel love and you can't do that through a cell phone or a keyboard. You've got to do it by interacting with people and we're getting further and further away from that.
Starting point is 00:49:56 Or even with cash, which we think, oh, I could just give money. It's nice to give money, but we found is I refer to the studies in my book is wealthy people that compassion for others, their empathy erodes. They put billionaires in MRI machines and show the pictures of people suffering and the parts of your brain that are supposed to light up when you see someone out suffering, the empathic circuits of the brain, the frontal lobe, they don't light up with these people. It's as if the wealthy have had a kind of a brain damage that when you get to a certain point of wealth,
Starting point is 00:50:28 you lose your ability to identify with other human beings. That's a really toxic effect of extreme wealth. Yeah, and I think the other thing people really need to be aware of, and I'm glad you covered in the book is these war chests that the tech companies have been building to influence and lobby for policy change.
Starting point is 00:50:47 And it's crazy to me that you see these tech companies that have been lobbying these politicians and the politicians are doing what they say. And then it's starting to get out of whack and the population is starting to get angry about it. And they bring them up on Capitol Hill and berate them when they've been buying them for the past 10 years. To me, it's a colossal failure, and you look at this,
Starting point is 00:51:13 and it's some of the biggest companies out there who are the biggest culprits. Each year, is it Facebook that's the biggest lobbyist in the world, or is it Google? And they're just following what the oil industry did before them, and what GM and the auto industry did before them, and the rail industry did before them. This is what you do: you lobby government to get the thing you want. It's just that with digital, it happens so fast and at such scale that it's a little bit more unnerving. But when you actually do, and you've met them too, when you actually meet and speak with these guys, you realize the emperor has no clothes. The technologies are good at
Starting point is 00:51:52 ripping things up, but they really don't do what they promise. They don't work as advertised. And they lead us to devalue the kinds of things that you're talking about. They lead us to devalue our passion. You listen to the way Zuckerberg talks about the metaverse. It's very flat. This is a passionless universe. It's a utilitarian universe. I'm thinking about that now, with the Queen dying. And as an American, I'm like, really, who cares? It's a queen, and she's old. But there was a great writer, well, great for a while, named Walter Bagehot in England, and he wrote a lot about the difference between efficient government and, sort of, spiritual government. He
Starting point is 00:52:36 may have even talked about it in that TV series The Crown. It's great that we have Parliament to do all the efficient things, but people need something else too, and that's what the Queen does. The Queen and the monarchy are there to fulfill our sense of dignity as human beings. And that's what we're really losing in America: our faith in our intrinsic dignity as human beings, just for being alive. It doesn't matter how much you produce or how much you consume. You're okay. Like Mr. Rogers used to say, you're okay just the way you are. You are special. And that's what we need to be communicating to people. Not, you need to get richer, you need to be more like Elon Musk. No, you need to be less like
Starting point is 00:53:18 Elon Musk. Well, the thing that worries me on many occasions, I mean, you could play this two ways. You could say that in some ways we need the Elon Musks of the world because we need their minds to take us to the next level. But in the same way, we demonize them for what they've achieved. Now, it's interesting to me, because you've got people who are physically building bunkers in Alaska, New Zealand, in forests, on preserves, and in other places. And then you've got Elon, who wants to take us to Mars.
Starting point is 00:53:58 And I remember seeing an article last year, I can't remember the magazine, but it might have been Time, where he said we'll be at Mars in five years. So I took that and I asked my Naval Academy classmate, who used to be the chief astronaut, and I said, what are your thoughts about us getting to Mars in five years? And he goes, there's no flipping way. You've got to just think about this. How in the world, when we haven't even figured out
Starting point is 00:54:25 how to get a Volkswagen off the planet on Mars, are we going to land this thing, which is probably going to be the size of two buses, on it and get that thing back off? Not to say that it's not possible, but then once you get there, how do you support this bubble that you're living in, when, as you point out, we haven't even been able to do it in the biospheres here on Earth?
Starting point is 00:54:49 So I think one of the fatal flaws in this whole thing, as you bring up, is that you can be in this contained system, but you're going to have to be serviced by other people who are not in that system. And how do you keep that ecosystem functioning? I just don't get it. Right. And there are a lot of science fiction movies like that, where you basically have Earth now
Starting point is 00:55:11 as the enslaved population servicing this base station or Mars colony where the wealthy get to live. But that's not gonna work either. Honestly, you're gonna have an easier time trying to survive on a climate- or nuclear-devastated Earth than in the toxic clouds of Venus or out on Mars. It is harder. But the problem with Elon, I mean, yes, he's marketed technologies very well.
Starting point is 00:55:37 We were having problems getting people interested in electric cars in America. It's not that he invented the electric car or anything technological, or did anything special, but he had capital and he has marketing ability. And he sold American men on the idea that an electric car is cool. That was a good thing he did. Of course, all of the green savings of every Tesla that's ever been made were more than outdone and undone when he went ahead and bought a billion dollars of Bitcoin. Just that act alone was like 30 years' worth of Teslas gone, pollution-wise. The problem with Elon is that he's what's called an accelerationist, another
Starting point is 00:56:20 science fiction idea: that what we have to do is put the pedal to the metal and let this society burn itself out so that we can get to the new one. What Elon says is that one day there are going to be trillions of people in the universe, and that's fine. And what I'm saying is no. There is no more ends-justify-the-means logic that will work. You can't sacrifice anything now for the future. You really can't. The way to ensure a beautiful future is to start living beautifully and lovingly today. If you're not doing it in the moment, you're not actually doing it.
Starting point is 00:57:12 Yeah, it's interesting. I read this article as I was researching to do this podcast. It was written by a writer in Texas named Jason Rhode. I'm not sure if you've read it, but I'm going to just quote it: what are the tech preppers really worried about? Not death by fire, quake, or ice, not the rising seas or the zombie plague, not the return of Christ or rogue comets. Seen clearly, the calamity that the wealthy fear is democracy returning to the United States.
Starting point is 00:57:40 Every tall tale they tell involves the specter of the mob. And he said the tech preppers understand, at a deep level, that their ill-gotten gains are predicated on an unjust system. Deep in the brain, where reptile impulses live, the tech bros know hoarding is wrong. And I'm wondering if you would agree with some of his sentiment. Yes and no. The mob coming after the tech bros with pitchforks is not democracy. Right. That's something else. That's the revolt of the masses, and revenge. Democracy is a coordinated effort by people to come to consensus. Mob rule, or QAnon people storming the Capitol, that's not democracy. That's a revolt. That's something
Starting point is 00:58:25 else. There's still the infrastructure for a functioning democracy. It has unfortunately been corrupted by money and undermined by would-be authoritarians, but if enough of us decided to participate appropriately in representative democracy, we could very easily minimize the impact of the tech bros. But I don't think we have to go after them with pitchforks, even though they're afraid of that. I don't think it's a matter of revolt.
Starting point is 00:58:56 I think the easier path is to live in a world where we raise kids who don't want to emulate Elon Musk or Peter Thiel or Mark Zuckerberg. You help them laugh at these people. When Mark Zuckerberg publicly says he wants to give back 99% of his money, you say, well, look, Mark Zuckerberg wants to give back 99% of his money. What if he had made Facebook 99% less extractive and harmful? 99% fewer girls cutting
Starting point is 00:59:26 themselves and falling into anorexia. 99% less competition and depression. That would have been a better way than taking all that and then trying to shove the money back in after the fact. I agree that what the tech bros are most afraid of is us, but they've been afraid of us from the beginning. That's what they're running from. That's what they've been running from for 3,000 years. It's there when Francis Bacon, the forefather of empirical science, said that empirical science will let us take nature by the forelock, hold her down, and submit her to our will. So it's basically a rape fantasy of control and domination
Starting point is 01:00:06 that has gone unquestioned to this day. It's not just the witches of medieval Europe who are being held down. It's all of us who feel held down by this tiny population of uber-wealthy technologists. And yeah, there's going to be a comeuppance, but I don't think our best path forward is to attack them. I'm trying to write a book where we can laugh at them and then move on
Starting point is 01:00:32 and get to the serious work of meeting one another again, of not being like them, but being like human beings. Yeah, and that leads me to a question. You say in the book that you came to the realization that the billionaires are actually the losers, and I think it goes into what you were just talking about, so I just wanted you to expand on that. Yeah, I mean, what do they get if they get what they want? The ultimate virtual reality goggles, where they have simulated, algorithmic girlfriends loving them in whatever ways real people don't.
Starting point is 01:01:04 They get the predictable, closed, dry, isolated techno-bubble with nothing, nothing unexpected. Life is the opposite. Life is about gaining the resilience to welcome the unexpected, to look for novelty, so that the idea of getting up in the morning is not to have the most predictable life as measured by your stock savings, but to have a day where I don't know who I'm going to see, who I'm going to touch, what kind of encounters I'm going to have, who I'm going to help. If what you wake up thinking is, how am I going to get something from someone else today?
Starting point is 01:01:45 That's such a sad way to wake up, rather than: Who am I going to be able to help today? Whose life am I going to change positively? Who am I going to connect with? It's the opposite. So if you define winning, if you define play, really the play of life, back to what we started with, if you organize it through the mindset of, I am playing in order to win, then you're going to end the game. That's what winning does.
Starting point is 01:02:15 If you're playing to win, you're playing to end it. If you're playing to play, then you're playing to keep the game going as long as possible, because you love playing, and that's what real life is about. How long can we keep this game going? How many people can I play with? How deeply can I play? How meaningfully can I play? And that's not what the billionaires are thinking. The billionaires need to get to their endgame. They need to win. We've already won here. Yes, well, one thing that history has taught us is that human societies are drawn together in times of peril. And I kind of wanted to end our interview by talking about something that you bring up at the end of Team Human. And in your
Starting point is 01:03:02 statements 97 to 99, you say that things may look bleak, but the future is open and up for invention. And then in statement 99, you say, you are not alone, none of us are. And so what I wanted to end talking about is, how do we reassert our humanity together? I mean, the first step is: find the others. When I say find the others to people, first they think, and it's true,
Starting point is 01:03:31 find the other people who are looking to connect. Find the others who understand that life is not a competition, that life is a collaboration, that evolution is not the story of survival of the fittest individual. Evolution, even as Darwin wrote it in his book, is the story of how different species collaborate and cooperate to ensure mutual survival. That's the book he wrote. That's what we learned in that book, The Secret Life of Trees, that trees are sharing resources under the soil through a network of mycelia. That's what's actually going on here.
Starting point is 01:04:04 So find the others who get it, and then find the others who don't and connect with them. Make eye contact in real spaces. Breathe with other people. Funny, the word conspiracy, what it really means is to conspire, to breathe together. It's so funny that we would think of breathing together with another person as itself a conspiracy. Well, it is. It's the great, beautiful human conspiracy. We are alive together, conspiring in this great game, and ultimately against the systems that have been imposed by people who have long since left the building, systems to control us
Starting point is 01:04:41 and extract value from us, systems that the billionaires are blindly following, blindly digitizing when we're the only ones who are alive here. So find the other living people, find out what they need and be willing to share with them. And more importantly, be willing to let them share with you. I find that most of us are sort of more willing to lend something to a neighbor than to take something from a neighbor. Because once you take something, you're obligated. That sense of obligation is community. That's what knits us together.
Starting point is 01:05:11 So just meet people. Take one day a week. We used to call it Sabbath. Take one day a week where you don't drive. You don't buy. You don't sell. Walk around and meet people. See who your neighbors are because those are the people in times of peril who you are going to be depending on.
Starting point is 01:05:28 Well, Douglas, we've covered a lot here today, and if there was a core thing that you wanted the reader, or the listener of this podcast, to take away from this book, what would it be? The world doesn't have to end, and we are all going to be better off. Rather than looking for how to insulate yourself from what's going on in the world, step outside, and let's roll up our sleeves and make the world a place that we don't need to hide from. Okay, and there are a bunch of different ways to reach you. I will put many of them in the show notes, but I was hoping, in case the audience doesn't read the show notes, you could give just a few for them. Oh, go to rushkoff.com, my website, or look for Team Human. Team Human is my podcast, the book before this one, and a community of people who are trying to put passion into practice,
Starting point is 01:06:23 I would say. Yeah, I would also tell them to go to Medium, where I first read this article, and follow him on there. And get the book, Survival of the Richest: Escape Fantasies of the Tech Billionaires. And let's just plug your podcast a little bit more. Tell the audience a little bit about Team Human
Starting point is 01:06:41 and the type of show you do. And I started it because I felt like I made it. I'm cooked. I've had 20 books. People know who I am. I'm doing well. So I will do a podcast that platforms others that deserve an audience. And that's sort of the way it started.
Starting point is 01:06:58 And then it became more, I mean, it's still that, but now it's a way of modeling live human interaction. So I'm not really interviewing people. It's more like what we did. It's more like a conversation between people. I'm trying to model intimacy, because that's the key. Everybody's gotten so afraid of each other. Team Human is a weekly conversation with someone else where we just bare our hearts and souls to each other and connect on a level, and hopefully engender a spirit of connection and contact for everyone who listens to it. Well, Douglas, thank you so much for being
Starting point is 01:07:35 here today and for joining us on the Passion Struck podcast. It was a little bit different from the typical interviews we do, but I think one of the most important things we podcasters need to do is give voice to a myriad of different topics so that people can hear all angles of the discussion. So thank you very much for being here. Oh, thank you. And thank you for what you do. I completely appreciate that. Thank you so much. I thoroughly enjoyed that interview with Douglas Rushkoff. What an interesting talk that was. And I wanted to thank Douglas and W. W. Norton for the honor and privilege of interviewing him here today on the show. Links to all things Douglas will be in the
Starting point is 01:08:13 show notes at passionstruck.com. Please use our website links if you buy any of the books from the guests we feature here on the podcast. All those proceeds go to supporting this show and making it free for our listeners. Advertiser deals and discount codes are in one convenient place at passionstruck.com/deals. Videos are on YouTube at JohnRMiles, where we now have over 400. Please go there and subscribe. I'm at JohnRMiles on both Instagram and Twitter, and you can also find me on LinkedIn. Then, if you want to know how I book amazing guests like Douglas, it's because of my network. Go out there and build yours before you need it.
Starting point is 01:08:46 You're about to hear a preview of the Passion Struck podcast interview I did with Harvard professor Max Bazerman and UC Berkeley professor Don Moore, where we discuss their newly launched book, Decision Leadership. We talk about organizations as decision factories, and how, as more of the effortful labor gets delegated to automated systems, the decisions of the humans in the organization, and the people affected by the organization's operations, all depend on the effectiveness of the decisions of the people inside. The fee for this show is that you share it with family and friends when you find something interesting or useful that you can apply in your life. If you know someone who's interested in exploring the humanity of people, definitely share this episode with them. The greatest compliment
Starting point is 01:09:44 that you can give this show is when you share it with those that you care about. In the meantime, do your best to apply what you hear on the show so that you can live what you listen. And until next time, live life passion struck.
