Modern Wisdom - #557 - Douglas Rushkoff - How Billionaires Are Preparing For Doomsday

Episode Date: November 26, 2022

Douglas Rushkoff is a media theorist, futurist, writer and documentarian.

The world's richest people are preparing for the end. They're buying up farming land and below-ground bunkers in New Zealand. What catastrophe is it that they think is about to happen? And what happens if technology creates a techno-hell instead of a techno-utopia?

Expect to learn what happens when you're invited to brief billionaires on the future downfall of the earth, whether having your own sovereign nation at sea is a realistic idea, whether it's possible to live in an underground missile silo with an indoor infinity pool, what mindset is driving this techno-paranoia and much more...

Sponsors:
Get the Whoop 4.0 for free and get your first month for free at http://join.whoop.com/modernwisdom (discount automatically applied)
Get a Free Sample Pack of all LMNT Flavours at https://www.drinklmnt.com/modernwisdom (discount automatically applied)
Get 20% discount & free shipping on the best Ketone Drink at https://ketone-iq.com/ (use code MW20)

Extra Stuff:
Buy Survival Of The Richest - https://amzn.to/3XzAdRq
Follow Douglas on Twitter - https://twitter.com/rushkoff
Get my free Reading List of 100 books to read before you die → https://chriswillx.com/books/
To support me on Patreon (thank you): https://www.patreon.com/modernwisdom

Get in touch:
Instagram: https://www.instagram.com/chriswillx
Twitter: https://www.twitter.com/chriswillx
YouTube: https://www.youtube.com/modernwisdompodcast
Email: https://chriswillx.com/contact/

Learn more about your ad choices. Visit megaphone.fm/adchoices

Transcript
Starting point is 00:00:00 Hello friends, welcome back to the show. My guest today is Douglas Rushkoff. He's a media theorist, futurist, writer and a documentarian. The world's richest people are preparing for the end. They're buying up farming land and below-ground bunkers in New Zealand, but what catastrophe is it that they think is about to happen? And what happens if technology creates a techno-hell instead of a techno-utopia? Expect to learn what happens when you're invited to brief billionaires on the future
Starting point is 00:00:30 downfall of the earth. Whether having your own sovereign nation at sea is a realistic idea. Whether it's possible to live in an underground missile silo with an indoor infinity pool. What mindset is driving this techno-paranoia, and much more. But now, ladies and gentlemen, please welcome Douglas Rushkoff. It seems to me, the kind of inevitability of it, I've had a lot of conversations on the show about existential risk, talking about the work of people like Toby Ord and Nick Bostrom: are we going to get turned into grey goo or a big bunch of paper clips, you know, this small background risk of a gamma-ray burst destroying us all, or are the poles going to flip and then all of the
Starting point is 00:01:31 technology is going to fail. It seems unsurprising to me that the people who have the absolute most resources on the planet, who are exposed to this kind of thinking, who are probably technologists in one form or another, techno-utopians, would take that, run with it, and think, I essentially have an unlimited number of resources, how can I try and insulate myself from this problem? That to me doesn't seem very surprising at all. No, but I wonder what the order of events really is. Is it, oh, I've got all these resources, so I have a lot of stuff that's at risk and I want to spend a lot of my energy protecting myself, or is it, I've achieved
Starting point is 00:02:15 my great wealth and all my technological monopolies and stuff by treating other people and places as if the world is already ending. So I'd really like to have some evidence that the world is ending, so I can justify having essentially used a bomb shelter mentality all along. How would you say millionaires, or billionaires, have treated the world as if it was already ending, from before now? Well, I mean, there's so many examples. Like one I just heard was in a piece that Cory Doctorow, a friend of mine, wrote about Epson printers. And apparently a certain Epson printer is pre-programmed to brick itself after a certain number of pages. And he interviewed the company to find out why, and they justified it by saying there's a tiny little sponge
Starting point is 00:03:11 somewhere in this printer that absorbs the loose dust. And after however many, five or ten thousand, pages, they believe that that little sponge might be filled up and some of the dust might sprinkle out onto the filing cabinet or a piece of paper, or wherever you put the printer. And to prevent that disaster from happening, and because you can't replace that two-cent sponge, I mean my god, that would be an impossible feat of administration, they will save us all by breaking the printer and forcing you to throw it into a waste pile somewhere, you know, a big toxic waste pile that some Brazilian children pick at to find the, you know,
Starting point is 00:03:52 rare earth metals in it, and then send some other children to get the rare earth metals out of a mine to build a new printer for you. Now the guy at Epson who makes that decision is not stupid. He knows, I am contributing to climate change, I'm contributing to the end of the world by doing this. But I'm going to make enough margin selling an extra few printers that I'm going to be able to distance myself from the reality I'm creating by earning money in this way. So there's this series of decisions that are made. Or the guy, the guy I spoke to at one of those
Starting point is 00:04:26 food camps, who was one of the guys who put the algorithms in the social media feeds that addict teenagers to, you know, clicking and whatever and worrying about themselves. He doesn't let his kids touch this stuff. He's got a private, you know, organic farm with a goat share. His kid goes to a Rudolf Steiner school and is not allowed to touch anything. So he's already got a kind of organic farm bunker in the middle of nowhere, and he's behaving as if everyone else's kids are the ones who are being left behind. So if you've got an apocalypse coming, you can kind of say, okay, that's why I'm doing it this way.
Starting point is 00:05:05 Okay, so you're saying that a large apocalypse justifies this sort of zero-sum, me and mine versus fuck everybody else mentality; that sort of mentality is born out of that. On the one hand, yeah. And I guess on the other hand, what I'm looking at is, is this just a sort of innate techno-geek-nerd fear, and I share it sometimes too, of women and nature and unpredictable
Starting point is 00:05:34 stuff. And a lot of us who got involved in tech from the very beginning liked it because it kind of creates a bit of order. I remember when I was with Timothy Leary when he was reading Media Lab, the book by Stewart Brand about Nicholas Negroponte's lab. Oh, MIT's got this big thing called the Media Lab. It was really the first place where they were building all the computers, the first place to look at digital. Marvin Minsky was there doing AI, and Stewart Brand wrote this book about the Media Lab. And I was with Timothy Leary when he was reading it.
Starting point is 00:06:06 He's circling everything with Flair pens and I'm thinking, oh, he loves all this talk about the digital future and anti-aliasing and robot consciousness and all. And when he's done with the book, he goes, blah! And he throws it across the room, and he goes, first, only 2% of the names in the index
Starting point is 00:06:23 are women. That's how you know they've got a problem. And then he said, in second, these guys are trying to recreate the womb. Their mothers were unable to anticipate their every need. And now they want to build this technological bubble where they can sit and have algorithms and robots bring to them what they want before they even know they want it. And so there's that kind of fantasy that goes along with the perfect tech solution that again, an apocalypse is a really good justification
Starting point is 00:06:54 for wanting to build that private sanctuary. It's kind of like permanently existing in a science fiction novel that has probably inspired a ton of the work that you've done. Talk to me about what spending time with Timothy Leary was like. It was great and a little bit horrible at the same time. Being with Timothy Leary was kind of like being on acid. Kind of like being on acid, or it was being on acid? Well, yeah, it was being on acid.
Starting point is 00:07:30 Sometimes it was being on acid, but sometimes it was just like being on acid. So it was a beautiful zone in some ways, but it was unforgiving. It was like he would he would play kind of dominant psychologist with you all the time. Youlecting back to you, whatever he saw as a neurosis or an impediment to whatever you were, just say it. What? Come on. And he's like, geez, just can't you chill. So the social moors, the sort of typical things that people would be allowed to just let slide.
Starting point is 00:08:03 Timothy would decide to point the finger at it and call out the elephant in the room. Right. So it was sometimes taxing. I have some friends that are like that. Yeah, I get it. No compromise. Especially, you know, by the time I knew him, he was already in his 60s and probably thinking, I don't have time for any of this. And if he did treat you like that, it's because he considered you a friend. You know, a lot of times famous people would come into the house, like an Oliver Stone or Liza Minnelli or someone, and do stuff that was obviously kind of silly or
Starting point is 00:08:43 egotistical or something. And he wouldn't say anything till they were gone because he didn't see. Also, because they're high status people, you don't just tell Oliver Stone. I don't want to do your friggin ayahuasca that you brought in from South America in a bottle. I don't know what that is. You know, it was sort of that funny. Okay. So when it comes to billionaires and how they've been preparing, I mean, I've seen an absolute ton of articles over the last five years, I would say mostly, the farmland that's being purchased in middle America. New Zealand seems to be one of the most popular places. Sea-steading, these opportunities for people to air gap themselves from society in one form
Starting point is 00:09:27 or another. Obviously we've potentially got a multi-planetary species not a million years away from happening. How did you get introduced to thinking about this seriously? I mean, I don't know if seriously is the right word, but certainly thinking about it. It was this talk that I was supposed to give to some wealthy tech investors. They hired me, like they hire people like us, basically. They think we understand something about the future, because we said something they didn't
Starting point is 00:09:56 understand once, they go, oh, that must be a future. So they hired me to talk about the digital future out in the middle of the desert. You know, it's a big money thing. So I went, you know, I subsidized my thinking with whatever that is. These, these kind of hired intellectual dominatrix sessions. Let's bring in an arcosidicalist to punish us for our wealth. I love the idea of you standing there full leather outfit with a whip and everything, but it's a what? Instead of you hitting anyone, it's just you gesturing with a whiteboard behind you. I could see that.
Starting point is 00:10:32 Yeah, really. Now, by PhD is my whip, right? Yeah. But I'm not there to do this thing, and instead of bringing me to do the talk or making me up, they bring these five guys into the green room, and they sit around this little table and start peppering me with all these questions about the future. When it started out, the regular binary tech investor questions, Bitcoin, Ethereum, virtual reality, augmented reality, and I'm not the person to ask that either.
Starting point is 00:10:57 I would have said Betamax instead of VHS. I would have said CompuServe instead of AOL. I'm always wrong because I pick the better thing, not the one that's going to win, right? MySpace, whatever. MySpace is better, it gives us more, we can pick our own backgrounds. Or HTML, I would have said, I don't need friggin' WordPress.
Starting point is 00:11:18 So I'm always wrong with that. I'm right about the big picture, but always wrong about which company to bet on. But anyway, so they're doing that with me. And then finally, one of the guys says, so, Alaska or New Zealand? And then the rest of the time was them peppering me with questions and asking me about, you know, how to survive. And I was really shocked and started asking them more questions than they were asking me, like, all right, so how are you going to deal
Starting point is 00:11:43 with, you know, germs and where are you going to get your water and are you going to have your own fresh water supply? You're going to be depending on underground water that's being contaminated by whatever happened in the outside world. How are you going to guard yourselves? And then they said that they hired Navy SEALS to come fly in. And it's interesting because it's always Navy SEALS that say nobody picks Army Rangers. And it's like, for my movie experience, I like Army Rangers,
Starting point is 00:12:05 especially if you're on the ground, if you're not sea-stepping, I want Army guys, right? They're great. That's something that feels more, I don't know. I like them. I like their media image. Their movie image better than Navy SEALS for me, terms of my protection, but all of them had Navy SEALS.
Starting point is 00:12:22 I don't know if they had the same ones, double contracting, all contracted to fly out and like helicopters to come to their thing at a moment's notice. And then I'll have a look at you guys. Oh, so these guys, these Navy SEALs or whatever on hair trigger alert to be wheels up within 90 minutes if the, if the new, if the new alarm goes off or something. Yeah. One of them has people at his plane all the time ready, you know,
Starting point is 00:12:45 if you go to go yeah to one of two or three different locations Like jeez and then but the thing of that I asked them was Why do you think your Navy SEALs are gonna protect you once the event is what they called it once the event happens You know here you're you're you're your money is going to be worthless at that point even if it's Bitcoin So why are they not just going to take over your Devin G. Lightwred Machiavelli 101? Like, how do you take care of you? And they hadn't really considered that. Then they start, once they start musing on what they could do, like, oh, I could be the
Starting point is 00:13:17 only one who knows the combination to the safe, you know, where the food is kept. It's like, oh, so Navy's he'll never try to get secrets I'm not anymore before now would really stump them you're right or they was talking about implants That would be controlling who gets to go where in the compound and worst case they could be used for discipline It's like oh right Navy's he was gonna really respond well to being dissuade to the equivalent of shock collars so it was a it was just ludicrous so for me it just showed the equivalent of shot collars. So it was just ludicrous. So for me, it just showed the sort of almost middle school science fiction logic of their scenarios. And then it made me question the sort of the visions of all of the tech bros, these kind of, you know, whatever they are, these kind of methamodern, blue sky, game B, you know, all these theories about kind of rising from the chrysalis of
Starting point is 00:14:13 matter is pure consciousness or, you know, adapting to this great metaphorical, future thing. And I started to see them all in the same bucket as people who were, you know, for time, for centuries, afraid of women, afraid of nature, afraid of the unpredictability of real life, and actively fantasizing about the opportunity to escape, to level up, and to go, as Peter Tiel would say, from zero to one, or Zuckerberg, to go meta, or Kurzweil, to want to put his brain up in Silicon. Everyone wants to go up. And it's like, I don't, it's not a theory of change that I can really take that seriously.
Starting point is 00:15:02 So, do you think that some of these guys secretly are hoping, waiting, almost praying for this to happen in a way? Does it justify their worries and concerns? Does it offer them a better kind of life in some ways? Well, I think they're impatient for it to happen partly because they've been raised on Western narrative, you know, on Marvel movies. You've got to have your end game, right? Or it doesn't, what did I pay my money for if I don't get the end game?
Starting point is 00:15:29 If I don't get the thing, the moment partly it's because they've got a kind of techno solutionist bias where they're used to just just reboot or, you know, a version two, version three, you know, let's go from web one to web two. Let's go from web one to web two. Let's go from web two to web three They just want to kind of that that that urge to kind of reboot up there and And yeah, it's it's a a Way to Realize that it's I mean, I think a Steve Banan kind of the same way although for different very different reasons
Starting point is 00:16:05 You know I mean, I think a Steve Bannon kind of the same way, although for different, very different reasons, you know, tear this thing down, this thing is corrupt. I mean, he's scared of these. He's the one of the only ones who takes all these teal and musk and the technocrats and great reset people. He's the only one who takes them really seriously. And you hear it, you know, that, oh no, these techno guys and Epstein and implants and gates with nanobots and all they believe these technological systems are really coming and that we, the people through
Starting point is 00:16:33 blood and soil are going to fight that thing. But it's that same, what they share in common is this urge to tear this thing down. When I see the way Musk relates to Biden and government and all that stuff, it's like, oh, come on, we don't need that. We have a blockchain that or a blue sky or a program and we could be free to be you and me in a totally different fractal that you, you know, old woke Democrats can't understand. Let's just do it. Pedal to the metal, you know, That's sort of that drive. What about sea studying?
Starting point is 00:17:09 Did you have a look at that? Have you spent much time researching this? Under what end it be kind of fun? Let's go to the sea. If they build my own little raft with solar powered propulsion and eat fish and grow out falfa or whatever and decalinate. What's alfalfa?
Starting point is 00:17:27 You know, it's like you put it on, it's a salad green or something. I think whatever. One of those three. I thought it was a type of algae. Okay. Cool. Yeah, or you can do algae, web even better, right? Like blue-green algae or what's that stuff that people eat?
Starting point is 00:17:42 I forgot what it's called now, but there's all those though. Yeah. We could make that stuff that people eat? I forgot what it's called now, but there's all those though. Yeah, we could make that stuff. I mean, the real fantasy though is for frictionless community that I can float my sort of hexagonal raft to the nation of my choice and attach to it. And now I'm in this nation of people all in their raft and we have our own system of government and we approve cloning, but don't approve abortion and we do approve this and that and these
Starting point is 00:18:11 are our rules, and we have a blockchain, we have our own currency and all that. And then if this country starts to have a rule or something I don't like, I detach my raft and bzzzz, you know, solar-power float to that country over there, they've got the rules that I want. So it's that idea of a nation with no skin in the game, with no exit cost. You know, it's like you could change your nation as easily as you could change your cell phone company. And I don't think real community works that way. I think you've gotta be with people who believe stuff you don't like, and there's some rules that you disagree with. Was part of that not the justification for the way that the United States was put together, though? That you basically do have 50 kind of small countries all put together. Yes,
Starting point is 00:18:59 you're sharing a currency and a language, but broadly it is, you know, you don't like it here, you can move there. What's wrong with somebody who wants to just up and leave when stuff doesn't go the way they like? Oh, any federation is like that. It's certainly fine for my master done servers, you know, you're on one and talking and you're like, ah, I don't want to be on this one, I'm going to go over on that one, but they're all federated so that you can still follow anybody in any one of them. No, I mean, and certainly in a digital realm or in a brand loyalty, it makes total sense as a way of of fostering long term difficult relationships between people over time. I don't know. I don't I don't have a problem with frictionless. If people want to change their town every week and move around, it's just what is it biased toward? So when I look at the plan, when I hear them describing it that you've
Starting point is 00:19:55 you don't like this, you'll just float away. It's a free market version of civics. And I get it and there's cool things about it, but to me it betrays what it's all really about. It's, I don't like the rules on your country, which they don't, right? We can't just develop tech. I'm not allowed to do all the cloning I want. And there's these regulations about genetic engineering. So I'm going to take my ball, my dollars, and go and make my own frigging country out in the middle of the ocean.
Starting point is 00:20:27 Yeah, yeah, yeah, yeah, you can't do anything to me. And I have to do it out in the ocean because I don't yet have enough money to get my rocket ship, my blue origin, up to Mars and terraform that place. But when I can, I will, you know, see ya. I understand the place that this comes from, I think. You know, like personal sovereignty, individual agency,
Starting point is 00:20:48 all of these things are things that almost everybody that I respect is pushing forward at the moment. It's the antithesis of the victim had narrative at the moment. It's taking control and responsibility for the things that you do. I think that's pretty much a universal good. Yeah.
Starting point is 00:21:03 I also understand what happens when you take that too far and it encourages people to not properly integrate into the local group that's around them. Like, you know, you could say, and I see this particularly amongst dads, particularly particularly amongst wealthy dads that are maybe late 40s to 50s, they have spent a long time building companies and acquiring wealth. And then they get to this stage and
Starting point is 00:21:33 there's something about it where they seem to kind of say, fuck you a little bit to the rest of the world. And it's like a retreat in a way. It's more individualistic. It's more atomized. It's like a retreat in a way. It's more individualistic. It's more atomized. It's like me and my family and fuck everybody and everything else. And I do, I do see this. So I do understand what happens if you take that personal sovereignty thing and you roll it forward across many tens of millions of dollars and a lot of
Starting point is 00:22:04 resources and a lot of spare time and a family that you don't want to have heard. I understand how that can be about. Or a family you don't even want to be with anymore, for that matter. Or you want to agape yourself from everybody including your ex-wife, right? Yeah, or your current wife, yeah. But I'll tell you, you know, I haven't said this before and I'm in a lot of discussions about the origins of this and It's it's on the one hand it is and Trish traditionally I kind of blame it on Libertarianism and the West Coast
Starting point is 00:22:37 The West Coast kind of libertarianism that dovetailed so perfectly with techno solutionism It was John Barlow telling us, you so the Declaration of Independence of Cyberspace, nations of the world get off here. This is for us, and we didn't realize if you get government away, corporations grow, and roam free. So there is that narrative is an important one, and I don't minimize it, and I still feel like, oh, damn, when Wired came, and blew Monto 2000
Starting point is 00:23:04 out of the water, you know, when when the business guys came and turned the net from our great psychedelic rave space into this fucking techno capitalist nightmare. It's like, wait a minute. This wasn't, why are you taking all this stuff public? Why are we having IPOs? Why are we after eyeball hours and all that? So there's that.
Starting point is 00:23:21 But the thing I haven't said and it's probably too dangerous to say, and I might get canceled in whatever, but it's also an almost necessary, or it's an almost inevitable response to Marxism. And Hannah Arendt wrote about this in the banality of evil and in other places that Marxism not workers rights, but Marxism as a kind of a scientific version of workers rights has as a kind of an endgame this total equality. It's like, I understand we don't want anyone to be deathly poor that they're sick and can't have it, but there's poverty. There's inequality.
Starting point is 00:24:08 That's sort of what happens in a society. And this almost scientific desire to annihilate all forms of inequality is itself provocative to the self-sovereign individual. So, and you could look at it on the personal level. If you see, oh my God, everybody's getting canceled for everything here and there and there and there. And then I think back, you know, there was a girl in middle school.
Starting point is 00:24:36 I mean, before sex or anything like that, there was a girl in middle school that my friends convinced, I really had a crush on her. And kind of told her, then my friends convinced me I really had a crush on her and kind of told her, then my friends convinced me, no, no, it's not cool, you'll be going to be hated and not liked in whatever. And I ditched her for no reason. That, you know, I mean, that's as bad as I'm, I'm, I'm, there's all sorts of equivalents of that that a whole bunch of guys look at a cancellation
Starting point is 00:25:06 movement and think, someday it's going to be me. They're going to come for me. Or those of us who are involved in social justice understand that the social justice terms that we use in one year may be obsolete in the next year. And the video of us saying the thing that was appropriate in one year then becomes bad word to use the next year or the year after. So when people sense that that there's this moving target that they will never be able to meet and there's wiggle room. There's no place for reconciliation or apology or error. It does, it activates a reverse pole to something, you know, total self-sovereignty and anti-woke and fuck
Starting point is 00:26:01 that man that life. I want Andy Wocker. I want to use bad words and you look at a musk and it's on that level. It's refreshing. He's not, pre-canceled. And there's that energy behind a lot of these escape fantasies, I think, of like, wait a minute. Society's getting too woke. So when you hear, like, Jim Rutt, who's a really smart, he's a smart system student. He's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like, he's like,, of like, wait a minute, society's getting too woke. So when you hear, like Jim Rott, who's a really smart, he's a smart system theorist. He used to be the head of Santa Fe Institute. And he's the main guy behind something called Game B, which is a place where the ideas we're all living in, Game A now, which is regular competitive society, but we can elevate and go to Game B, which is the next phase of human existence. And he always talks about the obstacle to it as those woke people.
Starting point is 00:26:48 There's goddamn woke people. And so I understand a wokeism as an extension of Marxism and Marxism as a version of Marxism as this trigger for people to then do this other thing. Ah, okay. So the reason that Jim Rutt, yeah, the reason that Jim Rutt is saying that the difficulty in moving from gay may and ascending to gay B being the woke movement is not any of the first order effect of wokeism. It's the second order effect of the knee jerk reaction when the pendulum swings back against wokeism, which is hyper individualistic. It's atomized. It's doom or optimism. It's Elon Musk shitposting his way through AOC. It's QAnon. It's all of this
Starting point is 00:27:40 stuff. Right. Well, he might be, I don't know whether he's a first or second-order effect of it, but yeah, I think he's more in the camp of partly game B is to get away from this stuff. I mean, I think he's a first-order, he's in the first-order effect of it, but yeah, and it's tricky, and it's because, and again, this is really hard to say, because it makes it sound like there's an equivalence, and there's not. I don't mean to equate, right, and again, this is really hard to say because it makes it sound like there's an equivalence and there's not. I don't mean to equate right and left or Trump and AOC or anything like that. There are different degrees. There's way different levels of stuff.
Starting point is 00:28:14 I mean, I would waybath or, you know, be an AOC's country than Trump's country. But there's, the problem is not just Trump and his cronies. They are the figures, they are not the ground, they're the images, they're not the atmosphere. It's the atmosphere itself, it's as if the air we breathe has become totalitarian, has become that in this extremist thing, it's like an accelerant. Like we're not breathing air, we're breathing nitrous or something. You know, like nitrous does to an automobile, it's like, and everything is amplified and accelerated and extreme.
Starting point is 00:28:56 And I feel like any political movement, any cultural movement now, on whatever side of the political spectrum it is, ends up kind of revved up partly by the digital media environment that we're in and partly by whatever this thing is going on in the air that there are these currents that are moving things around. And then you've got guys like, you have some wizards out there. Like, Bannon is a wizard, musk is a wizard that make it look like they're the ones actually doing, making these currents, but they're not.
Starting point is 00:29:33 They're just seeing them and riding on them. I don't think Trump was, did anything. I think Trump was more like Charlie Sheen. There was this kind of standing wave of culture that was happening. And he just jumped in it. It was like, ah, you know, so we look at him as the picture, but, but it's not. It was he just he just he just embodied the phenomenon. It is very interesting to think about this, this knee jerk reaction. That's a model that I think I'm
Starting point is 00:29:59 going to try and try and use as I continue forward. I mean, when you consider the prepping movement's been around for a long time, right? You know, people predicting the end of the world and trying to use as I continue forward. I mean, when you consider the prepping movement has been around for a long time, right? You know, people predicting the end of the world and trying to avoid alien abduction or whatever it might be. And then, doomer optimism, which I'm not sure if you're familiar with, is a, that's the middle ground version of this.
Starting point is 00:30:23 And then the end game is the billionaires. And I had a look at, does that company called VIVOS? They sell luxury underground apartments in converted cold warm, unitioned storage facilities and missile silos. So that's taking it up to absolute extreme. Yeah, so it's to plow shares, baby. It's almost biblical, right? Except it's not really plow shares. It's so it's to plow shares, baby. It's almost been look all right except it's not really plow shares It's so it's to shelters
Starting point is 00:30:48 Yeah, I mean they sell them. There's a the ones I like are in Europe They're called like O'Pitym or something those look nice. They have swimming pools and Saunas and things you know wouldn't go there I Yeah, do I have to wait until the end of the world or is there an opportunity for me to just go now? I know that's the thing can't we just go now, but not have to wait until the end of the world or is there an opportunity for me to just go now? I know, that's the thing, can't we just go now, but not have to do it underground?
Starting point is 00:31:09 So one of the guys, yeah, there was one, one of them, and it's interesting, because one of them was from a very Christian company called, I forgot what, Rising S Corporation, and they build, they take shipping containers and stick them under the ground originally for people for like tornado shelters and fallout shelters and things like that.
Starting point is 00:31:28 Then when the rich people got into the apocalypse bus, they're like, oh, there's a good business. So instead of just putting one train car down there, you know, they'll put like 12 shipping containers all network together and they've got pools. They've got heated pools and fake daylight. And I'm, one of that was one of the guys I was talking to was doing one of those.
Starting point is 00:31:48 And I was like, dude, it's like my neighbor has a pool, not right down there. They have a pool, and there's always a truck in front of there, bringing some other part or gizmo that broke, like a heater or a regulator, you know, a thermostat thing, like where are you going to get that? Where's that guy in your plan? Here's all the, you have a 3D printer making the parts for your pool. You know, can they do that yet?
Starting point is 00:32:12 Of course not. And he's like, oh, he opened his little bowl skin buck. He's like, parts for pool. Like, okay, you do really work this out, buddy. That's funny. Oh yeah, so what what looking at what you know about the current approach that the super rich have when it comes to saving themselves from the apocalypse that's coming, what are the most fragile elements of their preparation?
Starting point is 00:32:41 Security definitely. You know, they, they, any of them that are land based, we can go take them over. There's more of us, even if, you know, a thousand of us get shot on our way in. Plus, if we've got motorcycle gangs with oozees or whatever they have, I think these are very peniturable. These are peniturable. There's some of them have farms and shit. So there's that. The ones that are totally locked away their agriculture systems are
Starting point is 00:33:11 totally self-contained. A lot of them have like their topsoil is in these little rubber tubes and then you try to grow. I mean, have you ever seen somebody doing vertical farming at home or whatever? You get a bad batch. It's like you get a bad panel and at home or whatever, you get a bad batch. It's like, you get a bad panel and you just take it out and go get some top soil and build another one. You can't do that when you're locked away. What do you do? You go out with a crew with a couple of guys with machine guns to run and find some more sterile top soil. But of course, you got the nuclear fallout and the zombies and the killer bees and whatever it is out there that you're
Starting point is 00:33:46 supposed to be hiding from. So the self-contained universe, the thing that failed with biosphere will fail with a lot of these because they think everything's going to regenerate. So those are those and they're not really sealed from germs. And I mean, COVID got everywhere somehow, eventually somebody's bringing something or a bird's going to poop and you're going to catch whatever people have. As I see it, these things are, they're so brittle. They're brittle from the get go.
Starting point is 00:34:23 They're brittle. They're a brittle. They're brittle from the get go. They're brittle. They're a brittle approach to survival. If you talk to a real prepper and I have smart preppers, they're prepared news plans, always involve the communities in which they're embedded. They prep by teaching people in their communities how to prep. They do foraging classes and farming classes and self-defense classes because they realize we the only way to be prepared is to be prepared together not prepared alone. The loan prepper does not does not survive. How comfortable or uncomfortable is it for you spending time around people that have both
Starting point is 00:34:59 the desire, means and resources to be able to enact something like this? I had a conversation a little while ago with a friend and he mentioned that he'd spoken to someone that's maybe in a similar position to the gentleman that had paid you to go and teach them how to survive the end of the world. And this guy said, I'm an apex predator. Apex predators don't need to concern themselves about what happens to their prey. And this guy, as far as my friend was concerned, was like, he meant it. Apex Predators don't need to concern themselves about what happens to their prey. And this guy, as far as my friend was concerned, was like, he meant it.
Starting point is 00:35:29 Like he genuinely believes that he is close to the absolute top of the tree of this entire planet and that the externalities are kind of the same as stepping on a bug. And that story stuck with me, that was like two years ago that I heard that story. It really stuck with me because I think, wow, there's people out there that have that kind of mindset and that have the resources to be able to live the philosophy that that mindset creates. Yeah, I mean, that's the mindset
Starting point is 00:35:56 that I'm really writing about in my book. I mean, that's really what it's about. Most of the people wanna know, oh, you wrote this, how do I survive? What are your tips and techniques? I'm like, no, no, I'm kind of trying to undermine. It's called escape fantasies of the tech billionaires. But you're right.
Starting point is 00:36:12 It comes from, and that's why I look at really two strains. One is a certain thread of the scientific revolution, which was Francis Bacon. And in Francis Bacon was sort of the father of empirical science, and he's credited with saying that the purpose of empirical science, the promise is that it will allow us to take nature by the for lock, hold her down, and submit her to our will. Okay, great.
Starting point is 00:36:41 So science is basically a rape fantasy, right? We're going to take nature by the hair, hold her down, and have our way with her. That's what science will let us do. Again, it's that apex predators understanding, yes, you are man. You are man. So we can take, we are in charge of nature. We can dominate nature. And as any truly enlightened person realizes, that's not even the way you get power,
Starting point is 00:37:05 much less the way you survive. If you don't dominate nature, you learn to work with the patterns of nature, the patterns of your body. You know, some of the stuff that you've talked about too, about the natural cycles of waking and sleeping, and if you learn the patterns, you ride them, and you become strong and healthy and all that, as opposed to trying to defeat the patterns of nature with what was speed and sleeping pills and prosack and you know, because you can't recognize you because you refuse to submit to the day and night. It's seasons.
Starting point is 00:37:38 Sorry, there's seasons. You can't change them. Go with it. Just go with it. It's okay. It's going to be okay. Something will come out again. So there's that. And then the capitalism is the other one. We know you talked about externalities. And externalities are built into colonial capitalism
Starting point is 00:37:57 in order for the markets to grow, which they have to grow because we're on this central currency, interest-bearing economic operating system. In order for it to grow, we're on this central currency interest bearing economic operating system. In order for it to grow, we got to take over more places and slave their people and extract their resources. And those are externalities. The enslaved people, the pollution, but if we have technology, which is the latest part of this puzzle, we can build a car that goes fast enough
Starting point is 00:38:21 to escape from its own exhaust. There will be externalities, but those are other people's problems. And if it's too much, if the whole world gets filled up with the exhaust, then well, then we go. We get blue origin, and we go to the next one. I remember having a conversation, ages ago, with an astrophysicist that's specialized in detecting alien civilizations. So there's seti, and then there's meti as well, and meti is messaging extraterrestrial intelligence, and there's a lot of
Starting point is 00:38:53 big questions being asked around whether meti is even a good idea in the first place. And I asked what are some of the most likely signs that other civilizations would have, because some would be underwater, some would be a silicon-based, some might have thoughts that take decades to go through, et cetera, et cetera. And he said that global warming, that the use of any type of energy is inevitably going to cause kick out of pollution
Starting point is 00:39:23 into the environment, which has to change the environment. There are only a small number of ways that you can do stuff like smelting, like generating any kind of energy and the externalities that you get from that. And it made me think like, maybe it is the case that we can't techno utopia our way out of this.
Starting point is 00:39:42 I have some friends that believe that the solution to the side effects of technology is more technology. I know that you talk about this, but I know increasingly, I know, kind of a little bit less certain about that. I know, and then it's funny I was talking with Tyson Junca Porta, who is this really smart Australian, indigenous person and scholar, and he says he thinks that Western civilization is in a state of
Starting point is 00:40:13 depression after having seen the movie Avatar. Because Avatar is the opposite, right? Avatar is a civilization that is basically expending no energy beyond your sort of what they eat and you know beyond the their metabolism of their bodies you know in harmony with nature. And it's this picture of something that we could I can't imagine how we could you know go forward to that. I mean, I was gonna say get back to that, even if you can't go back ever.
Starting point is 00:40:48 But how could we move forward? And I do believe that the most sustainable and most fun society that we could live in would look a bit like a permaculture farm or something. And there would be less tech. I mean, I'm not anti-tech. I love tech and it's been fun. It's been fun, but I love nature and people and flash and stuff more.
Starting point is 00:41:18 I don't know. If we could de-growth is a bad word, I won't say that, if we could sort of unwind, simplify, simplify. Yeah, there's nicer ways of saying it. We would, if we move to a situation where we don't need to, to have the GDP grow every year, or we don't need to do more just for the sake of maintaining the balance sheet, but we only did more that we needed to do more in order to feed and clothe and care for other people.
Starting point is 00:41:55 We could get rid of a lot of unnecessary stuff, all the plastic that we're buying at Walmart and throwing away. And you know, I get it. It's good for the economy that if everyone on the block has a lawn mower, it's better for the economy than if one person has a lawn mower that we all share. But it's a nicer society. If we have one lawn mower on the block and everyone shares it.
Starting point is 00:42:18 Yeah. So there's the ego, which is the embedded growth obligation, right? Which is what you're talking about. The GDP needs to continue each year, et cetera. Yeah. It feels to me like there's an eco as well, which is the embedded growth obligation, right, which is what you're talking about. The GDP needs to continue each year, et cetera. It feels to me like there's an eco as well, which is an embedded comfort obligation that humans never want to regress back to a less comfortable type of life
Starting point is 00:42:36 than the one that they're in at the moment. Yeah, and then we got to also then think about how are we defining comfort. I guess it's more comfortable to sit in this Costco chair than on a rock, right? but there's I don't know there's there's comfort
Starting point is 00:42:56 There's many kinds of comfort. Well, I mean less less comfort would come from not having spinal issues You know in your 50s 60s and 70s, because you've not spent a ton of time sat down. It would come from suffering less with diabetes because you were moving around a lot more. The problem that you have is that what is good for you in the long term often feels difficult in the short term.
Starting point is 00:43:16 So you'd switch out the C in eco comfort for convenience, you know, and better convenience obligation. Like why is it that I should walk to the store to get my 10,000 steps in, and, and, and, and, and, and, and, and, and, and, and, and and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, and, to buy a $5,000 piece of kit so that I can find discomfort. I elect to find discomfort and inject it into my daily life because my natural existence is so bereft of anything that looks even remotely difficult. I have to go out of my way to try and find podcasts or books that are going to mentally tax me. I have to go out of my way to go into a building which has a selection of handles with weights attached to them because normally
Starting point is 00:44:07 I don't have to pick anything up that's heavier than a sandwich. Like I am having to artificially inseminate so much discomfort into my life because the eco that embedded convenience obligation continues to get more and more and more convenient over time and humans often mistake a convenient or enjoyable experience
Starting point is 00:44:24 for a worthwhile one. And until I don't think that that's fixable, I simply don't think it's fixable to be able to get humans to look past. This is difficult now, but good in the long term. Even if you've given it to our caveman ancestors, even if you've gone within one generation, you can go from rocks and fire and no wheel
Starting point is 00:44:44 to all of the stuff that we've got now, even they wouldn't have wanted to go back because it is a race to the bottom of the convenience stem. Right, although some civilizations didn't, you know, they lasted a long time in a sort of homeostasis, it has happened. Would they have ever got, they wouldn't have ever stepped into that technological advancement and then regressed back, right? They simply wouldn't have made a step forward. Right. But of course, because they didn't move into the technological advancement, they got
Starting point is 00:45:10 clobbered by colonial civilizations that did, right? See ya. Sorry. If one civilization is putting 50% of its effort into munitions, then the one that's, you know, not putting any into it is going to get killed pretty fast. No, I hear you. The other thing is that technology accelerates that process. When Kevin Kelly or any of the technology theorists talk about tech, they always say technology
Starting point is 00:45:36 gives us more choice. It gives us choice, but we haven't yet developed the capacity to make wise choices. We make, like you're saying, the short term convenient choice. And then on the other end, again, because of this totalitarian environment, on the other end of the spectrum, the so-called long term thinkers are thinking about the civilization of 40 trillion bots spread out through the universe rather than the 800 whatever 8 billion people alive today. So they go so far on the other extreme that, wait a minute, what about just sort of medium-long term then? The next 500 or a thousand years. Let's look at that.
Starting point is 00:46:20 It's a difficult one man. It really is. He talked to me about you had this story about a banker that went to burning man and came back and had seen a different side of the world. Yeah, well, I get invited to these things. They always have the word human in them, you know, you know, festival of the new humanity or, you know, the next level of humans and whatever. I got invited to one of them, and I don't know why I finally agreed to go to this one. And it was a thing where they go and you got like some shaman gets paid to come and chant a bit and you do guided visualizations and it's all like hedge fund dudes and bankers and whatever that have gone to Burning Man,
Starting point is 00:47:05 done mushrooms or LSD or IOSCA for the first time. And then they see the light, right? Oh my God, I'm just trying to plan it. And they come back and then they wanna process it with the spiritual people. And the problem is they have the insight and then they decide, oh, I should be the leader of the climate change revolution.
Starting point is 00:47:27 As if there haven't been people, they don't want to join the existing climate change thing. I said to one of them who was like, oh, I've got to lead. I'm going to create a new organization that's going to aggregate the this and the that. I'm like, well, have you heard of extinction rebellion or the sun rise movement or nature conservancy? There's out there. So what? No, and if I haven't heard of them, how good could any of them be? It's like, well, you weren't frigging looking. I'm telling you, this is your first hour as a newly minted
Starting point is 00:47:56 climatologist. Maybe you shouldn't run the thing, but you see this all the time. You know, even bless his heart, the owner of the LA Dodgers or former owner of this guy started something called unfinished and project liberty. It's this giant, huge, zillion dollar foundation to develop a multi-racial, open, democratic, block chain society. And has this conference that he's doing in New York and this giant facility and invites everybody. It's like you don't have to put their CEO mentality.
Starting point is 00:48:34 So they want to run, they want to be in charge of the whole thing. So there's like all these kind of competing future of humanity efforts out there. Okay. So you're around the, I mean, to be, I think that not knowing about extinction rebellion is a good thing. As far as I can see, they glue themselves to the roads in the UK and stop people from getting to work and don't do very much else at the moment. That seems to be their primary objective. Well, they were based, trying to be based more on kind of the yellow jackets, you know,
Starting point is 00:49:03 citizen councils and all, but there's problems and not, but you should at least understand the existing territory. Yes. If you're going to step into a market, you'd usually do some research beforehand. The wood hubris keeps coming to mind a lot. Yeah. I get hell of a lot. That'd be the one.
Starting point is 00:49:20 That'd be the one. Yeah. That's why I mean, it's funny. And that's why I sound like I don't mean to be an apologist for right or left or anybody, but their hubris is almost more of a psychological problem than anything else. You know, I was like, I've read their philosophies and ideas and I know while there's some problems
Starting point is 00:49:44 and certain ones of them none of them would be that catastrophically dangerous if we weren't like in this frictionless new space of kind of digital media It's like everything's on ice you push in a direction it. What do you you know? Do you mean bring bring that down to earth a little bit for me? I feel like What do you mean bring bring that down to earth a little bit for me? I feel like Ideologies are more are more dangerous now because they're not tempered by community and the real world You know, so you build an ideology you're building it. Okay, so Twitter is gonna build blue sky and we'll have this space
Starting point is 00:50:27 It's not real. It's meta. It's in a different realm or because of the way things get accelerated and amplified in digital media. Someone can say something that, okay, let's take some time to unpack it. But before anybody's even taken a moment to unpack it, the guys canceled by this group and becomes the Messiah of that group. And it's like, no, wait a minute. Let's let's let this mature. Let's discuss this idea. Let's, you know, so there's no, I feel like there's no filter.
Starting point is 00:50:56 There's no governor. There's no, there's no way to kind of slow down or metabolize and work on ideas. They just come out like as if they're fully hatched. Yes. I suppose in that environment, good ideas and bad ideas have equal footing under the sun in terms of how quickly they can go viral.
Starting point is 00:51:18 In fact, some bad ideas. I know about extinction rebellion, gluing themselves to the roads. I'm sure they didn't glue, but they sat there. But I don't know. I don't know. Yeah, people in the UK and in France have glued themselves to the road. Because the reason for that is that the police have to come over
Starting point is 00:51:36 with anti-solvent stuff and they slowly peeled their hands off. My point being that there are other things that they've done, right? But what I know about, what my primary exposure to them is them being dickheads in the middle of a motorway in the UK. So it is just as easy for poor philosophies to grab a hold of the internet. In fact, it might even be more because outrage is so clickable. It might make it go even quicker. All right, so given all of this together, how has this informed the way that you move through the world? Because you've been talking and thinking and writing about technology for quite a while, how are you dealing with the crushing weight of existence itself and not losing your mind being exposed to all of these ideas. For me, by trying to slow down,
Starting point is 00:52:32 to try to have less hubris myself, to, I mean, I'm on sort of book tour or something now, so I guess I'm talking a bunch, but to sort of talk a little bit less, to adopt the comportment more of a kind of a country doctor, you know, I'm just a local country doctor looking at problems and offering ideas, but not, you know, grandstanding, great solutions, encouraging the multitude of local solutions rather than singular, giant, top-down things. It's the $100 million X prize, you know, geo-engineered,
Starting point is 00:53:17 let's put sulfur particles in the, whatever, in that atmosphere. But those are the things that I'm sort of although all of them are attempting because you just push a button. Let's see. I'm trying to engender a multitude of small solutions and give people more faith in there that doing something locally really does count, you know, and trying to resist scale. And for me, it's tricky because if I see scale as the problem, as in most cases, if scale is the problem, then how do I operate, as I go, whatever it is that I am, a writer and thinker, not at scale? And I don't think, compared to a lot of people, I don't really have scale. I mean, Gary, not at scale. And I don't think, I mean, compared to a lot of people,
Starting point is 00:54:06 I don't really have scale. I mean, Gary Vee has scale. I'm still, you know, in the Twitter verse, I'm a 60,000 person, which is nice, but not one of those people. So it's funny, on one hand, I, it's sweet that I worry about scale, operating at my scale.
Starting point is 00:54:24 I'm like, oh no, am I too big? Am I too powerful? You know, it's sweet that I worry about scale operating at my scale. I'm like, oh, no, am I too big? Am I too powerful? You know, it's like, okay. I'm glad that I'm worried about it. But when I look at an objective, I don't think that that's really my, that's going to be my great problem. But yeah, and to try to get, in person to person, person you know every founder
Starting point is 00:54:46 that i can convince that it's okay to end up with fifty million dollars instead of five billion dollars is a win it's why because if they don't feel obligated to make five or ten billion dollars with their company then they don't have to pivot to something awful and extractive
Starting point is 00:55:04 when you hear now the mark zookab, oh, I'm going to give back 95% of a money I made on Facebook. I was like, well, what if you had made Facebook 95% less manipulative and awful and extractive? You wouldn't have, the world would be different now. That kid, that poor kid, plucked out of his freshman year at Harvard, you know, by Peter Tiel or whoever it was, and you transfer his parental authority onto this venture capitalist and pivot's away from a, I just want to make a platform that's going to help nerds
Starting point is 00:55:34 like me get laid to this, you know, data, you know, a data mining nightmare. Boy, we'd be living, we could be living in a different world. I mean, we'll probably someone else would have done it or what would happen some other way, but. Yeah, it's scale, it's people being satisfied. It's just so hard. You know, again, it's sort of like your convenience thing.
Starting point is 00:55:59 Once you get it, why would you go back? You know, once you have a million and a half Instagram followers, why would you go back? You know, once you have, you know, a million and a half Instagram followers, why would you undo that intentionally? Well, you also see, I mean, Austin at the moment, and a lot of the people here are talking about home studying, they're talking about community housing in one form or another where they're gonna them and nine other families and their kids are going to go and
Starting point is 00:56:26 they've got a hundred acres of land out in lock heart and they're going to they've been learning about regenerative farming and they're such and such a certain person's wife is a teacher and this person is a doctor and you know, we're basically going to set ourselves up here and I understand the impetus because there is a direct line to draw from what you've just said there, which is kind of this more simple type of life where you are looking at slowing down. But it doesn't take much of a change in direction for that to fall into the very individualistic, me and mine versus the world mentality that we were just lambasting 10 minutes ago. Like, it's not that far away at all. Right. And then you got to look, I mean,
Starting point is 00:57:10 and while I certainly appreciate what those families want to do, I wonder, I guess not everyone can do that, right? There's not enough land for everyone to do that. We, and it's expensive. It's very, the other people I know that are doing that are pretty rich. Right, so they can go and build an eco-farm with solar panels and hydroponic blah, blah under the
Starting point is 00:57:27 ground and get their alpacas and goats and all the good stuff. Yeah. Exactly. It's tricky. It's like the guys I know in the tech world who still go home to their organic farms with the with the with the goats and Rudolph Steiner tutors and whatever. Yeah. I certainly understand the impetus too, but that's
Starting point is 00:57:47 almost a heavy lift. That's a heavy lift. And it may go against some core simplicity principles. But yeah, I mean, for me, it's just meeting the people where I live, trying to spend less time online, more time helping not assuming I know what's best for people, but what do they need? This woman needs a lift every Thursday to get freaking groceries. This kid needs a math tutor. You do what you can and it embeds you in your community in a way that feeds your heart a lot better than, you know,
Starting point is 00:58:25 60 likes on that tweet, you know, it really does. Okay, that's what you're doing on an individual level and what's the fix on a more macro level? Is there anything that can be done or are we just along for the ride in the slipstream of billionaires at the moment? Well, I mean, I think there are things that could be done. One, I don't think, and I know this is controversial, but I don't think the billionaires truly have
Starting point is 00:58:49 our best interest at heart. A lot of them anyway. I think they, and the ones that do, don't. The ones that do are deluded. So I think that one thing we can do is deflate their power by not using their platform so much, not buying their stuff so much. I'm now, I'm checking out Mastodon as a new alternative Twitter social network place
Starting point is 00:59:17 that's a federated non-property thing to see, just because it's like, I don't know if I want to support this kind of troll thing happening on Twitter and elsewhere. You know, I don't want to Tesla either. You know, it's not like, I know Musk is trying to do good things and parts of his personality are quite benevolent, but others, it's just brittle. It's a brittle single point of failure, you know? And I feel like there's there are some loose screws there that are that may not that need to be balanced out with other people doing other things.
Starting point is 00:59:50 So yeah, I guess my my macro solution is for everyone to avail themselves of opportunities to engage locally to realize that, you know, we don't all need opinions on every big thing. I remember when my example when Biden was pulling out of Afghanistan, three or four different journalists came to me to ask me for their articles. What's your opinion of Biden's Afghan withdrawal strategy? And I'm like, I really know nothing about how you withdraw from a war. I just, well, just weigh in. I can't weigh in. I shouldn't weigh in. And everybody's tweeting, oh, I listen to that in the other.
Starting point is 01:00:32 I'm like, how many? What if we just set aside 100,000 of us as the experts in it? Just 100,000. It's like I used to say, every time that Brittany Spears would have a nervous breakdown, there'd be like 100 or 200 camera trucks outside her house. I'm like, couldn't we cover this with five camera trucks and share the feed, you know, and maybe take the other 95 and put them in situations that matter, war zones and learn about other stuff. So there's
Starting point is 01:00:59 there's that. It's like the world, the big things matter, but we can take the weight off a lot of these big things. We can make the macro problems less brittle by being more self-sufficient, I don't mean self, but community sufficient, locally sufficient, as places. So, yeah, find where does your food come from? How local can you get it? Where's your CSA, your community-supported agriculture group? What's going on in your community? That most of us can operate at that level, I think, and have like representatives that work at these more macro levels on our behalf. All right, Douglas, let's bring this one home. Where should people go if they want to check out the stuff that you do online?
Starting point is 01:01:50 Rushcoff.com is me, teamhuman.fm is my lovely podcast. And check out this new book, which is not as depressing as it might sound, survival of the richest escape fantasies of the tech billionaires, which the real purpose of this book was to be a comedy, to help people laugh. If we can laugh at these guys, then the whole thing starts to feel less scary and urgent in that brittle way. And you know what I mean? It's just like, oh right, I get it.
Starting point is 01:02:21 They're just silly. Don't worry about them. They're silly. Now, don't try to be musk. Let musk be musk. And you can be you. All right, Douglas. Thank you. Thank you.