The Daily Stoic - Will MacAskill on Creating Lasting Change

Episode Date: August 17, 2022

Ryan talks to professor and writer Will MacAskill about his book What We Owe The Future, how to create effective change in the world, the importance of gaining a better perspective on the world, and more.

Will MacAskill is an Associate Professor in Philosophy and Research Fellow at the Global Priorities Institute, University of Oxford. His research focuses on the fundamentals of effective altruism - the use of evidence and reason to help others as much as possible with our time and money - with a particular concentration on how to act given moral uncertainty. He is the author of the upcoming book What We Owe The Future, available for purchase on August 12. Will also wrote Doing Good Better: Effective Altruism and a Radical New Way to Make a Difference and co-authored Moral Uncertainty.

Transcript
Starting point is 00:00:00 Hey, prime members, you can listen to the Daily Stoic Podcast early and ad-free on Amazon Music. Download the app today. Welcome to the Daily Stoic Podcast, where each day we bring you a passage of ancient wisdom designed to help you find strength, insight, and wisdom in everyday life. Each one of these passages is based on the 2,000-year-old philosophy that has guided some of history's greatest men and women. For more, you can visit us at dailystoic.com.
Starting point is 00:00:46 I feel like we've talked about this a lot of times, but Stoicism isn't just an individualistic philosophy. The idea is, it is a collective philosophy. We think about not just how our actions impact other people, but what we owe other people and how we orient our actions and our lives around that.
Starting point is 00:01:18 In Meditations, Marcus Aurelius says what's bad for the hive is bad for the bee. He talks about the common good dozens and dozens of times in Meditations for a reason. It was the guiding idea of his life, his north star as a leader, as the emperor, but also as a Stoic. And this is something I'm thinking about a lot recently. I'm working on this series on the Cardinal Virtues,
Starting point is 00:01:41 as I've told you before. I've just finished the discipline book, which will be up for pre-order here really soon with a bunch of awesome bonuses. But as I think about this idea of justice, one of the people I've been reading and talking to a lot, because he's a friend and we've worked together on some books before, is someone who very much changed my worldview. I like to think he's made me more generous and widened my perspective.
Starting point is 00:02:08 It's the one and only Will MacAskill. Will is an associate professor of philosophy at the University of Oxford and one of the pioneers of the effective altruism movement, which I actually think is very aligned with Stoicism, because the Stoics were not just concerned with the common good, but they were also concerned with rationality and reason. They would not just want to help people in the abstract; they wanted to look at
Starting point is 00:02:34 how to help people and how to help them most effectively, how to make the biggest difference. I really liked Will's first book, Doing Good Better, but his new book is absolutely fantastic: What We Owe the Future. And What We Owe the Future is really about this concept of long-termism, which I think is an important breakthrough as we think about doing good, as we think about altruism, as we think about Stoicism. It's not just how do we do good for the people around us, but how do we do good for the people in the future? People we will never meet.
Starting point is 00:03:07 So you could argue Marcus Aurelius is very concerned with the common good, but he fails the common good on long-termism in a way that Hadrian, who was much more selfish than Marcus, did not, in that Hadrian set up a succession plan that lasted for almost 50 years. And Marcus Aurelius hands over the empire to Commodus, who quickly drives it into a ditch.
Starting point is 00:03:31 So this idea of long-termism: not just what do we owe the people around us, but how do we make decisions and take actions that set up a world in the future that is better. One that we'll never actually experience. We've talked here before about that great expression that a good world is one where people plant trees whose shade they will never know. And I think that's a really important idea here in this book. Will is a fantastic writer, a fantastic thinker. He's the co-founder of 80,000 Hours, which has sponsored the podcast before, and one of the creators of Giving What We Can, which is a global community of effective
Starting point is 00:04:09 givers that's known for the Giving What We Can pledge, which is about donating 10% of your lifetime income to the most effective charities. I think he's had an enormous positive impact on the world. I think you're really going to like this book. Check out What We Owe the Future. I'm actually going out this weekend to see some folks, and I'm giving away four of the copies. The group I'm going to see includes a billionaire, a great medical professional that I know, and a tech investor, and I'm giving them this book
Starting point is 00:04:37 because I think they should read it, and it's really, really important. And I'm hoping they will make a difference using the ideas that we'll talk about. So check out What We Owe the Future and enjoy this conversation with the one and only Will MacAskill. You can follow him at @willmacaskill on Twitter or go to williammacaskill.com, and again, his books are Doing Good Better and What We Owe the Future.
Starting point is 00:05:15 So it seems like at the crux of effective altruism, and then also the new book, is this idea that we should not just care about people we will likely never meet. I think people sort of intuitively feel sympathy for those people, but you're arguing that we should actually change our lives, our decisions, and our actions on behalf of these people that we will never meet, whether they live in a war-torn country on a different continent, or whether they are three generations from being born? Absolutely. So, effective altruism is about trying to figure out how we can do as much good as possible and then actually putting it into practice.
Starting point is 00:05:53 So, changing our careers to have more social impact, or donating some of our earnings in order to spend that money on whatever the most effective nonprofits are. But I think people would say, shouldn't I care more about people that are around me, around my own family, people that are like me? Why should I be prioritizing even one minute of my own pleasure now for someone that, again, essentially doesn't even exist to me? Well, I think I could give you a lot of arguments. I think the first thing is just, almost certainly, you do care.
Starting point is 00:06:30 So Peter Singer gives this very influential argument: just imagine you're, you know, walking past a shallow pond and you see a child in that pond, and they're face down, they're going to die, they're going to drown, unless you reach in and save them. You'll have to run in, you'll ruin your suit, and your suit costs, like, thousands of dollars. You're on the way to an interview, let's say. Now, if you just walked by and let the child drown, I think you and I and everyone would agree that you'd be a total asshole.
Starting point is 00:07:03 And people don't want to be an asshole, they want to be good people. And that means that, you know, we would save that child, even if it costs us many thousands of dollars. And now it turns out, in the world we live in, you can save a child on the other side of the world. You can protect them from dying of malaria. So, statistically, you will save that child's life for just a few thousand dollars. And given that, you know, given that you can do that, I think it just follows from the values that most of us have, the large majority of us have, that that's a better way to live a life.
Starting point is 00:07:41 Have you read Camus's The Fall? I have not read that book. It opens with a story. A guy is walking along the streets of Amsterdam and he hears someone go into the water, whether it's a fall or a suicide, and he convinces himself that he doesn't hear it, right? That, oh, it was nothing. But he comes to find out that someone did die because of his indifference, or the rationalization in his head. And basically, for the rest of the novel, he's sort of haunted by this idea. I would agree we should care.
Starting point is 00:08:16 It's just easier to not care when it's out of sight, out of mind, which is kind of like the trauma we inflict on, or the indifference we have about, what's going to happen in the future; there's a kind of "I'll be gone, you'll be gone"-ness to it. Exactly. So effective altruism in general is often about bringing what's out of sight, and therefore out of mind, into view, making it more salient, putting it in front of your eyes and saying, okay, you can't ignore this. And that's true for the plight of people in other countries who, you know, you may never interact with or meet, the thousands of children that die of malaria every day. It's also true, as I argue in my book What We Owe the Future, for the people who are yet to come.
Starting point is 00:09:08 The generations of people who are going to follow us, they have moral worth, they have moral standing in just the same way that people alive today have, but we don't pay them much attention, or barely any attention at all, because they're not around to lobby for their interests. They don't get a vote, they can't tweet, they can't write articles. And I'm, you know, trying to take one step towards correcting that.
Starting point is 00:09:31 Well, there's kind of a hollowness to the whataboutism. So right, if you're like, hey, you should try to find a way to donate some of your excess money to save people who are dying of malaria, you should invest your work in such a way that you help people in the future, and someone goes, but what about all the suffering that's happening around me right now? But the truth is, you're not doing anything about that either, right? I think what we often do is we go, like, bad thing happening over here? Let me find another bad thing that cancels it out, right?
Starting point is 00:10:02 This isn't exactly an equivalency, but people do this: if someone points out, you know, sort of like violence against minorities by the police, they'll say something like, but what about black-on-black crime? As if by pointing out another bad thing, you're essentially trying to find a way to not have to care. And thus not have to do anything about it,
Starting point is 00:10:23 about either of them. Absolutely. So if someone makes the argument, we should care about people in their own communities, to me, and they're giving 50% of their income to this stuff, they've devoted their career, they're helping the homeless, they're helping the incarcerated
Starting point is 00:10:40 or working on criminal justice reform, I'm just like, go you. I'm like, you're a fucking legend. Keep doing your thing. You know, the thing we're opposing with effective altruism is not ineffective altruism. It's just apathy, where most people don't even try to do good in the world. And we want to inspire people to do more.
Starting point is 00:11:00 No, that's a really interesting point. Because there is a way where you could misuse effective altruism and say, here's all the things people are doing, and this thing that you care about, volunteering at this soup kitchen with your church, is actually, you know, number 700 on the list of making a positive difference in the world, therefore you shouldn't be doing it. That would be to misuse effective altruism. What I think the most compelling argument for it is,
Starting point is 00:11:27 actually: do you, the person who's not doing anything, understand how easily and cheaply you could do something? You don't even have to get off your ass and go to a soup kitchen. You can give $10. Save one tenth of a life, or whatever it is. Absolutely. And the vast majority of people that we're convincing are not people who were already invested in some particular cause and then gave that up to switch. That happens occasionally, but much more often it's people who weren't really doing anything. Often they felt like they weren't really being spoken to by existing kind of non-profit messaging,
Starting point is 00:12:10 or they were just early on in their life, figuring out what to do with the rest of their life. And yeah, we're being inspiring for them. And for sure, I can give arguments for thinking that you can do more good by focusing your charity overseas, or by focusing it on some of the issues that will impact the long-run trajectory of the human race. But like I say, if you're just out there on the streets helping people, you're already
Starting point is 00:12:38 doing far more than most people in the world, and I absolutely commend you. Yeah, I think one of the things that prevents people is that the totality of the problems facing human beings seems so enormously overwhelming that they don't know exactly where to start. And I think one of the things that I've taken from effective altruism is that, like, actually, here's a list of things you can do to start and make a tangible difference. I think it may have been in that New Yorker profile of
Starting point is 00:13:10 you. And it may have been you talking about how, hey, if I donate X amount, you know, over the course of my life, even just if I'm making $50,000 a year, and I donate 10% of that, you know, over the course of a life, I might save a thousand human beings who otherwise would have died. And that effectively makes you like a superhero. Like, to save a thousand people? When you hear about someone like Schindler, who saved those Jews from the Holocaust, you know, how many people did he personally save? Say it was a lot, but it wasn't a million, right? It was the amount that a human being with a small amount of discretionary income could potentially do over the course of their life with some of these ideas.
Starting point is 00:13:57 That's absolutely right. Like, I remember when I was younger, my brother was 17 and we were surfing on the coast in the UK. And a woman got taken out by a riptide, and he ran into the water and he saved that woman's life. And that, just putting aside ethics and morality, that's one of the big moments in the course of his life.
Starting point is 00:14:28 Like, that was a big deal in our local community. Exactly, yeah. If he does nothing else, that's a pretty good life. Yeah, and it's not like he talks about it a lot, but I'm sure for him, it's something that has stayed with him and made his life more meaningful. Now imagine if you did that every year: the next month you save someone from, like, a fire, and the third month something else.
Starting point is 00:14:49 You know, you'd think, like, wow, I'm just in this kind of special situation, like, there's literally something special going on. But you really can be like that person, where, yeah, it's a few thousand dollars to save a life. Depending on exactly what you're earning and how much you're able to give, you really can be saving a life every single year of your life. So you can be like that person saving someone
Starting point is 00:15:13 from drowning every single year. And look, I mean, again, put the moral arguments to the side. That's just, I don't know, that's cool. That's inspiring. That's living a life in accordance with the values we all have. I had John Mackey on the podcast and we were talking about the story of the boy and the starfish. Do you know this story? I know it well, yeah. Yeah, and
Starting point is 00:15:36 you know, basically the moral is, like, there's a thousand starfish on the beach, and a boy throws one back, and someone says, what does it matter? It's just one starfish. And he says, it matters to that starfish. I think that's something that we kind of miss when we think about these things. Like, okay, so your brother saves one person, but think about what that matters to that person. Or if you save one person dying of malaria,
Starting point is 00:15:59 not just what does that matter to that person, but what does it matter to that person's family? And that is the interesting thing when you look at, again, someone like Schindler. Once you get a big enough sample, you get, like, okay, this person saved 500 people from certain death.
Starting point is 00:16:18 It's almost a statistical certainty that some of those 500 people will go on to have a pretty large impact, right? One of them will be a professional athlete. One of them will have 10 children. One of them will discover a cure for this or that. Once you get those numbers big enough, all of a sudden you could have more than just a tiny ripple, you know, in the future of the world. You could have a potentially enormous impact. I just read David McCullough's book, The Johnstown Flood.
Starting point is 00:16:51 I don't know if you've read this, but there's this boy who's rescued out of the flooding. It's this horrible natural disaster, a dam burst. Anyway, I was just reading about one of the characters, and I Googled him after, and this five-year-old boy who's pulled from the wreckage goes on to discover the cure for some tropical disease that saves millions of lives, right? I think maybe it was leprosy. I forget what it was, but the point is, you also can't underestimate the impact that the person you have an impact on can go on to have. Yeah, absolutely.
Starting point is 00:17:28 If you think that the trajectory of human civilization is positive, which I do, and I think we have sound reasons to think so, then, yeah, every time you're saving a life, every time you're doing a good thing, you're not only helping that person, but all of the good things that they go on to do, which then have ripple effects of their own.
Starting point is 00:17:49 And so you actually have an impact that, I mean, maybe you don't know exactly what it is, but it propagates potentially indefinitely, for a very long time indeed. And this framing, you know, you mentioned there's a lot of kind of doomerism, or just focusing on how bad and big the problems are, and it's overwhelming. And when thinking about this podcast, I had a thought that, in this way, effective altruism is a
Starting point is 00:18:17 little bit like Stoicism applied to the world as a whole, like acting altruistically. Because one important lesson from Stoicism is: suppose something bad is happening or going to happen to you, but there's nothing you can do to change it. Then it's just neutral; that is now just part of the background of the world, and you should be focusing on the things where you can actually make a difference. And one thing I certainly found, before I started thinking about effective altruism and then engaging with, you know, people trying to make the world a better place and figuring out what should I do, is there's just a huge amount of doomerism, and a huge amount of, like, these problems are so bad, we should feel really guilty, let's all get together and feel horrible. But like,
Starting point is 00:19:00 that's just not helping anybody. What we want to do is make the world better, and act such that we have contributed to the world being a better place in virtue of our existence. And yeah, there is an enormous amount of suffering in the world. The world is extremely unjust. That doesn't mean that the right thing to do is to, like, self-flagellate every day about how bad it is. Like sure, you can have those moments and that can be important.
Starting point is 00:19:27 But the primary thing is just what you're doing to try and actually make it better. And that's what I really try and focus on. Well, General James Mattis, the former Secretary of Defense and a student of the Stoics, he had this great line that I heard. He said, cynicism is cowardice.
Starting point is 00:19:52 Right? So when you say that it's impossible, that you can't do anything about it, when you give in to the doomerism, yes, you're sort of punishing yourself in that you're feeling sad, but you're also freeing yourself in that you're saying, it's hopeless. It's not on me. I don't have to do that. You're freeing yourself from the obligation of having to try to chip away at it, still potentially having your heart broken, but, like, you're exempting yourself from the solution. Like, I think about this with people. I know some people who are like, we're so worried about climate change, we don't want to contribute to the problem, so we're not going to have kids, right?
Starting point is 00:20:25 Yeah. And I think about that because that's precisely who should be having kids, right? Because, like, the other side, the people who don't care about climate change, they're just procreating like crazy, right? They don't care. You need more people on your team, so to speak. Because you're conscientious, because you feel strongly about it, because you think that it matters, you should be engaged. And so when we go, like, yeah, it's hopeless, I'm not going to do anything about it, you're essentially ceding the field to the forces that you are saying are
Starting point is 00:20:55 bad or evil. Absolutely. And I talk about this very issue of having kids in the book, where there is this idea from the climate community of, oh, it's bad to have kids because of the negative impact they'll have. But that's so one-sided. So here's a couple of thoughts. One is just that you can offset your kid. Well, in the UK, total emissions are like seven tons per person per year; in the US it's a little higher, maybe like 15. It costs, at the current margin, like a dollar to offset one ton of CO2 if you're focused on the very most effective nonprofits. Let's say it's like $10. Now, kids cost a lot: $10,000 a year would be about typical, probably more. So okay, let's say you add on a thousand dollars for the very most targeted climate nonprofits,
Starting point is 00:21:47 you've more than undone the climate impact. But even putting that to the side, you need to look at the positive effects that kids will have as well. If you bring them up well, they will innovate, perhaps they'll innovate in clean energy. They will contribute to society. They will vote. If you bring them up well, they will be moral change-makers as well. And again, unless you think the world would be better if there were no one on it at all, which is a pretty dark view and not a view that I think is well
Starting point is 00:22:16 justified, then you've got to think that humans are net good. Like, yes, we cause a lot of harm, but we also make the world better as well. And for any ethical decision, you've got to look at both sides of the ledger. What I liked about the new book, which is called What We Owe the Future, is that you're essentially doing a very Stoic thing, which is you're saying you have to zoom out, right? You have to get big, like, our immediate human perspective is too limited and too biased towards what's happening around you right now. And you need to take what they call Plato's view, which is, you know, we call this the 10,000-foot view. But you think
Starting point is 00:22:58 about it, even like in the ancient world, like, could anyone ever even see Rome? Like, how hard would it have been to see Rome? Right? Maybe you could get on top of one mountain, but you couldn't really see all of it. Like, the perspective that we get when we get up in an airplane or a spaceship, these are like relatively new modern perspectives.
Starting point is 00:23:19 Like, maybe a hundred years tops have we been able to do this. And we, as human beings, have to not just get that perspective to see, like, hey, this border between America and Mexico is a fucking bullshit made-up thing, and there's human beings on either side of it and we should care about both. But then, as you're saying, it's not just human beings today. Like, you have a good analogy in the book: if you leave a glass bottle on the ground and it breaks, you would feel bad if your child cut their finger on it. But why should you feel less bad knowing
Starting point is 00:23:53 that a child a hundred years from now cut their finger on it? It's the same thing. You have to zoom out and get this bigger perspective so you can know what to do and what not to do, and what to care about and not to care about. Absolutely. And in support of your idea that this is a Stoic view, I could quote Marcus Aurelius at you if you're interested. Of course. So he says, you have the power to strip away many superfluous troubles located wholly in your judgment, and to possess a large room for yourself, embracing in thought the whole cosmos, to consider everlasting time, to think
Starting point is 00:24:30 of the rapid change in the parts of each thing, of how short it is from birth until dissolution, and how the void before birth and that after dissolution are equally infinite. So Marcus Aurelius himself was a long-termist. Seneca was too; I have a great quote from him. But yeah, exactly. I appreciate it. I mean, there's actually a kind of striking thing, because it goes both ways. People often argue, like, take your...
Starting point is 00:24:56 Like, look at everything that happens in your life. And what we know from modern science is that this, you know, the 80 years you might live, is a vanishing fraction of, you know, the 14 billion years behind us, the hundreds of billions of years until the last stars burn out. Everything that happens on Earth is just this tiny pinprick, and people feel, wow, that's so humbling. And I think that's absolutely right. But it's actually also inspiring, in the sense that I think our actions can have truly cosmic significance, because it's plausible that humanity will last for hundreds of millions
Starting point is 00:25:34 of years, till the earth is no longer habitable, or billions of years if we take to the stars. And then the actions we do today, I think, really could have impacts that last over that whole timeline. So, you know, we are actually able to take actions at a cosmic scale. And that is, like, you know, on the one hand absolutely terrifying, but on the other hand, you know, quite inspiring too.
Starting point is 00:28:09 Well, one of the ironies of Marcus Aurelius is that he writes repeatedly in Meditations about the worthlessness of posthumous fame, right? He's like, you're saying, like, people in the future are going to be good too. He's like, hey, don't forget, there's fucking idiots in the future too. And they suck.
Starting point is 00:28:39 Yeah. And you don't need to perform for them, essentially. He's basically saying that like these conquerors who would try to achieve enormous things for the sake of being remembered by history, they were chasing something that didn't really matter because they weren't around to enjoy it, right? Which I think is important.
Starting point is 00:28:58 He goes, you know, Alexander the Great, like, what good did conquering the world do him? He's not around. It doesn't mean anything to Alexander the Great that Alexandria still exists, right? But the irony of Marcus Aurelius is that he's writing these notes in a journal to himself, not thinking of posthumous fame.
Starting point is 00:29:18 And then you and I are making decisions here in the present moment, ideally for the good of other people precisely inspired by the work that he did 2,000 years ago, and we do remember who he is. So it's this kind of paradox of like being remembered and having a grateful future doesn't do you any good, but maybe the way actually to be remembered by the future and to do good is to try to focus on what you can do now and not think about that at all. Yeah, so I think we should distinguish between self-interest, what we're doing just for ourselves and what we're doing for the good to have a positive impact.
Starting point is 00:30:00 Yeah, if you just care about yourself, I do believe the view that, you know, when you die, conscious experience ends, and nothing that happens after that can contribute to how well or poorly your life goes. But things I do now can contribute to how well the long-term future goes, and often in ways that will not be remembered. So, I mean, of all the scientists in the past on which modern science builds, how many do we remember? On reflection, I know Isaac Newton, but most people don't even know Leibniz. You know, there are the few famous people, but all the armies of people who helped contribute to the scientific project,
Starting point is 00:30:47 they do not get remembered, but they are having an impact, because, well, it's this science, it's this cumulative project where one scientist or innovator builds on another. And this was an insight that was appreciated by Seneca. It's actually my colleague Toby Orde included this quote in the precipice. Altain is quite long. So, okay, Altain I'd keep it shorter, but he says, the time will come when diligent research of a long period will bring to light things which now lie hidden. A single lifetime, even though entirely devoted to the sky, would not be enough for the investigation of so vast the subject. And so this knowledge will be unfolded only through long successive
Starting point is 00:31:33 ages. There will come a time when our descendants will be amazed that we did not know things that are so plain to them. Let us be satisfied with what we have found out and let our descendants also contribute something to the truth. Many discover these are reserved for ages still to come when memory of us will have been affected, or will have been affaced. So in like, it's this amazing insight in like the first century AD to appreciate just like how little you knew, and correctly, how little you knew
Starting point is 00:32:00 in first century AD, but yet at the same time, you can contribute to this project by making a little bit of project, a little bit of progress that the next generation will make progress on. The next generation again will make progress on. And I think that's quite inspiring, the thought that like, look, all of these benefits from past generations that we've been bestowed to us. Well, we can benefit the future by engaging in this intergenerational project, you know, making things a little bit better again, and then future generations can build on that even further. What's interesting to think like Stoicism itself is like a five, six hundred year
Starting point is 00:32:37 project, even if you cap it at Marcus Aurelius, right, and you say that it's a dead philosophy from that point forward. You know, it's like 500 years, which is an incredibly long project of many torches being passed from one generation to another, and that creates this thing that's still impacting people today. And so, yeah, it's humbling and also inspiring. It's like, hey, you know, it's never just one person who changes everything. It's all these little contributors adding up. And that kind of takes you down a peg, but it also should empower you: a regular person can become one of those torchbearers, and you can have a positive impact. You can bend the arc of the moral universe, not enormously, but a little
Starting point is 00:33:28 bit. Absolutely. And this illustrates, so one of the things I talk at length about in what we are the future is moral progress versus kind of technological and scientific project, because Seneca is entirely right that you have this indefinitely long-lasting impact by contributing scientifically or through innovation. But what you're doing is like moving forward the curve of technological progress, a little or scientific progress a little bit, because as long as there's no huge catastrophe we
Starting point is 00:34:00 don't go extinct, I think it's extremely likely that one day we will figure out everything that is possible to figure out. And so you're just making that happen faster. Sure. But then when we think about moral progress, it's actually very different. Like, what's the law of the universe that says that one day we have to get to the very best moral views? I think it's much more contingent. We could end up with great moral views, we could end up with really dark, dystopian, fascist views. And again, thinking about the history of Stoicism can be illustrative there, where in the early millennium, the first few centuries AD, what would you have bet on as the guiding philosophy
Starting point is 00:34:48 for Western Europe and then from there the world? You wouldn't have bet on Christianity, right? Instead, you might probably would have bet on stoicism. But Christianity kind of won the war of ideas in Europe and then has become like, you know, the dominant model world view of the entire world. It seems to me, and like the more I've learned about history over the course of the years writing this book, the model development society has taken is really quite contingent, really could often go a different way. And I think that could continue into the future where which model views we promote and defend today really can shift the trajectory of the values that future generations live by. Will you talk about the abolition movement in the book?
Starting point is 00:35:40 And I think that's interesting. I had just read that book, Bury the Chains, about the creation of the abolitionist movement. That basically a group of like seven people got in a room and were like, hey, should we care about people who are enslaved? And they came away with the answer, yes. And the entire arc of Western civilization changes as a result of that conversation. Not immediately, and in many cases not in the lifetime of those people. And they couldn't have anticipated where it would go exactly. But yeah, moral innovation, I'm thinking about that.
Starting point is 00:36:17 You have the graphic of that slave ship in the book. And if you've seen the famous drawing of the Black man that says, am I not a man and a brother: an artist comes up with an image that represents a person who was previously, to the vast majority of, let's say, white people in the world, not seen as a human being, and renders it artistically and creatively, you could even say propagandistically, in a way that changes how people see things. And over the course of 100, 200 years, an institution that no one would have questioned, or predicted would go away, does go away. And that sows the seeds for the women's rights movement and the gay rights movement
Starting point is 00:37:05 and so many different movements, because a few people made a moral invention or innovation. Absolutely. And thinking of this as a moral innovation is important. I think that's very accurate, and very surprising if you just look at the history of morality. So the abolitionist movement, for example, that symbol of... Am I not a man and a brother? That was part of abolitionism being a campaign. And it was really the first moral campaign.
Starting point is 00:37:41 And Adam Hochschild, the author of this book, Bury the Chains, describes this as something never seen before in history, which is mind-blowing. Like, social activism, social campaigns, petitions were not something that had been done previously, but yet they were done for the abolitionist campaign, getting hundreds of thousands of people across Great Britain to sign on, to come out against slavery. This was just absolutely striking. And yet, as you point out, it took time.
Starting point is 00:38:20 So the first kind of public statement in the course of this movement was the Germantown petition in Pennsylvania, by the Quakers, which was really the seed of the thought that led to the abolitionist campaign. That was in 1688, and then the abolition of slavery, not of the slave trade itself, which was earlier, but of slavery altogether in the British Empire, was 1833. And then we go all the way to countries like Saudi Arabia and Yemen, where slavery was only abolished in the 60s. China was 1909, Ethiopia was in the 40s, I think. Moral change takes time. We're talking about hundreds of years of campaigning,
Starting point is 00:39:07 working together to bring about this change. But that is a change that I think potentially affects the entire course of future history. And the impacts there are truly enormous: not just the millions of people who were being enslaved over the course of the 18th and 19th centuries, but all the people that could have followed that too. That's why I think the book is so great, because
Starting point is 00:39:36 in effect, in the first book you did, you're talking about how you can use a little bit of your money to help some people. But the thing of, how can I invent or come up with something, or be a part of something, that makes the future almost incomprehensibly better? That is so empowering. I just read, and I had the author on the podcast, and I won't butcher his name, but he wrote this big biography of Gandhi,
Starting point is 00:40:04 and he essentially presents the idea that, in the way the abolitionists invented the petition and the art and the boycott, all these different forms of social protest, Gandhi himself invents basically an alternative to warfare, right? Like, every conflict prior to that point, including the abolition movement (I mean, slavery is eradicated in America by war). You know, all conflicts were settled
Starting point is 00:40:32 by violence, right? Effectively, right? Even Clausewitz says, you know, war is just the extension of politics. And for Gandhi to invent a form, passive resistance, effectively a way of resolving conflict, of changing the system without violence, with the opposite of violence, is maybe the greatest gift that humanity has ever been given, if you think about it, right? Because we have the civil rights movement to thank for that, we have the gay rights movement to thank for that. And then, thinking in the framework of What We Owe the Future, try to comprehend how many people are alive today because they didn't die in wars that Gandhi's innovation prevented from happening. It's incredible. Yeah, that's a fantastic thought experiment, and not one I'd thought about or fully appreciated. But this general point of, you know, we look at history and there are certain things
Starting point is 00:41:34 in it, like technological innovations, that are very concrete, and so you can be like, oh, it was at this date that Gutenberg developed the printing press. In Scotland, I was always taught a history of Scottish inventors: James Watt invented the steam engine; it wasn't quite true. But yeah, you've got these particular technological inventions, and it's easy because they're very concrete.
Starting point is 00:42:00 But moral innovation is just as important, and it's harder to categorize. It's more intangible, and so we can overlook it. But in terms of the contribution to human well-being over the last few hundred years, technological and scientific innovation has been absolutely huge. But so has moral innovation and moral progress. In the 1700s, three quarters of the world were in some sort of forced labor, whether slavery or serfdom. And now that's less than 0.5 percent, maybe even considerably less. That is just incredible. Or the fact that, yeah,
Starting point is 00:42:49 war between great powers was just absolutely the norm throughout, like, at least the history of times when we had large states, from 1300 onwards. And now, I mean, unfortunately, I think a war between the great powers of the world is still all too likely, but much less likely, I think, structurally, than it used to be. And that's part of just changing norms and changing taboos. And that's, yeah, I mean, the sheer number of people who
Starting point is 00:43:20 would not be alive today if Gandhi hadn't lived, that's a pretty inspiring thought experiment. Yeah. And how long would it have taken? Because so many people come out of Gandhi's movement and come back to the US and help lead the civil rights movement. Does that influence play out differently? Does it become a much more violent movement instead? Because of the violence, does it lose its moral high ground and thus not happen? It really is almost incomprehensible to think about. And Gandhi, I mean, you could trace it back to Thoreau, you know, Thoreau protests
Starting point is 00:44:01 the Mexican-American War. But Gandhi discovers Thoreau after he comes up with the idea of passive resistance. But just the idea that that was not a tool in people's toolkit before. Like, I guess you could also kind of credit the suffragettes. But the idea that, before, if you were Indian and you wanted to protest British
Starting point is 00:44:26 rule, your options were effectively: blow them up, or get another country to come wage war on your behalf. Those are like your only two options, right? Or go along with an injustice. But the idea that you could morally undermine your occupiers until their will collapses, that just didn't exist. And it's so incredible, what an incredible gift that is
Starting point is 00:44:54 if you think about it that way. Yeah, it's huge. And I mean, the thought is just, for all of these things, there has to have been some point at which it was new. Yes. So, democracy too. I mean, there is a point in time where, you know, you can think of hunter-gatherer societies as sort of democratic, they're kind of very egalitarian, but they don't have a
Starting point is 00:45:15 state. But of governments that had a state, there was some point in time, which, at least in the West, was sometime before classical Athens, which was the first democracy. And then the impact of that was felt 2,000 years later, significantly via, like, you know, the Roman republican system of government, which was at some periods of time not as authoritarian as other systems of government. Without that, we wouldn't have had the United States of America. I mean, would we have even had this flourishing of democracy, where over the course of 250 years, we've gone from no one living in a democracy to over half the world? Again, that's due to some morally pioneering societies thousands of years ago, combined
Starting point is 00:46:13 with morally pioneering individuals 250 years ago, with the formation of the United States. There has to have been a first. And it's particularly impactful in the case of moral innovation, because it really might not have happened otherwise. And the same is true when we think about moral campaigns we could be undertaking today. If we don't fight for certain rights, then it's not at all obvious that future generations will converge on them instead. Well, so how do you think about the tension between your two works then? So like, one argument of effective altruism
Starting point is 00:47:01 is like, get a job on Wall Street, make as much money as you can, try to donate as much of that as you can to causes where the altruism is effective, and this has a big impact. And then there's your sort of 80,000 Hours thought process of, you have only so many career hours, what are you going to spend them on? And this idea that if what we owe the future is moral innovation, you know, moral progress towards things that matter, what I like about the new book is it's kind of like, hey, maybe you should write a book, maybe you should create art, you know, maybe you should found a new community or a club. How do you think about that tension then, between optimizing financially to have maximum measurable impact
Starting point is 00:47:47 and then the soft moral impact of this other stuff we're talking about? Yes, the first thing I'll say is just, all of this is really tough. I see effective altruism as this project of trying to figure out how we do as much good as possible, and a conversation around that, rather than a strict set of answers. And I mean, one thing I should say when it comes to career choice: the earning to give, go into Wall Street, donate it, it was a very powerful meme, and many people did do enormous amounts of good through that route.
Starting point is 00:48:21 I think it's just one path among many for people. Even if you're focused on helping people in poor countries, there are lots of ways you can do a lot of good. But then there's this question of what do you focus on? Are you just focusing ruthlessly on making the present generation better, or are you trying to do things that are going to make the long-term future go as well as possible? One thing I think is that there's less tension between those two things than you might at first think.
Starting point is 00:48:52 There are an enormous number of ways of doing good that are both very good in the short run and very good in the long term as well. So, we have worried about pandemics for many years. We were trying to get people to work more on pandemics, and funding places in that area, since like 2014. I think if we'd had more traction, then things would have maybe gone a little better with the COVID-19 pandemic. When I look to future work to try to prevent the next pandemic, well, here, one of the things I'm really excited about
Starting point is 00:49:29 is this thing called far-UVC radiation, which is basically just a specific spectrum of light. With sufficient intensity, it can be fitted into a light bulb that just sterilizes a room. So it just kills off the pathogens in that room, while hopefully being sufficiently safe for human beings as well, even with long exposure. This is like very early stage.
Starting point is 00:49:53 So we're funding research into this, to see if it is as efficacious as it seems, if it's as safe as would be necessary. But let's just suppose it pans out, and we can really get mass uptake of this, so it's put into all light bulbs, as a matter of regulation, all around the world. Well, okay, sure, one thing we would have done is ensure that we never get a pandemic on the scale of COVID-19 or greater
Starting point is 00:50:21 again. But along the way, we also would have eradicated all infectious diseases, or all respiratory diseases, I should say. And that's good for the short term too. Similarly, when I think about climate change, the best things we can do to impact climate change, a lot of it is in clean tech. So, one of the most neglected things, I think, is what's called super hot rock geothermal: just digging really deep into the ground and getting heat from closer to the Earth's magma, where it's much hotter. You know, huge technical
Starting point is 00:50:59 challenges there, but we should be investing in it a lot more as a society. And that's great for climate change. But it's also great in the near term, not just because climate change is impacting people over the coming generation, but also just because of the health impacts of fossil fuel burning alone. Like 3.5 million people per year die from particulates. We can just end this. And the same is true, I think, for moral innovation and moral change too. Especially in the world today, one thing that's different compared to history is that moral
Starting point is 00:51:33 change can happen much faster. Sure. I think it was like 11 years from gay marriage being legalized in Massachusetts, the first state, to it being legalized federally and being a federal right, in fact. And you know, that's just an incredibly short span of time by historical standards. Though that's a little bit tenuous, right? None of these things are as settled as we'd like them to be. And that's the other thing, the pressure continues. Yeah. Yeah.
Starting point is 00:52:05 Of course, absolutely. And so then if we're advocating for new moral ideals, like taking the interests of people in other countries much more seriously than people currently do; I'm a big proponent of animal rights and animal welfare, and the absolute atrocity that is the 80 billion animals that are essentially tortured every year in factory farms; or the idea of caring about future generations, caring about the generations to come. This is contingent, it need not necessarily occur,
and you're right that even when we make progress, you've got to keep enforcing it, because you can't get complacent. But these campaigns can often benefit people in the near term, as well as having long-lasting positive effects. That's one thing I took from the pandemic,
Starting point is 00:53:00 because people would try to make this argument, and I always thought, I don't think you're making the point you think you're making. Which is, they go, you know, only 2,000 people a day are dying of COVID here in the US, or whatever. That's way less than heart disease, car accidents, guns, et cetera. And I'm like, wait, give me that chart again. And then you look at the chart and you're like, how are we accepting this daily Holocaust of needless death, right?
Starting point is 00:53:28 And it is amazing the readiness with which human beings can accommodate and rationalize an ongoing, roiling tragedy because it predates them and will probably continue past them. We're just like,
that's the way that it is, right? Like, in a sense, our reaction to COVID was the right one, which is: this thing happened and thousands of people are dying of it. We need to stop this, prevent this, and not let it become just another line on the P&L of the human race. But you can tell we've clearly done that with a number of, you know, not totally preventable human rights or human issues,
Starting point is 00:54:18 but we have become very desensitized to enormous suffering and tragedy because it's a statistic. Absolutely. We get a sense of what's normal. I mean, in the case of cars, imagine if there was a new technology and someone said, oh, hey, it's going to help us get to work a little bit faster, it just involves sending half a ton of metal at high speeds around schools. Yeah. And it'll kill thousands of children and hundreds of thousands of people every year.
Starting point is 00:54:54 But it'll make the economy go faster. It would never get accepted now; it's accepted because it's in the past. And it's relevant when we consider something like self-driving cars, which could bring that death rate down to a tenth of what it is. People say, I don't know, there needs to be regulation, it needs to be sufficiently safe. The ethicists get into these questions of, oh, well, what if the car could either run into one old person or five?
Starting point is 00:55:22 One young person or five old people, what should the car be programmed to do? The moral imperative is that we get self-driving cars on the roads as quickly as possible, as soon as they're significantly better than human error rates. Why? Because literally, if there's a delay of 10 years, that is millions of lives lost. Well, that is the other thing that COVID illustrated. You know, we get these vaccines, one of the fastest, greatest collective achievements in all of human history. And yet, it almost certainly could have been faster.
Starting point is 00:56:00 And certainly innovation since then could have been faster. And it also brings into stark relief other things that could have been invented that are being slowed down by bureaucracy. Tyler Cowen has talked about the invisible graveyard, and Alex Tabarrok too. We do a bad job as human beings of calculating the costs
Starting point is 00:56:24 of not just our indifference, but our bureaucracy, or our slowness, or our, you know, you've got to respect the process, you don't want to disrupt things, you don't want to take risks. Like, think about the people that could have been saved had the vaccine been greenlit a month earlier. Or, I've talked about this before because I have young kids: we watched it play out with young kids and the vaccines, where they're like, all right, the vaccines are approved, well, we have a meeting scheduled for next month, so we'll address it then, right? The costs of not coming in the following weekend to deal with it quickly,
Starting point is 00:57:04 there's an enormous cost to the future. The way we slow-walk things, or we're afraid of things, we cause a lot of death and suffering that way, too. I completely agree. One of the things that I've been really excited to see come out of the effective altruism and longtermist community is just new projects that directly attack some of this stuff.
Starting point is 00:57:26 So, you know, we're particularly concerned about pandemics. We think that because of advances in biotechnology, there could be pandemics in our lifetime that kill many, many more people than COVID-19 did. They could even lead to the end of civilization, the end of the human race. And that's really worrying. And some people in the effective altruism community set up a company called Alvea,
Starting point is 00:57:52 a part company partner on profit, as I understand it, developed the first Omicthon BA2-specific vaccine in the world. And I'm setting up a system so that you can get very rapid detection, very rapid innovation in vaccines where Pfizer and Moderna, it was, you know, weeks within weeks, within days maybe that they created, they used mRNA to create the first vaccines. So that you can do that.
Starting point is 00:58:23 You have the regulatory setup already there. You have production capacity already in place, such that when the next pandemic hits, you are producing a vaccine against it within weeks, rather than the nine months that we were waiting. And that's extremely exciting. And I think, again, this is the sort of thing that won't happen unless, you know,
Starting point is 00:58:49 altruistically minded people get off their seats and try to make it happen. And the folks at Alvea, I think, really are. I have a sign next to my desk; I wonder if you think about something similar. I find myself in this strange position. It wasn't totally random, but there's a certain amount of luck and blessing in it, where I am seen as the face of this thing, this 2,000-year-old philosophy, right?
Starting point is 00:59:15 And so my rule is: am I being a good steward of Stoicism, right? So, one, am I honoring it with the work that I do? But also, the benefits that accrue from it, where are those going, right? Are they going to support different people? Am I paying people who work for me fairly? Am I investing it in more stuff that helps more people, et cetera? I was curious what you think of that idea of stewardship, because it strikes me as maybe an idea or a meme that would be more helpful in business. Like, I just had David Gelles on the podcast,
Starting point is 00:59:55 who wrote this book about Jack Welch, the CEO of GE, one of America's oldest and most successful companies. And Welch basically comes in and just sees it as numbers on a spreadsheet, and he moves people around. He doesn't see himself as being a steward of GE; he sees it as a thing to optimize. I think stewardship is maybe a term that more leaders, politicians and people should
Starting point is 01:00:26 think about, even in the US: we're stewards of this great democratic tradition. Are you making it better or worse? Do you know what I mean? Yeah, completely understand and empathize with that. It's interesting you're talking about being a face of Stoicism and feeling the responsibility there, because I feel similarly with effective altruism, where I'm in this odd situation where I'm two things at once. One is an academic, and an academic's
duty is to say whatever they want, just whatever they think. And I, you know, mainly do that, but I'm also aware that I am at the same time representing this wider movement. Totally. It's a kind of tension I have to grapple with. But then on the question of stewardship for the world as a whole, yeah, absolutely. This is something that conservative thinkers have reflected on, Edmund Burke in particular, where there's this thought of: you look to the past and the benefits that people in the past have given you, you know, medicines, laws, things that have been kind
of built up. And it's like a kind of gift, and it's not perfect. There are bad things too, the legacy of systemic racism, many problems in the kind of political system that we have. But you take the thing, and then you refine it, make it better, pass it on to the next generation. I think that's a very inspiring idea. And then you appreciate that, oh, this is a chain that might be very long indeed. One of the metaphors I give in the book is of history as molten glass,
Starting point is 01:02:20 where it can be blown into one of many shapes, could be deformed or could be beautiful. But the glass also shows how fragile it is, because there are risks we face in our lifetime, things that could just end civilization: nuclear war, World War III, engineered pandemics, rogue AI, could just destroy everything. And so you make me think of imagining the chain of generations, just passing on this crystal, or this glass, that you're shaping. You look behind you, there are thousands of people in this line behind you. You look ahead of you, and you just can't even see, because the line stretches so far. Yeah. Do you pass it on, or do you just drop it?
Starting point is 01:03:02 Smash Civilization is gone. Yes. I think that really shows, this is a huge responsibility we have to pass on the working civilization to the generations that have to come. Firstly, to pass on any civilization at all, which means avoiding all that nuclear war, it means avoiding generation of a next level of weapons of mass destruction, like engineered bioweapons, means avoiding lies of authoritarian ideologies that could take over the world, or extreme climate change
Starting point is 01:03:44 that could massively harm civilization, giving them a civilization at all. But then also just, yeah, I never prove it, I'm gonna make it better. Yeah, it's like, technological, but moral progress too. You think about the fragile glass that is the peaceful transition of power in the democracy. And like, one person can shatter it. And then you could argue argue you still have the ability to put it back together to learn from it to change,
Starting point is 01:04:07 but you can only drop it so many times, right, before it's never possible to put back. And often you find people making very short term calculations like, hey, yeah, this guy's bad, he's gonna do X, Y and Z, but it will help me do X, right? It's these short-term trade-offs or corner cuttings or compromises that we make that come at the cost of those future generations. I think you could argue the boomer generation is probably the quintessential
Starting point is 01:04:40 generation that's made a number of self-interested short-term changes to things that benefited them, but that have come at an immense cost to all future generations. And in some sense, that's probably the generation we can all learn from as to what not to be. Yeah, absolutely. It's so easy to have, I mean, I mean, take the issue of climate change, which is just one of the defining issues of our time. And, you know, something that has been amazing and very positive is the upsurge of model concern about this, where perhaps one of the gifts we'll get from climate change, the silver lining to the, I wouldn't say cloud, but I'm glad I can. Is that there's now been this movement that's actually
Starting point is 01:05:30 finished and up and be concerned about planetly scale effects and their impacts on future generations. And perhaps we can then generalize from that, seeing that, okay, yet, not just climate change that impacts the future. Other catastrophic risks can, too, nuclear war, engineer pathogens, failure of technological stagnation or moral stagnation, those are grave risks, too. And perhaps that means that, yeah, we can learn the lessons of the past and, from here on forth, have a culture that is not my optically concerned with benefits over the very short term, but it really takes seriously how what we're doing might impact sanctities to come.
Starting point is 01:06:17 It is incredible to me to say I've been reading a lot about him. In 1972, maybe Jimmy Carter gave a speech to the entire country where he called climate change a moral equivalent to war. He put solar panels on the White House and said this was like the primary thing we needed to be focused on, not just energy independence, but a way of weaning ourselves off fossil fuels. And he knew this because he was a nuclear physicist. He worked out a nuclear submarine, right? And he saw what this was going.
Starting point is 01:06:52 And people laughed at him. You know, they thought it was a feminine and stupid. And the first thing Ronald Reagan did when he came into the White House was remove the solar panels. And you know, here we are 50 years later, just coming around to his advice, which I guess goes to the other point, which is you got to elongate your view of these things and don't despair just because your message wasn't immediately well received in the moment. Yeah, I mean, you've firstly just got to love Jimmy Carter at Cartoon.
Starting point is 01:07:20 What a hero. He's still out of that just building homes. I know. I think. But yeah, one of the, again, the lessons I talk about in what we are the future, very early on, is the lesson from climate change science and activism. So people don't know this. People think, oh, we got developed an understanding of climate change in the 60s and 70s. But the first quantitative estimate of the warming you get for them carbon dioxide was an 1896 by Svanter Reneas. And his estimate was actually pretty good. It was a little on the
Starting point is 01:07:52 high side, but it's pretty close to the modern scientific consensus estimate. The idea of the greenhouse effect that was early 20th century in the 1950s, Frank Kapar, one of the director of its wonderful life, had a documentary called Unchained Goddess that talked about the, you know, the pedals of climate change. And this was an area where the sponsor was very slow, actually, you know, it was, it's only really the last few decades that we've seen sustained climate activism. We could have acted much, much earlier on. And if we had, we would have been acting on much more speculative evidence than we have now.
Starting point is 01:08:32 And so maybe we would have been confused and actually climate change was never an issue. But we could have had a much larger impact much earlier on. The entrenchment of the fossil fuel industry, you could have had a plan that, you know, over the course of many decades will invest much more in clean tech and introduce a carbon price and steer to the entirety of the trajectory of the fossil fuel industry in a better direction. And I actually quote
Starting point is 01:08:59 Bill McKibben, one of the world's leading environmentalists, who draws exactly this lesson saying, from what I've learned from decades of work on climate activism, it shows we really need to be concerned about AI, which is just amazing for him to do, because he's saying, look, the thing you don't realize is that change happens very fast, we need to be attentive to trends as they're starting. And the trends in the like exponential, but like also a fast exponential progress in the power of AI systems, we are going to have to grapple with some really major ethical issues. Firstly, can we just make AI systems safe and under control? And then secondly, who gets to use them and with what power and how will this be shaped society? And that's like, yeah, a lesson that I'm kind of
Starting point is 01:09:51 trying to draw on what we owe the future in general is look, climate change and environmental activism had this amazing model insight and coverage to stand up for future generations. What's everything that follows them that? Because it's not just an environmental issue, it's an issue that impacts many areas of technology, of politics, of model reasoning. Yeah, and I think the main message there is, you know, nip it in the butter early if you can. Exactly. Yeah.
Starting point is 01:10:23 Well, this is amazing. I loved both books but this new one especially and thanks for all your work. Great thanks so much for having me on Ryan. It's been a little dry talking about these issues and hopefully your audience it's Yes, it could be. It's not that life is short, as Seneca says. It's that we waste a lot of it. The practice of Momentumori, the meditation on death, is one of the most powerful and eye-opening things that there is.
Starting point is 01:11:01 We built this Momentumori calendar for Dio Stoke to illustrate that exact idea that your life in the best case scenario is 4,000 weeks. Are you going to let those weeks slip by or are you going to seize them? The act of unrolling this calendar, putting it on your wall and every single week that bubble is filled in, that black mark is marking it off forever.
Starting point is 01:11:26 Have something to show not just for your years but for every single dot that you filled in that you really lived that week that you made something of it. You can check it out at dailystoke.com. Hey, Prime Members, you can listen to the Daily Stoic early and ad-free on Amazon Music, download the Amazon Music app today, or you can listen early and ad-free with Wondery Add free on Amazon Music. Download the Amazon Music app today, or you can listen early and add free with Wondery Plus in Apple podcasts.
