The Tim Ferriss Show - #120: Will MacAskill on Effective Altruism, Y Combinator, and Artificial Intelligence

Episode Date: November 22, 2015

Will MacAskill (@willmacaskill) is an Associate Professor in Philosophy at Lincoln College, Oxford. Just 28 years old, he is likely the youngest associate (i.e. tenured) professor of philosophy in the world. Will is the author of Doing Good Better and a co-founder of the "effective altruism" movement. He has pledged to donate everything he earns over ~$36,000 per year to whatever charities he believes will be most effective. He has also co-founded two well-known non-profits: 80,000 Hours, which provides research and advice on how you can best make a difference through your career, and Giving What We Can, which encourages people to commit to giving at least 10% of their income to the most effective charities. Between them, they have raised more than $450 million in lifetime pledged donations and are in the top 1% of non-profits in terms of growth. He is one of the few non-profit founders who have gone through Y Combinator; for-profit companies get $120,000 for 7% of equity, while 80,000 Hours, as a non-profit, got $100,000 for 0% of equity.

In this episode, we discuss his story and a ton of actionable tips, including:

Why "following your passion" in a career is often a mistake
Thought experiments: Pascal's Wager versus Pascal's Mugging
Why working for a non-profit straight out of college is also a mistake
How it's possible to "hack" doing good in the same way you would a business
Implications of artificial intelligence
The best ways to really evaluate whether you (or charities) are doing good in the world
The reasons donating to disaster relief typically isn't the best use of your money
Why ethical consumerism typically isn't a great way to do good
Running a non-profit in the Harvard/Navy SEALs of startup incubators: Y Combinator

This podcast is brought to you by Wealthfront. Wealthfront is a massively disruptive (in a good way) set-it-and-forget-it investing service, led by technologists from places like Apple and world-famous investors. It has exploded in popularity in the last 2 years and now has more than $2.5B under management. In fact, some of my good investor friends in Silicon Valley have millions of their own money in Wealthfront. Why? Because you can get services previously limited to the ultra-wealthy and only pay pennies on the dollar for them, and it's all through smarter software instead of retail locations and bloated sales teams. Check out wealthfront.com/tim, take their risk assessment quiz, which only takes 2-5 minutes, and they'll show you, for free, exactly the portfolio they'd put you in. If you want to just take their advice and do it yourself, you can. Or, as I would, you can set it and forget it. Well worth a few minutes: wealthfront.com/tim. Mandatory disclaimer: Wealthfront Inc. is an SEC-registered Investment Advisor. Investing in securities involves risks, and there is the possibility of losing money. Past performance is no guarantee of future results. Please visit wealthfront.com to read their full disclosure.

This podcast is also brought to you by Mizzen + Main. These are the only "dress" shirts I now travel with -- fancy enough for important dinners but made from athletic, sweat-wicking material. No more ironing, no more steaming, no more hassle. Click here for the exact shirts I wear most often. Don't forget to use the code "TIM" at checkout.

Show notes and links for this episode can be found at www.fourhourworkweek.com/podcast.

If you enjoy the podcast, would you please consider leaving a short review on Apple Podcasts/iTunes?
It takes less than 60 seconds, and it really makes a difference in helping to convince hard-to-get guests. I also love reading the reviews!

For show notes and past guests, please visit tim.blog/podcast.
Sign up for Tim's email newsletter ("5-Bullet Friday") at tim.blog/friday.
For transcripts of episodes, go to tim.blog/transcripts.
Interested in sponsoring the podcast? Visit tim.blog/sponsor and fill out the form.
Discover Tim's books: tim.blog/books.

Follow Tim:
Twitter: twitter.com/tferriss
Instagram: instagram.com/timferriss
Facebook: facebook.com/timferriss
YouTube: youtube.com/timferriss

Past guests on The Tim Ferriss Show include Jerry Seinfeld, Hugh Jackman, Dr. Jane Goodall, LeBron James, Kevin Hart, Doris Kearns Goodwin, Jamie Foxx, Matthew McConaughey, Esther Perel, Elizabeth Gilbert, Terry Crews, Sia, Yuval Noah Harari, Malcolm Gladwell, Madeleine Albright, Cheryl Strayed, Jim Collins, Mary Karr, Maria Popova, Sam Harris, Michael Phelps, Bob Iger, Edward Norton, Arnold Schwarzenegger, Neil Strauss, Ken Burns, Maria Sharapova, Marc Andreessen, Neil Gaiman, Neil deGrasse Tyson, Jocko Willink, Daniel Ek, Kelly Slater, Dr. Peter Attia, Seth Godin, Howard Marks, Dr. Brené Brown, Eric Schmidt, Michael Lewis, Joe Gebbia, Michael Pollan, Dr. Jordan Peterson, Vince Vaughn, Brian Koppelman, Ramit Sethi, Dax Shepard, Tony Robbins, Jim Dethmer, Dan Harris, Ray Dalio, Naval Ravikant, Vitalik Buterin, Elizabeth Lesser, Amanda Palmer, Katie Haun, Sir Richard Branson, Chuck Palahniuk, Arianna Huffington, Reid Hoffman, Bill Burr, Whitney Cummings, Rick Rubin, Dr. Vivek Murthy, Darren Aronofsky, and many more.

See Privacy Policy at https://art19.com/privacy and California Privacy Notice at https://art19.com/privacy#do-not-sell-my-info.

Transcript
Starting point is 00:00:00 Optimal, minimal. At this altitude, I can run flat out for a half mile before my hands start shaking. Can I answer your personal question now? What is the appropriate time? I'm a cybernetic organism, living tissue over a metal endoskeleton. This episode is brought to you by AG1, the daily foundational nutritional supplement that supports whole-body health. I do get asked a lot what I would take if I could only take one supplement, and the true answer is invariably AG1. It simply covers a ton of bases. I usually drink it in the mornings and frequently take their travel packs with me on the road. So what is AG1? AG1 is a science-driven
Starting point is 00:00:45 formulation of vitamins, probiotics, and whole food sourced nutrients. In a single scoop, AG1 gives you support for the brain, gut, and immune system. So take ownership of your health and try AG1 today. You will get a free one-year supply of vitamin D and five free AG1 travel packs with your first subscription purchase. So learn more, check it out. Go to drinkag1.com slash Tim. That's drinkag1, the number one, drinkag1.com slash Tim. Last time, drinkag1.com slash Tim. Check it out. This episode is brought to you by Five Bullet Friday, my very own email newsletter. It's become one of the most popular email newsletters in the world with millions of subscribers. And it's super, super simple. It does not clog up your inbox. Every Friday,
Starting point is 00:01:33 I send out five bullet points, super short, of the coolest things I've found that week, which sometimes includes apps, books, documentaries, supplements, gadgets, new self-experiments, hacks, tricks, and all sorts of weird stuff that I dig up from around the world. You guys, podcast listeners and book readers, have asked me for something short and action-packed for a very long time. Because after all, the podcast, the books, they can be quite long. And that's why I created Five Bullet Friday. It's become one of my favorite things I do every week. It's free. It's always going to be free. And you can learn more at Tim.blog forward slash Friday. That's Tim.blog forward slash Friday.
Starting point is 00:02:11 I get asked a lot how I meet guests for the podcast, some of the most amazing people I've ever interacted with. And little known fact, I've met probably 25% of them because they first subscribed to Five Bullet Friday. So you'll be in good company. It's a lot of fun. Five Bullet Friday is only available if you subscribe via email. I do not publish the content on the blog or anywhere else. Also, if I'm doing small in-person meetups, offering early access to startups, beta testing, special deals, or anything else that's very
Starting point is 00:02:40 limited, I share it first with Five Bullet Friday subscribers. So check it out, tim.blog forward slash Friday. If you listen to this podcast, it's very likely that you'd dig it a lot and you can, of course, easily subscribe any time. So easy peasy. Again, that's tim.blog forward slash Friday. And thanks for checking it out. If the spirit moves you. Hello, ladies and germs. This is Tim Ferriss. And welcome to another episode of the Tim Ferriss show. I am looking out over a highway from a hotel next to JFK because I forgot my passport in California and missed my flight. In any case, on this show, it is always my job to try to deconstruct world-class performers and identify the routines, the habits, the decisions that help them to become who they are. And in this episode, we have a really fun guest,
Starting point is 00:03:31 Will McCaskill. Will McCaskill is an associate professor in philosophy at Lincoln College, which is at Oxford. He is 28 years old, which makes him likely the youngest associate, in other words, tenured professor of philosophy in the world. He is co-founder of the Effective Altruism Movement, which I'm a huge supporter of, and author of Doing Good Better. He has pledged to donate everything he earns over 36K roughly per year to whatever charities he believes will be most effective. He has co-founded two well-known nonprofits, 80,000 Hours and Giving What We Can, and we'll get into both of those, but more important, we will talk about the lessons he's learned, how to evaluate doing good, how to hack doing good as you would in business. And he has been exposed to business. He's one of the few nonprofit founders who have
Starting point is 00:04:17 gone through Y Combinator, which is effectively the Harvard Navy SEAL equivalent for startup incubators based in Silicon Valley. And his stories are just fascinating. We talk about many different things. We cover a lot of ground, everything from artificial intelligence to why following your passion in a career is often a mistake, how on earth he became a tenured professor at his young age, thought experiments like Pascal's Wager versus Pascal's Mugging. And even if you have no interest in nonprofits, charities, this will help teach you how to think more effectively, how to evaluate things more effectively. We'll also talk about
Starting point is 00:04:57 some specifics, why donating to disaster relief or purchasing things through ethical consumerism are generally not a great way to do good. But we talk about his story, his decisions, and there's a lot to be learned. So say hello to him on the interwebs. Will is at Will McCaskill at W I L L M A C A S K I L L on Twitter. And please enjoy our conversation. Thanks for listening. Will, welcome to the show. Thank you for having me on. I really appreciate you making the time. And where do we find you on the planet Earth at
Starting point is 00:05:34 this moment? I'm in Oxford in the United Kingdom at the moment. And now you are an associate professor in philosophy at Lincoln College. Is that right? Yeah, Lincoln College, Oxford University. And is Lincoln College, it's not a residential college. So for instance, at Princeton, they have different residential colleges. What is the significance of the colleges within Oxford? Yeah, it's a really big thing, actually.
Starting point is 00:06:00 So almost all your teaching goes on in college. You live there, you eat there, most of your friends are there while you're a student at the university. And actually all the colleges predate the university. So that's really the core of your life at Oxford. And then you get examined and go to lectures at the university itself. And when you receive a degree, is it from a particular college? Do you have to attribute it to the college or is it simply Oxford? No, the degree is just from university. Got it. So let's then begin. You do quite a few things. When someone asks you, what do you do? How do you answer that question? That's right. So depending on whether I want to be low key or not, I'll tell them I'm associate professor of philosophy at Oxford University,
Starting point is 00:06:45 but I'm also one of the co-founders of the Effective Altruism Movement, where that's a community of people who are dedicated to using their time and money as effectively as possible to make the world a better place. And to that end, I co-founded a couple of non-profits, Giving What We Can, which encourages people to give at least 10% of their income to the most effective charities, and 80,000 Hours, named after the number of hours you typically work in your life, which is about giving advice to people to ensure that if they want to pursue a career with a big impact, they can do as much good as they can. And how do you answer the question, what makes your brand of altruism effective? And maybe we can back into that.
Starting point is 00:07:29 There was a tweet I saw at one point from Bill Gates, and it was a data nerd after my own heart. And he linked to an article about you. I think it was a profile in The Atlantic. That's exactly right. So if you're talking to a skeptic and they say, okay, well, what makes it effective? So the key is that we take a kind of scientific approach to doing good. So that's using high-quality evidence, really good data, thinking about the outcomes that your actions do, rather than just what's a really sexy intervention or what makes you feel good, but actually what's helping people the most. And, you know, going on years of research now, as well as using careful, reflective,
Starting point is 00:08:08 self-critical reasoning in order to work out what those things aren't just making a difference, but are making the most difference. And what are some common mistakes? And just so people listening understand this, this isn't a disguised sell for nonprofit donations on my part. This is a conversation that I've wanted to have with you because we have a mutual friend in Ryan Holiday, who is himself a philosopher of sorts, a huge fan of Stoic philosophy, as am I. He's a fan of yours. I was very interested to connect. Just for those people who are wondering if this is a dressed up sales pitch, it is not. What are common mistakes that people make when giving? And I've
Starting point is 00:08:51 had many of my own struggles with providing time and money to various causes and nonprofits that maybe we'll dig into some of them specifically, but what are common mistakes or misallocation of resources that you see very commonly? So, yeah, I think the biggest mistake of all is just not really thinking or doing any research about where you're donating. So, I mean, imagine if someone came up to you in the street and told you about this company that they'd set up or that they were representing and that you should really invest in this company. And they tell you about how great the company is and then ask you right there on the street, well, are you going to make an investment? Sounds like Silicon Valley right now.
Starting point is 00:09:33 Yeah, maybe it's a little bit different in the day. Certainly here in Oxford, people would be... Generally not a good idea. Generally not a good idea. But yet, we're happy to do that when it comes to charities. We're happy to spend our money to try and help others, but without ever actually doing the research to work out
Starting point is 00:09:52 what's going to actually have the biggest impact. And there are evaluators now like GiveWell.org that are doing this research for us so that we can actually follow their recommendations because it does take quite a bit of time. How does GiveWell.org compare to, say, Charity Navigator or something like that? Yeah, there are really big differences. So Charity Navigator focuses just on the financials and just on the aspects of the charity itself, where one core part of it is how much does the charity spend on overheads?
Starting point is 00:10:27 What's the percentage spent on administration? But that's not a great metric, because imagine you've got some really lousy program. You're giving away donuts to hungry police officers or something, something that's not going to do very much good. But you've got this amazingly low overhead. You're spending almost nothing on administration. Well, you're still going to be a lousy charity. You can't make a lousy charity good by having a very low overhead. Right, got it. So they're looking at the operational efficiencies,
Starting point is 00:10:56 but whether something is efficient or not, it can still be very ineffective. Yeah, that's exactly right. So you've got to look at what program the charity is implementing as well. And there, there's absolutely huge differences. So three quarters of social programs, when put to the test using trials, are found to have no effect at all. They just don't actually improve people's lives. Some are even actively harmful. But then among the ones that are good, that do make a difference, there's still a vast discrepancy. The best ones are hundreds of times as good as merely ones that just do some amount of good.
Starting point is 00:11:30 And it seems like you are a skeptic of disaster relief. Is that a fair statement? Or how would you unpack that? Yeah, so I mean, I think funding disaster relief, you know, it should happen for sure. When there's a massive crisis like the Haitian earthquake, money should be going to it. And a lot of money, in fact.
Starting point is 00:11:53 But then the question is, what should you as an individual donor do? And natural disasters, because they get so much press coverage and so much media attention, they're massively overfunded compared to what I'd call ongoing natural disasters like the 400,000 people who die of malaria every year, but that gets far less media coverage.
Starting point is 00:12:12 And in fact, even in the case of the Japanese earthquake, the Japanese Red Cross issued a statement saying, we do not want any money, we do not need money, we're the fourth richest country in the world, we have resources to deal with this problem. But yet they still got $5 billion in donations, which was about the same as got allocated to Haiti, which is one of the poorest countries in the world. Understood. And let's take a step back for a minute, because I know we could really dig into the sort of tactics and science and numbers behind what you do. But you are currently 28. Is that right?
Starting point is 00:12:51 That's right. And you are a tenured professor of philosophy. Where does that put you on the spectrum of tenured professors of philosophy worldwide, age-wise? Age-wise. It's hard to verify, but I suspect I might be the youngest in the world. And to what do you attribute that? Why you? It's a competitive, it's a very competitive field,
Starting point is 00:13:20 as I understand it, the philosophy. I want to ask a question about this but you know philosophy postgrads having the highest gre scores out of any subject uh so so what are the what are the what are the factors that contributed to you becoming tenured at such a young age yeah so i definitely think it's not because i'm the smartest person. I'm very confident I'm not. I know lots of people who are, like, have exceptionally high IQs around these places.
Starting point is 00:13:52 And also not, I think, because I put in the most number of hours because I do so much other stuff as well. But I think the two things are just, one is, yeah, one is like actually kind of understanding, kind of, you know, really thinking about what your goals are and understanding how is it best to achieve those goals, which seems like, you know, that's pretty common sense. You'd think like everyone would do that, but actually most people don't. years and years kind of slugging away just on their dissertation, their PhD,
Starting point is 00:14:27 even though that's normally read by about four people in the world, rather than focusing, say, on publishing really good articles, which are read by much more people and much more important in terms of doing well in a career. And then the second aspect is just absolutely a ruthless application of the 80-20 rule, which obviously you talk about in your book. But just the idea that of the stuff that you do, most people do, almost all the value comes from a very small number of activities. And again, I think a lot of people in academia use a lot of their time used up with busy work.
Starting point is 00:14:56 So they attend conferences, they go to seminars, they do a lot of reading that doesn't ultimately contribute to what they're really trying to do, which is come up with new ideas, new arguments, and really put that forward. And so I think I've been just a lot more proactive and kind of goal oriented in terms of the approach I take to my work. So let's get into detail with that, because I know people love the details. And not surprisingly, a lot of people listening love 80-20. The 80-20 principle are all applications of that. So if we rewind to your undergraduate, were you studying philosophy undergrad? Yeah, that's right. I came to university as an undergrad.
Starting point is 00:15:35 And what did the trajectory look like? What were the inflection points or key moments between that point and getting tenured? Yeah, so I think I was a lot less effective as an undergrad. I was still trying to discover how to do things effectively. And I made some big mistakes. I remember trying to train myself to only sleep four hours a night, and that was a disaster. That was the worst year of my life. Then I realized, no, actually,
Starting point is 00:16:07 sleeping well and exercising well is really important in terms of having good performance. One thing I realized as an undergrad was just that you can just make so much progress just by finding someone smarter than you and learning from them. And that's exactly what I did. Someone who's now a close friend of
Starting point is 00:16:25 mine just knew absolutely everything. And so I'd spend hours just talking to him. So he's Andreas Morgensen. He also got a professorship at Oxford. In the philosophy department. In the philosophy department, that's right. So we've ended up having a very similar career trajectory. And I've been a zombie and eaten his brains at every stage of the way. But then in terms of the big inflection points, I think really came, I mean, both through, you know, it was only when I moved to Oxford to do my postgrad work that I started to get a lot more serious about this. Also realizing, you know, getting funding to do philosophy post-grad is pretty hard. But then realizing a lot of scholarships out there that most people have never heard of actually don't therefore get that many applicants. So if you apply for them, you can actually have a pretty good chance.
Starting point is 00:17:16 And then managed to get funding for my PhD and extend it by a little bit more than other people do through a combination of, I think, eight different scholarships in the end. And so doing that, and then the big thing was just from the outset of starting a PhD thinking, okay, well, philosophy is incredibly competitive. If I do want to do this as a career, then I've really got to know what's actually expected of you. Like, on what
Starting point is 00:17:46 basis are you going to get hired as a professor? And the answer is weirdly disconnected from what you typically do as a grad student. At least in philosophy, people basically judge you on the quality of your best 5,000 or 10,000 words of work, whereas your dissertation is kind of 80,000 words or something and so if you want to really do well then it's just about making that highest quality stuff as good as possible and then trying to get it published in the very best journals so it's really about optimizing for your kind of peak output rather than just being this kind of big journalist got it what was your dissertation about and what And what was your best 5,000 to 10,000 words about? think it's okay to eat meat, but I'm unsure about that. You know, there are these vegetarians around
Starting point is 00:18:45 and they have these arguments and so on. And you're like, well, I don't quite know what to do. How should you act in light of that? How should you take uncertainty into account? And this is quite a strategic choice of topic as well, because almost it's like a very important topic, but almost nothing has been written about it. So I could read everything that had been written on it in about two weeks, whereas many people choose to pursue PhDs on something that's already had a huge amount of work done. And so that was what the PhD was on. And in particular, I argued that even if you're unsure about your values,
Starting point is 00:19:25 you should treat that uncertainty in the same way as if you're unsure about matters of fact or what's going to happen. So in the same way as if you wouldn't speed around a blind corner if there was some risk of a child playing in the street, even if you thought there was probably no one there, you still wouldn't want to risk it. In the same way, if you think, well, maybe something is wrong, I think it probably isn't, but it's a risk, and if it was, it would be very wrong,
Starting point is 00:19:44 I think you should take that same uh same course of action you should again kind of play safe as it were pascal's wager for life it's a little bit yeah it doesn't involve infinite amounts of value though that's interesting too but a little bit like pascal's way different life and then what about your your your peak or your your best five to ten thousand words was it the same subject or something completely different yeah so it was the same subject as well um and this is all just quite distinct from my effective altruism stuff but it was the same subject and uh in particular i saw it was this analogy between that sort of decision and a bunch of work that had been done in economics.
Starting point is 00:20:26 So again, this is something where you can make a lot of progress by doing research, just by combining two different fields, because the number of combinations of fields is far, far greater than the number of fields themselves. And so I argued that this problem was kind of like the problem of voting. So in just the same way as there's this problem of like, if you've got all these people with different preferences, how do you kind of aggregate that into one social preference or one will of the people, as it were? And there's just a ton of work done on that in economics that's really interesting and
Starting point is 00:21:01 often quite technical. And I said that was kind of the same as the decision under ethical uncertainty. It's like you've got all these different ethical viewpoints, and they're like different voters. And if you want to be able to make a decision between them, in light of that kind of uncertainty, you can treat them as like voters, and you can use those same kind of, all the same technical apparatus that had been developed in economics and apply it to that case of ethical uncertainty. Which, so you mentioned Andreas Morgensen. What other philosophers are your,
Starting point is 00:21:39 idols is a strong word, but role models, people you really look up to. If you had to, alive or dead, if you had to put sort of your top five or fewer philosophers on a list, who would they be? Yeah, so there are two that really stand out. The first one on the, he's more of a kind of academic, is Derek Parfit. How do you say, Derek Parfit? Parfit, yeah, P-A-R-F-I-T. And he spent his entire life at All Souls College in Oxford, which is elite even within Oxford. What was the name of the college?
Starting point is 00:22:11 All Souls College. All Souls. All Souls, yeah. That's intense. The way you get into All Souls College, and Andreas achieved this as well, is you have to sit 15 hours of exams, and you have to answer questions that can be on any topic like is china overrated why democracy how many people should there be uh and then the final three hour
Starting point is 00:22:32 exam is just on a single word sounds horrible sounds very uniquely philosophical too i mean in terms of academia it's it's crazy so it's known as the hardest exam in the world but it gives you seven years of funding and you can do whatever you want in those seven years what's the name of the exam it's called the all souls um prize fellowship i think ah okay so that so derrick is one he is one and he's you know he's now 70 um and he wrote a book called reasons and persons uh which i think is one of the most important books written in the 20th century. Could you say that one more time? Reasons and Persons.
Starting point is 00:23:09 Reasons and Persons. Yeah. And it argues a whole number of things, but two of the big things are, one is the idea that there isn't really a continuing self over time. So the difference, there's nothing kind of fundamental, fundamentally distinct between, different between Will McCaskill, age 28, and Will McCaskill, age 70, versus Will McCaskill, age 28, and Tim Ferriss right now.
Starting point is 00:23:41 There's only a kind of matter of degree between those things. And that has a whole number of implications. And actually, interestingly has been, is actually quite similar to big strain in Buddhist thought. And apparently his book has been used as a, as a text in certain Buddhist temples, they chants to it. He didn't realize this for like 30 years or something.
Starting point is 00:24:08 And then he showed up and there was some effigy to Derek Barfitz sitting on a throne in a cave in Tibet and well I mean it does resonate certainly with the certain discussions of self right whether it's a contemporary like Sam Harris who's a PhD in neuroscience, but also a very experienced meditator who wrote Waking Up, or texts that are thousands of years old. I mean, the concept of a static self is something that's challenged a lot in Buddhist thought, among others. So you have Derek. Who's your number two?
Starting point is 00:24:43 Number two then has to be Peter Singer. I was going to ask you about him, yeah. Yeah, he's a big influence, both in terms of my thought and in terms of how I'm approaching my life, the decisions I'm making, in terms of my own career. And he has two really big, important arguments. One is for the moral importance of non-human animals and treating them much better than we treat them as we do at the moment. And then also the importance of
Starting point is 00:25:14 fighting global poverty, especially for us in rich countries using, you know, it should be most of your income, donating that away to charities that will, you know, help improve lives or save lives among the poorest people in the world,
Starting point is 00:25:33 where he argues that, you know, if you were walking past a shallow pond and a child was drowning in that pond, and you could run in and save the child, and it would ruin your expensive suit that you were wearing, and it cost you a couple of thousand dollars as a result. You'd obviously go and do that. You'd be, in technical terms, an asshole if you didn't. He doesn't say that.
Starting point is 00:25:56 That's my interpretation of the argument. But if so, then what's the difference between that and the life that you can save in Malawi right now by distributing insecticide-treated bed nets to save a child from malaria? And he argues there isn't a difference. And he was immensely influential, so sold millions of books and caused a large number of people to really change their lives, including myself. Now, do you, so I am a Peter Singer fan, and I think that most people who are not, most people who protest against or picket against Peter Singer haven't read his work.
Starting point is 00:26:38 And certainly he has controversial stances, but I remember when he came to Princeton to teach and there were people picketing and rolling out their disabled children and so on because they were, I think, sort of adopting bastardized versions of what he had said from media. As much as I like Peter, I would propose well, you know, this isn't about me.
Starting point is 00:27:05 Let me ask you, do you think it's really the same in so much walking in with a suit and rescuing a drowning child versus saving a potentially sort of faceless child across the world of which there are
Starting point is 00:27:21 potentially millions, right? And the reason I bring it up is not to say that people shouldn't do it, but I had a conversation with, well, actually, I suppose it was a Q&A of sorts with Sam Harris about the, I think it's called the trolley scenario, and I'm not sure. Do you know what I'm talking about, right? So those people who are not familiar with this, the hypothetical thought exercise,
Starting point is 00:27:44 which is going to have a lot of real implications when we're using autonomous vehicles and programming AI and so on. So philosophy is suddenly a lot more relevant, or some of these thought exercises are more relevant than they maybe would have been imagined to become. You have, and please correct me if I'm wrong, And I think I'm just paraphrasing this, but if you had a railroad track and you could flip a switch for it to go down one track that was split to the left and another split to the right, there's one kind of like fat man on the left. And then there are four people on the right, uh, which, and you have to throw the switch. Do you throw the switch to the one person or the four? And people say,
Starting point is 00:28:26 of course, that they would switch it to the one. But then the second scenario, I think, and there may be more, is if you had to push the fat man off of a bridge so he landed on the track and it was an obstacle that then prevented four people from dying, would you do it? And all of a sudden, the percentages change very dramatically, right? And even though the utilitarian philosophical outcome is on paper the same, right? And this is, I know I'm kind of brain vomiting at you, but this is something I've really grappled with. So Peter Singer had a cover story, I believe it was the New York Times Magazine, Sunday edition. Huge piece on basically redistributing wealth for greater good.
Starting point is 00:29:19 And yet I couldn't necessarily point to a huge change in donation behavior after that article. So why don't more people donate? who have money, they would save the drowning child, but they're not going to send... For the same reason that I think that they're afraid of opening the floodgates. If they donate to one child, does it not then follow they should donate all of the money that they have to save not one child, but as many as they can afford to save. And then they get themselves into a very hairy position where they feel like they can't enjoy
Starting point is 00:30:15 the fruit of their labor because of the guilt that they feel. Does that make sense? Yeah. How do you address that? I mean, this is something that I've grappled with, and I know friends of mine, I mean, I have friends who've had to basically check out and basically do therapy who've been in nonprofits for a long time because they get to a point where they'll have dinner with a friend,
Starting point is 00:30:39 and they're like, for the amount you spent on that bottle of wine, you could have saved a life in Malawi. And the friend's like, that's a real dickish thing to say. Yeah yeah yeah i know a lot of people have worked in non-profits and gotten very disillusioned very burnt out yes how do you how do you think of addressing that yeah and so actually it is kind of indicative because peter singer was making these arguments since the early 70s really and there wasn't that much uptake of them until Toby Ord, another academic at Oxford,
Starting point is 00:31:07 and I set up Giving What We Can. That was in 2009. And we were saying, okay, give 10%. And yeah, I think there were a number of changes. So one is just that we emphasize this concrete number, 10%. And some people go further than that. Some people give 50%. I'm able to give away most
Starting point is 00:31:26 of what I earn over the course of my life. So that was one thing. Second was presenting it as just this amazing opportunity rather than this moral obligation. Yes, that's right. Yeah, the way Peter presents this is just, yeah, you're this
Starting point is 00:31:42 asshole if you don't do this. And then people are like, well, fuck you. And that's this kind of, I mean, it's a kind of natural human reaction. Whereas actually, most people really want to do good with their lives. You know, if you could be that person rescuing that drowning child, or if you could knock down the door to a burning building and save someone from inside, you'd feel like a hero. You'd feel great about yourself. And actually, that's the situation we're in. We're in a situation where you can do huge amount of good at little cost to yourself, maybe even actually given the psychological evidence and benefiting yourself because giving has a whole host of benefits to the giver as well
Starting point is 00:32:19 as to the receiver. So it's actually this amazing opportunity we have. And then secondly, I think the reason we don't give is just because a lot of psychological biases. So I remember when I was thinking about this, I just thought, well, I just don't want to be a sucker. You know, everyone else is getting ahead, being really ambitious. I was ambitious myself. I don't want to be holding myself back by spending all this time on non-profit stuff and giving my money away and you know going behind falling behind my peers whereas we've built up this community the effective altruism community where everyone's help kind of self-reinforcing you get really praised for doing more good or doing it more effectively and it's this really warm welcoming it's kind of new peer group a new part of your identity and i think that can really help overcome a lot of the reservations that people have.
Starting point is 00:33:09 And then I think the final thing is just in terms of the impact people have. So obviously there's a huge debate about how effective aid is, and I think it's reasonable for someone who's only vaguely heard about this stuff to think, oh, yeah, well, if you just donate, doesn't all the money get wasted? Because it's true that in very many cases, the money is squandered. The money is wasted. There's no impact.
Starting point is 00:33:33 But then by us actually doing research and saying, no, look, if you do this, if you pay, give money to Against Malaria Foundation to distribute long-lasting insecticide-treated bed nets for $3,500. Statistically speaking, you will save a life. Huge number of trials have been conducted on this to show the efficacy of bed nets. We can answer all your questions. Then it's like, okay, this isn't just this kind of Pascal's mugging situation where maybe I'll save a life, but I don't really know what's going on.
Starting point is 00:34:03 Actually, I know exactly what my money is going to go and do and you know it's still the child you save you still will never know but they become a little bit less faceless it's a little bit more concrete what you're actually going to achieve did you say Pascal's mugging? I might have said Pascal's mugging do you know about that thought experiment? no I don't I'd love to hear that though because that
Starting point is 00:34:25 should be the title of your next book i think okay oh my god i would love to write about pascal's loan um that scenario is where uh it's same as pascal's wager except um without uh infinite amounts of uh value at stake it's not about heaven and we should probably could you just briefly explain pas's Wager? For people who are not familiar or would like to get reacquainted, we have that as a baseline before we get to Pascal's Mugging? I love that we've got
Starting point is 00:34:53 onto this. So, Pascal's Wager is the idea that you should go to church because maybe you think it's incredibly unlikely that God exists. Let's say it's, you know, you think it's just almost no chance, one in a billion chance or something. But the payoff's just so great because it's an infinite amount of happiness. If you take a one in a billion chance multiplied by positive infinity of happiness,
Starting point is 00:35:15 well, that's still plus infinity an expectation amount of happiness that you're going to get. Whereas the costs of going to church are just not that great. And so if you're really even just out for yourself, just looking to maximize your own happiness, then you should try and believe in God and you should try and go to church just because the potential payoff is so great. So that's kind of his idea. That's Pascal's idea from 17th century.
Starting point is 00:35:46 Pascal's mugging is a slightly more updated version where Blaise Pascal is coming out of a pub and this kind of eerie figure approaches him and says, give me all the money in your wallet. And Pascal's like, no. And the mugger says, well i know you're blaze pascal i know that you think that the way to make decisions is to look at the probabilities of outcomes and uh their values and take that all together and if you give me the money in your
Starting point is 00:36:18 wallet then i'll come back tomorrow and give you any finite amount of um you know happiness or money or anything you could possibly want and pascal's like no and it's like yeah but like look into my eyes i'm like this kind of like slightly creepy figure you don't know that i'm this not this you know alien or someone with superhuman powers like you can't be absolutely certain of that so you should still you know by your own logic you should still give me this money. And it kind of a thought experiment goes to show that Pascal, you know, would have to say, even in cases that aren't involving kind of infinities or heaven and hell, you should still do actions that seem pretty crazy to us, like giving this mugger money on this tiny, if there's a possibility of asymmetrical reward, then you should take the bet. Yeah, that's meant to be the argument.
Starting point is 00:37:11 But it seems ridiculous, so something's going wrong. Right, well, I mean, yeah. Well, using that, right, everyone should invest in speculative startups, right? Yeah, that's right. Pascal's mugging. I love it. I would love for a drunk hipster
Starting point is 00:37:30 to try to mug someone with that approach in the mission. I'd be curious to see how that turns out. If he's in the bay, it might work. It might. It depends on who you catch, I guess, coming out of their juice bar or whatever. But so the – I got us off track just a little bit.
Starting point is 00:37:52 But in terms of why people don't give more, right? I think that there – we could look at the negative examples, right? So the guilting approach doesn't work very well because like you said, it just provokes a go fuck yourself response. I think rightly so, quite frankly. I mean, I think it's a very naive and insulting way to kind of go about it, which doesn't have for someone who's thought about things so rationally,
Starting point is 00:38:18 it's surprising to me that singer takes that approach because it flies in the face of like any type of negotiation research or behavioral modification research. That's kind of funny. But if I look at, for instance, the causes that I've been involved with on some level, and I have not looked at them on givewell.org, but I've tried to do the amount of due diligence that I could with the bandwidth that I have, whether it's, say, DonorsChoose that does a lot of work with education or Charity Water, for instance, they both do a very good job of concretizing the abstract. So they send you photographs, updates, letters, et cetera,
Starting point is 00:38:58 to make you feel like you are rescuing that drowning child in your own suit. And just to come back for a second, the mosquito nets that you mentioned, is that the type of conclusion someone could come to on givewell.org? Or are there other sites that they should check out? And we usually do this at the end of the show, but since we're on it, like what, what, what other resources can people use given their busy lives? The people who have the most resources to allocate to something like this are usually also the busiest, right? I think that's another challenge. So, so what's, what's the, what's the, the most elegant way, time efficient way to figure these things out for oneself? Yeah. So if you're, yeah, if you're busy, by far, the figure these things out for oneself.
Starting point is 00:39:49 Yeah. So if you're, yeah, if you're busy by far, the best thing is just givewell.org, um, where they just have these four top recommended charities. So they just try and find out what are the charities that doing the most good that we know of. And those charities are against malaria foundation, which distributes bed nets, saving life about $3,500. So the best charities often have the worst names. And so a couple are Deworm the World Initiative and the Schistosomiasis Control Initiative, which when I first researched it five years ago, I could not pronounce. And they deworm school children so people don't really know about this but over a billion people worldwide suffer from these parasitic worm
Starting point is 00:40:32 infections in their guts and they don't kill as many people as hiv aids uh tuberculosis malaria and so on but they do just make huge numbers of people especially kids sick and therefore they don't go to school um they earn less they're less productive later on in life. And they're incredibly cheap to treat, so they only cost about 50 cents per child. And then the final charity I recommend is GiveDirectly, which simply transfers cash directly to the poorest people in the world, you know, the very poorest people living in Kenya. And about 90% of the money that you give ends up in the mobile phone bank account of those extremely poor people,
Starting point is 00:41:08 which they then can then spend in whatever way they believe is going to most benefit themselves. So it's the ultimate charity if you're really worried about white knights coming over to try and help the problems of some other country they don't really understand. And the people who receive the money tend to spend it on assets like tin roofs and livestock. So yeah, that's the best place to look
Starting point is 00:41:32 if you want just an incredibly well-researched set of recommendations. So I want to, because I think that there are people listening who will have some questions like those I'm going to ask. I'm going to challenge, I'm going to ask questions that I think might push on a couple of places. The first is, are there any organizations or resources like GiveWell.org that are less human-centric? And the reason I ask is that the ROI seems to be measured by the number of human lives saved are there other organizations or people who have evaluated causes cause-driven non-profits ngos whatever they might be that are not focused on the number of human lives saved um yeah so there's one i actually helped to set up called Animal Charity Evaluators. And that's applying the same sort of, you know, attempting to have the same sort of level of vigor and research. But if you just, if who you want to help are just animals, where should you donate?
Starting point is 00:42:59 And they're a much smaller operation than GiveWell, but again, you can check them out for their sort of recommendations. In terms of if you're thinking about the environment, there I'm just less sure, actually. GiveWell is starting to broaden their research that they do, so they're working with a foundation called Good Ventures, which is set up by Cari Tuna and Dustin Moskowitz, who's one of the Facebook co-founders. And there they're looking into a much wider variety of causes beyond just global health and global development, including things like climate change, fundamental research,
Starting point is 00:43:42 policy reform, especially immigration reform and criminal justice reform, and trying to look for, you know, comparing across all different causes you could be interested in. What are those that are particularly great in scale? So it's just a very big problem, particularly neglected, so there aren't very many other, like, philanthropists or actors trying to solve this problem, so it's not very crowded, or particularly tractable, so where there's just really great
Starting point is 00:44:07 programs that haven't yet been funded but that we know are going to make a really big difference and some of the ones they're championing are improving conditions of animals and factory farms improving
Starting point is 00:44:24 immigration policy, improving criminal justice policy to reduce the number of people who are incarcerated while at the same time maintaining the same level or better levels of public safety, and then also risks of from of kind of global catastrophe from uh new technologies or from climate change or from uh developments in biology and so on got it thank you uh related question or quandary maybe for i think a lot of people listening so if if you look at a given high need population let's just say we're looking at,
Starting point is 00:45:08 you gave Kenya as an example, we're looking at Kenya. The reflex seems to be to help the poorest of the poor. Are there philosophers or philanthropists out there that you respect who disagree with that. In other words, people who say, you could give $10 to the 10,000 poorest people in Kenya, but I prefer to try to identify the, say, 200 most promising young students who could become the leaders of tomorrow and break the cycle of tomorrow and take their country, break the cycle of poverty in this country through policy reform and this, that, and the other thing as engineers,
Starting point is 00:45:51 blah, blah, blah. But it's a much more expensive per person proposition, probably, right? Maybe they have to be sent to the US or Cambridge or Oxford for education, for instance. How are people thinking about that? And is there anyone who, it's politically safe to say, we want to focus on the poorest of the poor. No one's going to rake you over the coals publicly for that, right? But are there people who take the opposite approach,
Starting point is 00:46:29 whose arguments you think have some validity or that are interesting? Yeah, so I think there are a couple of good arguments here. money and the fact that so in rich countries if you're you know earning above about ten thousand dollars per year you're in the richest you know 15 10 of the world's population even taking account the fact that money is goes further overseas and if you're earning above about fifty thousand dollars per year um then you're in the richest one percent of the world's population um and the poorest people in the world are only on about 60 cents per day or the equivalent of what $1.50 could buy in the U.S. And for that reason, additional resources to them just make such a much bigger impact
Starting point is 00:47:15 than additional resources to people in richer countries. In my book that just came out, Doing Good Better, I talk about this as the 100-fold multiplier. A dollar, to me, is going to do less than a hundredth as much good as a dollar going to some of the poorest people in the world. But I think there are a couple of ways in which there can be things that focus on people that are already competitively well-off that can do a huge amount of good. One is through research and innovation. So if you're increasing that. So a huge amount of good has been done in the past through developments in science and technology and medical research. So mobile phones got developed by Motorola in the 70s,
Starting point is 00:48:07 and now across sub-Saharan Africa, the majority of people own a mobile phone. So you do get this kind of over-the-long-term tickle-down effect as a result of research and innovation. And that tends to get underfunded by the market. And then a second thing is if you can kind of harness these incredible resources, which are these very talented or, you know, very ambitious or more well-off people and kind of direct them in a way that's going to do more social good. So that's, I mean, the approach 80,000 Hours takes. My nonprofit advises on career choice. You know, we explicitly, with a small operation, we need to focus,
Starting point is 00:48:49 and so we focus on, you know, those elite students, the kids coming out of Ivy League schools. Not because we think it's important, comparatively important, to, you know, make Harvard grads a little bit better off, but rather because, you know, they're the people that are really going to be the leaders of tomorrow, they're be shaping the world. You want them to be doing, you know, more to improve the world, both by having more kind of motivation to do good and using that motivation in as effective a way as possible. And so those are the couple of ways in which I think you can do a lot of good by,
Starting point is 00:49:25 you know, focused on, you know, focusing on areas other than the very poorest of the poor at the moment. And, um, there's, there's a, there's an organization that I'm involved with called quest bridge, um, that people can check out if they're interested that I think is sort of along these lines. And it complements in a way the underserved students that I work with vis-a-vis donors choose. But Reid Hoffman and others are on the advisory board for QuestBridge. So people interested can check that out. Did an interview with Reid Hoffman for this podcast that discusses that on, on some level, the, if we wanted to convince people to help others,
Starting point is 00:50:12 but to do it through pure self-interest, how would you, how would you go about doing that? And what would the, what would the form of giving look like? So in other words, uh, if you can prove to someone they will be happier if they give enough to save the life of one person per year, for instance, right?
Starting point is 00:50:33 I think we're going to get into Andreas a bit. I think that would go a long way to, and I know this sounds maybe cynical and terrible, but I don't think that saving a life is enough to get millions of people to donate money. Uh, it sounds terrible, but I think that appealing to self-interest is the sort of the, um, the Trojan horse necessary to open them to that experience. Uh, what would it look like? And I started thinking about this very specifically over the last year also because I'm involved with various cause-driven companies, both non-profit and for-profit. And I took my family, took my parents and siblings on a trip to Iceland last year. It was the first time that we'd taken a family trip in 15 years or so. And the anticipation of that was so much fun
Starting point is 00:51:35 and made it so much more valuable to the entire family, all of the brainstorming and the researching and the sharing of photos and so on before it happened that i started thinking of how that type of structure could be wrapped around something like cause driven companies whether non-profit or for-profit right so if somebody really wanted to get just pure self-interest i want to improve my quality of life my optimism uh my self-reported well-being, blah, blah, blah. How should they do it? Yeah, so I think, I mean, the psychology evidence itself does suggest actually that giving, I mean, it suggests a couple of things.
Starting point is 00:52:16 One is just that money is actually way less important than we'd think to making ourselves better off. The relationship between higher levels of income and higher levels of happiness is really very low indeed. Whereas other things like having a really good community around you is actually very important. Having a group of friends that really like you is very important to being better off. And so one thing is just, yeah, if you want to, like, start doing good,
Starting point is 00:52:46 especially, like, the effective altruism community, suddenly you find you've got, like, thousands of, you know, new friends who really want to support you and make you do better in life. And that's, I think, one reason why the people who have, like, in, you know, my peers who have started giving actually feel, including myself, just actually feel really good about this decision. Another thing is just the direct effect of giving. You get a kind of warm glow.
Starting point is 00:53:12 So people do tend to feel, and they've done little psychology experiments on this as well, looking at people who donate rather than spending money on themselves. And people tend to feel happier after having donated. They feel better about themselves. Is there any particular type of donation or type of cause that has the most significant impact in that respect? Does that make sense?
Starting point is 00:53:40 Like, is it, for instance, you know, and I know we don't want to do this, but if we put efficacy aside, what are the characteristics of the donation that has the most persistent effect on self-reported well-being? Is it being able, you know, to choose between the goat or the chicken or the fill-in-the-blank for the picture, you know, the kid who's pictured in the back? Is it something else? What are the characteristics of a sort of selfishly fulfilling charitable giving event? Yeah, so I don't know if there's any research on this, but I suspect it would be, if I'm honest, not exactly the sort of things I tend to promote. It would be things where, you know, your donation is quite public, but not in a way that comes across as sanctimonious, but just that people know you're doing a lot of good and where you get the kind of positive feedback as a result. So probably donations within your own community or where you can see the kind of tangible benefits of what you're doing. That's going to be a really big factor. And then I guess, if you're part of like a peer network
Starting point is 00:55:02 where you've got a number of people kind of all doing the same thing, that's also going to be, and those people who are kind of self-reinforcing so also saying like yeah that's really awesome what you're doing you're this you know really good person as a result uh i suspect that's also going to be one of the biggest ones in terms of um you know increasing your level of happiness whereas purely i suspect that um you know donating to someone just on the street where you never hear from them again, uh, that's going to be among the worst because that's, um, you know, that tends to sell you by making you feel guilty for a time. And then you donate in order to alleviate the guilt rather than this, this positive kind of
Starting point is 00:55:39 ongoing thing where you get, uh, consistent feedback and whether that's from the community or from the people you can actually see who you're benefiting. Well, it's a negative reinforcer, right, as opposed to a positive reinforcer. And if you look at dog training or really any mammalian training, that doesn't produce a lot of enthusiasm. Yeah, that's exactly right. Carrots work and sticks don't, at least not in a long run.
Starting point is 00:56:04 And the kind of worry I have about fundraising in general is that you get this competition between charities and you get this race to the bottom where they hold up bigger and bigger sticks. And it means that people just end up getting harassed. I'm going to get to Y Combinator in a second, but I want to make sure we come back to this, because I think many people, more people, would be willing to get involved with charities or nonprofits if they felt they would stop getting annoyed. And that's, or that there wouldn't be incessant follow-up with guilt, guilt, guilt, like every letter they receive. Yeah. I think a lot of people don't want to open the door to that type of haranguing and kind of, uh, incessant barking, so to speak, so they'd never take the first step. Does that make sense? Yeah, that's exactly right. Well, you know what, let's just cover it now. If you could make a plea or a suggestion to people involved with nonprofits out there and say, stop doing this, this, and this,
Starting point is 00:57:16 start doing this, this, and this, what would be on those lists? Yeah. I mean, one thing for sure. So I used to work as a, in the UK we call them chuggers or charity muggers. People who are on the street who then harass you for $10 a month. People seem to really hate that, as I know, having done it. And I think that's something that's like particularly particularly damaging another is these pictures that you get of you know children in poverty
Starting point is 00:57:50 with bloated stomachs and flies on their face and it does a couple of bad things just one is just that it makes people really not want to get involved because it's these kind of horrific images that very naturally you want to steer away from and then also just paints this really bleak picture of,
Starting point is 00:58:07 and kind of quite disrespectful picture of people in poor countries as those who are just helpless. Whereas I think if you were, like you say, doing positively enforcement. So instead you're like, Hey, you know,
Starting point is 00:58:20 this person donated a thousand dollars and was able to deworm two whole schools. Isn't that amazing? That's much more compelling. And if you could get charities to band together to have that approach, then I think you'd do a lot more good. And then I think the second thing would be in terms of the amounts that are asked for. I think there's also a race to the bottom in terms of different charities wanting to ask for less and less. Because, you know, if you've got a choice, oh, one advert's asking me to give $10 a month and others asking me to give $2 a month.
Starting point is 00:58:54 I've got the same feeling of guilt and I could resolve it either way, then I'm going to donate for $2 a month. But I think that's just not an appropriate reaction if the images they're showing you are of these starving children. And so, again, I'd rather if we campaigned to say the amount you should give is 2%, everyone should give 2%, and then it's up to you where you give it, but that's what you should be aiming to do. And then it's just this one-off thing. It's just this one campaign. You don't get asked in all these, because I mean, this is part of a classic bit of social psychology is you want to disaggregate costs and, sorry, disaggregate benefits and aggregate costs. Where kind of getting asked to make a donation as a kind of cost, it can be a bit unpleasant. But then you want the benefits, the kind of rewards you get to be as recurring as possible. And so having different charities saying, look, this standard two percent of your income or you know maybe you could try for more but that i think would be a decent amount to publicly say um then i think people could and you know you can make a huge difference with this i think people
Starting point is 00:59:55 could really get behind that um and would start to have a more positive view of charity and trying to help others let's segue to y comb Y Combinator, for those people who are unfamiliar, it's like the Harvard All Souls Navy Seals of startup accelerators. And
Starting point is 01:00:18 they would dislike the term incubator, but a lot of people have a better familiarity with that. People apply, very few get accepted, and there's some huge companies that have come out of it, Dropbox, et cetera. You participated in Y Combinator as a nonprofit, which I think is unexpected to many people
Starting point is 01:00:39 or seems like a mismatch. Can you describe how you came to apply and get accepted to Y Combinator? Tell me the story of how that happened. Yeah, so the charity was 80,000 Hours. That's the career advice one that went to Y Combinator. And we had been doing research into different career paths, and one we recommended really highly, actually, was tech entrepreneurship. And that was for a few reasons.
Starting point is 01:01:10 One is because we think that early on in your careers, you should be really just trying to think about the long term. You should think about trying to build up yourself as a person, your skills, your network, your credentials, how much you're learning. And trying to run a startup is one of the best things you can possibly do for that, or being in the early stages of a startup as an early employee. It also has potentially great payoffs in terms of the good you can do through entrepreneurship. I can tell you about some really amazing companies that are doing incredible things to improve the world. And then also, if you do get really big, if you have founded a Dropbox or an Airbnb or something, then you have huge financial resources
Starting point is 01:01:49 that you can use to make an absolutely massive difference, as Bill Gates has done. And so we were promoting that quite heavily, and that meant that when Y Combinator started to say, okay, we want to do non-profits, we're going to open the doors to non-profit applications where they'll just give a grant instead of an investment. A lot of people then contacted us,
Starting point is 01:02:17 and we thought, yeah, this is just a perfect fit. We have the same sort of mentality. So the non-profit space can be very stale, very unambitious, very unoriginal, whereas we want to be really big. We think we can give the best advice in the world for people who want to make a difference with their careers, and we think we want to reach everyone
Starting point is 01:02:39 who's graduating from university. So we have big aims. We're focused on numbers. We actually are thinking in quite a similar way to a for-profit startup. Y Combinator does seem like a really pretty good home. So we made the application, and it's a very funny process
Starting point is 01:02:55 because there's the application form and a one-minute video. Everything is incredibly condensed in terms of what the partner is actually the view. Application form and a one-minute interview, one-minute video of yourself. And we just got drunk and filmed it. What are you supposed to put in the one-minute interview? Or the one-minute, excuse me, video, you incepted me.
Starting point is 01:03:20 What is the content of that one-minute video supposed to be? Yes, in that one minute video you talk about i mean you can talk about anything sometimes it's just a conversation between the founders um but for us we talked about uh actually included the warm-up that we were doing to get ourselves psyched up for 20 seconds it was just us singing um but then uh talking about what exactly 80 000 hours does what's the problem how are we going to grow why are we why do we think we're a team that's good enough that we're actually going to be able to become an absolutely massive organization um that was kind of how we
Starting point is 01:03:56 approached it but the key thing is just and this is amazing how often founders fail to do this it's just actually conveying what you do do because you get the curse of knowledge where you're so invested in this project that you've got so much detail and so much on your mind. But then when you actually try and convey it to someone who isn't as familiar, then you completely bastardize it and people have no idea what you actually do.
Starting point is 01:04:24 Yeah, you drown them in the minutia and they can't see the big picture. Yeah, exactly. And this is something that the Y Combinator partners are so good at. Every single week we just have to give the one-sentence description of what we do. And for us it's 8,000 hours gives career advice for people who want to make a big social impact in their lives. So just actually explaining that and explaining exactly how you do it was absolutely the key thing.
Starting point is 01:04:53 And when were you at Y Combinator then? So yeah, we were the summer batch. So just the last three months of the summer, basically June, July, August. June, July, August 2015. That's exactly right. What were the most important things you learned or skills you developed at Y Combinator? Yeah, so the whole thing felt like
Starting point is 01:05:15 this exhilarating learning experience, a whole new lens on how you build something to get very big. And so a lot of great pieces of advice. One is just to focus on the product basically exclusively. You'll constantly be tempted to spend your time doing things that make yourself look cool to your friends and family, but that aren't actually making a better product
Starting point is 01:05:40 and that aren't therefore helping with growth. So, you know, you tend to do press. You'll be tempted to hire a lot of people because then you can say, oh, well, we've got 20 people on the team. Whereas hiring actually just takes a lot of time away from just trying to build a better product. The other thing is just focusing, picking your metric. So for us, that's the number of people. Ultimately, that's just the number of people
Starting point is 01:06:05 whose plans we've changed in a very significant way and focus on growing that metric by 10% every week. So we'd been growing at something like kind of doubling in size every year, and that makes us in the top 1% of charities, I think. 10% every week week that's 142 times every year it's just a totally different kind of level of ambition um and that really gives you focus so you're insured on doing just what's gonna um make your company or in our case
Starting point is 01:06:39 charity um you know bigger than better every single week and just doing whatever it takes to hit that 10% growth target. Yeah, that focus on product in all of, I have about 40 angel investments now. And if you look at that sample set and pick out the biggest winners, and some of them are on paper still, but a lot of them have already had large liquidity events. All of them ruthlessly focused on product
Starting point is 01:07:12 to the extent that if I brought them an amazing press or business dev, business development opportunity, partnership of some type, they would say, that looks great, but we're heads down on product, just not the right time, but we'll be sure to reach out in five months, six months. And it's that ability to say no to focus on product, which as you know, in this day and age is the best approach to marketing and customer acquisition that you can take since
Starting point is 01:07:41 word travels organically if if you take that approach easily the most common denominator when you when you look at the the home runs in my portfolio the i'm going to come back to yc in a second but you mentioned startups making a big difference and one of the startups i'm involved with for for instance, is Duolingo. And Duolingo now has, I want to say, 100 million plus users who are learning languages for free on Duolingo. And the founders include Luis Van An, who was the effectively creator of Captcha and ReCaptcha, which was sold to Google. And it's a very brilliant model. I mean, they're pulling real content offline or from clients who are paying to have things translated and using the crowd to translate
Starting point is 01:08:35 while simultaneously teaching them different languages, right? So ostensibly, you end up in two very interesting positions. You have hundreds of millions of people learning languages more effectively than through paid programs for free, indefinitely. And then you also have the ability to generate revenue through translation and certification and other things. And then you also have the ability to rapidly translate. So you could crowdsource, say, turning Wikipedia into some lesser-known language in 50 hours of total time, which would be, of course, thousands or tens of thousands of hours of human time, but it would be simultaneous through this program.
Starting point is 01:09:19 So they are, I think, going to have and are having a huge impact. What are startups that come to mind for you for profit that are having a huge impact? Yeah, I mean, I think that's a great example. And one of the things with for profits is you can just get so big and get so big so quickly. So being able to reach 100 million people, you know, it's absolutely phenomenal and quite hard to do if you're functioning as a non-profit because you're constantly having to fund base. My favorite example of a for-profit company making a really big impact, again, set by someone in the effective autism community, is also Y Combinator alumnus is called Wave. And it makes remittances that are sending money from a country that you've immigrated to um back to your home country which is typically poorer but to your family there there's an absolutely
Starting point is 01:10:13 huge deal so it's about half a trillion dollars sent in remittances every single year and compare that to overseas development aid spending it's actually remittances are several times as great. But if you're a Kenyan in Maryland and you want to send money back to Kenya, it's a real hassle. You have to go to a Western Union. The Western Union takes 10%. And what Wave are doing is enabling you to send money mobile to mobile,
Starting point is 01:10:39 so it's much easier. And they'll also only take 3%. And they're growing phenomenally fast at the moment. They've already got thousands of users and tens of thousands of users and are moving millions of dollars, even though they only just set this up, only launched about six months ago.
Starting point is 01:10:56 And the potential there, you know, if you just do the math, if they're able to really make a significant change to the amount of money that's flowing to poorer countries and remittances. It's just tens of billions of dollars every single year going from richer countries to poorer countries and not getting taken by these middlemen companies. It's got an absolutely astonishing opportunity to have a really big impact.
Starting point is 01:11:27 So if you reflect back on your time at YC, as the kids call it, what were the most common debates you had with other participants in YC or with the partners? Yeah, that's a good question. Yeah, I mean, one thing that was very common was how much time to spend on things that are going to grow your user base rather than product. So I said the advice is always focus on the product. But then there's always going to be exceptions, and you always wonder, well, is this one of these exceptional cases?
Starting point is 01:12:16 And that was just definitely a recurring issue because it's kind of hard to make the judgment call of, well, actually, we've already got this thing, and you're going to have to do some amount of distribution. So that was a really kind of ongoing thing. Actually, as were all the, maybe a lot of the biggest debates were just, so Paul Graham is the founder of Y Combinator, and he has these essays. And Paul Graham is something of this kind of guru or god amongst the Y Combinator startup community.
Starting point is 01:12:45 And he has these teachings through his essays. And then when do you deviate from the teachings of... When do you violate the scripture? That's exactly right. That was the ongoing thing. So similarly, another piece of advice was, you know, don't take any investment during the period of Y Combinator. It's just a distraction.
Starting point is 01:13:04 Just focus on glowing, and then do all that after demo day, which is the big presentation when you pitch to 450 investors in a big room. But then people would get approached by angel investors or VCs, and the question would be, well, should actually we be taking this? It looks pretty good. So again, there'd be ongoing debates about, you know, investors or VCs, and the question would be, well, should actually we be taking this? It looks pretty good.
Starting point is 01:13:31 So again, there'd be ongoing debates about, you know, when should they violate these rules? And Paul Graham even acknowledges this. He says every single year he gives the same advice to startup companies. Every year, everyone ignores it. And then every year they say later, oh, I really wish you'd listened to this advice. And so that was kind of played out in many different ways, actually. Similarly for the recruitment as well, that's something where they say, you want to have this exceptionally high bar for who you hire. And you either want to be spending all your time going to hire because getting the best team, especially the early team, is just so vital. It's the most
Starting point is 01:14:04 important thing that if you're going to be doing it, it has to be absolutely full-time. And the Airbnb founders, it was six months before they hired their first employee just because they wanted them to be so good. But again, it always be these questions, well, we could do more if we hired someone now. Is this one of these exceptional cases where we should violate that rule? So that was the kind of theme in terms of the debates that were things that
Starting point is 01:14:27 were on people's minds. Yeah. That's, that's another one where Pascal's mugging will kick you in the nuts, right? Because if you're like, well, there's a 1% chance that they could be the Michael Jordan of exactly what I need. So let me hire of them. Let me hire them. I mean, that's a, that's a very, that's a, like a kamikaze run, uh, at a ship. So you have to be very careful. Um, where does, where do, uh, favorite, where do some of your favorite philosophical
Starting point is 01:14:58 frameworks, uh, have trouble in the real world? Yeah, I think there's, I think all over the place, probably. So, I mean, a big thing is just there's so, like, the real world is just so messy. So you've got this idea, okay, I just want to do the most good. I want to help as many people as possible by as much as possible. Then actually implementing that is like much harder to do. So, you know, in the early stages, for example, of giving what we can, when we were doing research into child defectiveness,
Starting point is 01:15:41 we made kind of certain assumptions about, say, the quality of academic evidence, where there's this body of research from economists that we were really pretty happy just to trust. Because we're like, look, these are the scientists. They really know what they're talking about. They're giving these numbers. We're happy to go with those numbers. And it turned out, actually, loads of the search was really kind of crappy.
Starting point is 01:16:06 Um, you really couldn't trust them in the exact, in the way that, um, it would have been hoped for. Uh, instead you've just got to go a lot more with, you know, very in-depth, independent investigations of the evidence yourself. Um, and that was something where you've got this kind of philosophical you know philosophical motivation and then you make an assumption which is that the people doing the experimental work the empirical work that you can just kind of trust what they're doing turns out that's really sadly not the case science is a lot more broken than you'd think kind of coming into it
Starting point is 01:16:41 and so that was maybe like yeah one case where you make certain assumptions about how best to do goods. But actually, when you have to start confronting a really messy little world, things get a lot more complicated. Did you ever find, I don't know what accent that was that I just threw out, but that's okay.
Starting point is 01:17:01 Did you ever find, when surrounded by startup founders at Y Combinator that you felt demotivated in any way because you've pledged to donate everything you earn over around $36,000 per year to whatever charities you believe will be most effective. Did you, do you find it that that is ever a demotivator, the lack of that financial incentive? And I only ask because the, the ambitious set, the smart and ambitious set who make it into Y Combinator,
Starting point is 01:17:37 they're not pure, they're not one dimensional from a financial standpoint, but many of them want to build large companies to get exceptionally, exceptionally rich among, among other things. Right. And there are case studies of what, there are case studies of the incredible realities you can create for yourself. If you win one of those lottery tickets, or if you can execute well enough to become one of those lottery tickets, um, what was your experience like? Yeah. So I think in terms of my personal motivation, it's almost the opposite, I think.
Starting point is 01:18:17 Since I decided, okay, I really want to use my life to make a big impact, including making this commitment to give away most of my income, that's made me way more motivated. Because now it's not just kind of me on the line. It's like all these people that I'm aiming to help. It's like, you know, I could not all the time because I'd burn out, but sometimes it's the feeling of kind of urgency you get in like a war situation or something like, whoa, no, this is a crisis. It's an emergency. You've got to do something.
Starting point is 01:18:41 And if I was just, you know know just out for doing myself then I'd be I think much happier to have a somewhat more relaxed life but I do feel you know definitely during Y Combinator I'd
Starting point is 01:19:00 feel envious of the for-profit companies because in some ways so the two ways I think are just, yeah, three ways maybe. One is just how quickly you can grow because you can get investment. So the top, what company to companies were getting $3 million of investment
Starting point is 01:19:15 two days after demo day. Whereas if you're a non-profit, then you're just having to go around soliciting donations. It takes, you know, we have really great donors who are just very rational, and it's not nearly as arduous for us as it is for many other non-profits. But even still, it's just much slower as a process for growing. Second is in terms of the scale you can reach.
Starting point is 01:19:41 I think something like only 50 charities have grown to more than $50 million of revenue in the last 40 years, whereas Airbnb is just less than 10 years, goes to a $20 billion company. Many other examples of this as well. And given the kind of scale of our ambition, that's also something that makes me think, yeah, actually, that's a really pretty good model. And then, yeah, the final thing is, in terms of the sort of talent you can affect in as well, working as a non-profit, you have the kind of, you just not, it's much harder to be able to pay competitively to try and get in those people who are just super ambitious themselves.
Starting point is 01:20:27 So then you've got a much smaller pool of people, those people who are much more motivated by the kind of impact they're going to have. And that's like an extra difficulty as well. So it definitely made me appreciate the benefits of kind of for-profit models if you're wanting to have a really big impact. If you look at – this is another question I think a lot of people wrestle with. Give now or give later. if people were to survey the philanthropists currently most famous for rationally giving and making an impact, you would find people like Gates, for instance, right? But the reality of Gates is that, and no offense, Bill, but he was a predator who became an icon, who became a philanthropist. You would not consider him an altruist for the first few decades of his career. And so there are people, I actually had
Starting point is 01:21:38 someone say to me not too long ago, Mother Teresa was a narcissist. Bill Gates, with a strike of the pen, can do 100 times more than she ever did in her lifetime. And therefore, if you have even a small likelihood of developing the dynastic wealth of someone like a gate you're better served uh rather than kind of shaving off speed by donating along the way to focus all of your efforts on building an empire that you can then use for the greater good uh and no doubt this is not the first time that you've you've heard this type of thinking how do you respond to that or how do you how do you how do you contend with that type of yeah thank you so i actually just it's such an interesting question actually i often judge when i'm giving talks i
Starting point is 01:22:30 often kind of judge audiences by whether this question comes up this is a good audience uh but no it's so it's so interesting so i think like um yeah i mean firstly i think you can do as gates did just a huge amount of goods by what I call learning to give and have promoted that, where you aim to do good through your ability to donate rather than through direct contribution of your labor. And I think it's not the right path for everybody, but I think a lot more people should consider that than currently do. In terms of then when should you be donating, I think there's just a few reasons on either side. So if you've got these amazing investment opportunities that are just going to really pay off, then you should definitely take them.
Starting point is 01:23:12 Where going to college is the clearest example. If you're age 18 coming out of high school, then you could just start earning money and donating it right away. But that would be a real mistake. You should definitely get a degree, especially if you can go to a good university um just because of the impact it has uh for the rest of your life and actually in general when um we give advice at 80 000 hours we think that people really under invest in the long term because uh with their careers because most of the you know most of your hours that you're going to be spending working
Starting point is 01:23:47 is going to be after the age of 30. And also that's when you're more influential. It's when you're learning an organization rather than turning for it. Whereas a lot of people who want to do good immediately go and work in a non-profit where they're not going to get as good training or skills, skills network credentials
Starting point is 01:24:05 money as they would um in other organizations other places like the for-profit world or um sometimes further education as well so i think a lot of the time actually people should be investing more than they do when it comes to the idea of just okay i'm already earning a lot but i'm just going to invest it all again in building up my own organization and i'll donate it at the end of my life uh you know alarm bells ring for me a bit because a lot of people say that and they never actually follow through i'm sure yeah and so i think like minimally you should start donating a pretty significant percentage just to get yourself in the habit of it just so you know you're not telling yourself this lie. I think there are other thoughts as well.
Starting point is 01:24:48 So like donating has its own sort of compounding. It's like a sort of investment. So when someone in Kenya buys a metal roof, they get this amazing return on that investment. It's like 14% per year or something. So if you're giving to that poor household in Kenya who then buys the metal roof, that money that you've given compounds over time. You don't see it because the effects are kind of diffuse, but you've made the whole country that little bit richer in a way that compounds just in the same way as if you put it in a bank. On the other hand, though, you also just might really not know what the best ways of doing good are.
Starting point is 01:25:27 And so you might want to wait until you've just got better information or you have better views or actually able to think about this. And I think that's maybe, with a lot of entrepreneurs, kind of what's going on. Maybe you feel this yourself as well. I've got so much going on. If I want to do a really good job of uh philanthropy that takes time and so i'm just not able to think about this i'm gonna have to punt it to a later stage and so i do think there's a reasonable argument to be made there but uh maybe the things you could do is start kind of binding yourself to the mast a little bit maybe you can make some public commitments make
Starting point is 01:26:01 some like big declarations publicly such that you know that if you back out of them, it's going to be really embarrassing. Or you can take a pledge. So I'm an advisor of an organization called the Founders Pledge, which provides you with a contract so you can legally bind yourself to give at least 2% of your income or 2% of the profits that you make when you exit your company. And I think that's, you know, again, one of these things where you can say, okay, to begin with, I'm just going to focus on, you know, building this thing
Starting point is 01:26:35 as much as possible. But I know that I've like, actually locked down my intention. So I'm going to follow through on this later on. That's the founders pledge. Founders pledge. That's right. How many people have made that pledge to date? That's a contractual obligation? That's a contractual obligation. Who is the counterparty? Who are you contractually obligated to? So the way that works is you still can donate anywhere ultimately,
Starting point is 01:26:59 but it has to have an entity in the contract just for legal purposes. So you donate to this organization, the Founders Pledge itself, that would then redistribute the money wherever you wanted it to go. So you don't have to make a decision about where the money goes until you've actually made the donation. They act as a trustee of sorts. That's right, kind of intermediary. Got it. I've read you write, following your passion can be a mistake. Could you elaborate on that?
Starting point is 01:27:34 Yeah. So when it comes to career advice, there's all these slogans that go around, the chief of which is follow your passion. And the idea is like, it's kind of like the idea of having a soulmate or something so you just look inside yourself and you've got this calling and it's like oh i should be an artist and then you see that calling inside yourself and that's what you should go and do and that's the way to be happy and i just think this is terrible advice and that's for a number of reasons so um one is just that actually one is just that most people don't have work-related passions. So there was one study that found that most people were really passionate, study of students, but they were passionate about things like arts, music, sports,
Starting point is 01:28:17 things that are incredibly difficult to actually work in, precisely because everyone's passionate about them and so everyone wants to pursue them. So it's not really taking the world or what the world needs into account. And it sets you up for kind of anxious soul searching or then trying to pursue this thing that just statistically speaking you're probably not going to be successful at. But I think it also just misconstrues the nature of finding a satisfying career and satisfying job, where the biggest predictor of job satisfaction is mentally engaging work. So that's the nature of the job itself. It's not actually got that much to do with you, though obviously that is important to some extent.
Starting point is 01:28:55 It's whether the job provides a lot of variety, gives you good feedback, allows you to exercise autonomy, contributes to the wider world. Is it meaningful? Is it actually, is it meaningful? Is it actually making the world better? And also whether it allows you to exercise a skill that you've developed. And then that's the final thing where you might think following a passion in terms of that is all, do something you're good at. But the thing is, if you're just starting out on work, you're probably just not that good at many things that are work-related. You know, when I was graduating, I hadn't done any, really done any management or fundraising or marketing or any sort of, anything of the skills that actually get used day-to-day in
Starting point is 01:29:37 work life. And so what you should be thinking when you're first coming out of university is, you should be thinking like an experimental scientist or investigative journalist or something you should be thinking well what are my hypotheses about the things i could become good at and then actually going into the world and then testing that finding out um hey like maybe i could become good at coding and that's something that the world really needs at the moment it's just huge demand for coders um and then actually going out and trying that uh because um people's preference that's the final thing it's just people's preferences and passions change massively um in ways that people systematically under predict so if you think back 10 years what were you like 10 years ago what were the things you're really passionate about probably quite different from
Starting point is 01:30:23 the things you're passionate about now um but yet when we think in 10 years time, we think, oh no, I'm just set. I'm the same person now. And so really what you want to be doing to begin with is building up like a broad array of skills, figuring out what are the things I can become good at. And that's the much better way to lead to kind of a successful and effective life. I agree on all those points. I think that there's another bullet, which is, I wrote an article years ago. I think it's just called The Dangerous Myth of the Dream Job. And I think the other issue, well, there are two issues that I'll underscore.
Starting point is 01:31:06 The first is, as you said, humans are very bad at predicting what will make them happy. Extremely famously, statistically bad. There's a great book by Daniel Gilbert called Stumbling Upon Happiness that goes into some depth on this. Now, that's great. Pretty depressing conclusion. How do you address it? I think, uh, the, the, the second bullet is realizing that the, one of the best ways to extinguish your passions sometimes, if you are using it to be synonymous with hobbies, let's say you surf on the weekends on, you wake up on a Saturday and
Starting point is 01:31:45 you surf every Saturday, you love surfing. Therefore you think you should follow that as your passion. Very different. That experience and the purpose of that experience is very different from waking up at six every morning to take, you know, investment bankers out to surf every morning from Monday to Friday. Right. And so the, I think people overestimate the persistence of their enthusiasm in that, in that switch from optional activity, you know, electional to obligatory. Um, yeah. And being in philosophy, I'm very familiar with this. So I'm one of the, you know, I'm really lucky in terms of the position I got. There's far more people wanting to do philosophy than as a career that actually make it.
Starting point is 01:32:32 And you see so many people like this who go into it because they really love this subject. They just wanted to learn. They found it incredibly intensely satisfying. And then they find out that actually in the real world they've worked to do this. They've just got to jump through loads of hoops and do loads of networking and bureaucracy and admin just as if they were in any other job.
Starting point is 01:32:54 And it can be really pretty dissatisfying. You can end up, you know, it can be really actually pretty tragic where you end up hating the thing that you used to love the most above everything else, which is very common, extremely common. So you are a very effective young man,
Starting point is 01:33:12 I would say. You're welcome. It's objectively. I think that's pretty easy to objectively assess. You've achieved, you've, you've, you've achieved many things that it would take people a lifetime to achieve,
Starting point is 01:33:29 if they achieve it at all. So congratulations, first and foremost. But the question I'd love to ask is, what book or books do you give the most to other people as gifts? Yeah, so we talked a bit about the moral philosophy that was peter singer and derrick parfit so i definitely give them then but for uh in terms of just improving your life and um just being more effective for the two i'd mentioned one is mindfulness by mark williams and dann Pullman. And, you know, having had Sam Harris on the show, obviously your listeners will know about this, but Mindfulness Meditation is the most, I don't know, it's kind of like this just recently gotten onto where
Starting point is 01:34:25 in effect you train yourself to be more in control of your thoughts and emotions by realizing that the current thoughts and emotions are not
Starting point is 01:34:43 you, they don't define you they're like propaganda and it's up to you to choose how you react to them. And our instinctive way, which is to fight with them, is actually counterproductive. Instead, you want to accept them with warm and welcome kind of curiosity almost. And then that means you have the ability to deal with them as you like. And that's got the most amazing evidence base in terms of basically seems to improve everything but in particular mood and self-control who are the authors of this again mark williams um a professor at uh oxford in psychology who is the real kind of champion of you're really part of the oxford mafia i know
Starting point is 01:35:22 i know it's also it's also nepotistic. That was a coincidence, though. It's actually one of the few authors I've written to just to say, look, this book just significantly improved my life. You should be very happy of what you've achieved. And then Danny
Starting point is 01:35:41 Perlman, I think his name is, who is kind of journalistic, but also promoter of these ideas. And it's just a really good course for it. I liked it because I'm, you know, a big science fan. So I hear something like meditation, and I, you know, get a little bit freaked out. It sounds a bit hippie for me. Whereas this is just,'s almost comically dull in fact um i mean they give these kind of guided meditations and you're used to hearing this kind of female high-pitched dreamy voice and instead you get this um uh you know broad midlands english accent saying
Starting point is 01:36:20 now sit on a rug or a chair or on a bed and close your eyes and it's um really very kind of surprising when you first listen so it's just like the worst dad bedtime story ever but effective nonetheless but then really good if you feel kind of intuitively a bit uh skeptical of that sort of thing because um it's a very friendly very very accessible introduction to mindfulness meditation. And it provides you with a course over eight weeks. We do a series of guided meditations. And I did that course. And it's one of the things I think has a really significant impact on my life.
Starting point is 01:37:01 Do you have a daily meditation practice now? I actually don't, but I'm going to start again. I mean, I think there's two things. Uh, yeah, I think I'm going to start again just after, um, probably just after the gym. Um, cause I, uh, go to the gym first thing every morning. Um, and then after that point, just do, you know, it can just be 20 minutes breathing. Um,
Starting point is 01:37:32 we first focus on your breath and then, uh, extend that feeling of awareness to your whole body. Um, uh, I still meditate if I'm feeling, uh, you know, stressed or anxious about something.
Starting point is 01:37:43 It's a really nice kind of go to, um, you know, stressed or anxious about something, it's a really nice kind of go-to activity that you can do to kind of put yourself, kind of reset yourself. Right. But then also the other thing is just it starts to affect your entire approach to life. So, you know, you'll start to feel the rising panic and you're much more in tune with your bodily reactions that then turn into thoughts. Um, and then again, you can kind of catch those bodily reactions to begin with again, kind of slow down your breathing, um, focus on the breath
Starting point is 01:38:16 and then realize that it's up to you how you want to respond. Um, and that can be very powerful because it means you have much more choice about your emotional reactions to things. What was the second book? So the second book is The Power of Persuasion by Robert Levine. So I'm really in favor of meta skills, just these kind of general purpose skills that can improve your effectiveness in all areas of life. And just the ability to be convincing, to sell ideas and to persuade other people is one of the most important of these skills, I think.
Starting point is 01:38:59 I like to think of myself as taking laziness and making it into a virtue because why do something when someone else could do it? If you can make that into a virtue, then suddenly you find you have all these volunteers helping you with this thing you're trying to create and they turn into employees and then they're the way doing the sort of stuff that you could have instead just been slogging away on yourself for years. So The Power of Persuasion. The Power of Persuasion.
Starting point is 01:39:29 And I don't think it became that popular, but it's the best book on persuasion that I know of, actually. Levine. Yeah, and it's quite a lot more in-depth than things like Cialdini's Power of Persuasion and some of the other books in that genre. But it's incredibly interesting and really lays out different principles for, like the key ideas for persuading someone.
Starting point is 01:39:55 So like norms of reciprocity or escalating commitment. And also just really shows how being persuasive, a lot is often just about being a really nice, authoritative, genuine, honest person. So, you know, the key aspects of being persuasive are honesty, honesty and authority. And many people, they think, oh, well, I want to become someone who can, you know, be persuasive. They then turn into these kind of sleazy second time car salesman types. And that's exactly the wrong thing to do. Um, instead it's actually about being this, uh,
Starting point is 01:40:43 um, you know, transparent person who really knows their stuff um and yeah i found that kind of very useful especially as someone who is trying to you know my life is about selling people on certain ideas ideas of effective altruism well i think everyone's lives are about selling other people on their ideas yeah yeah basically i mean it's just it comes up absolutely everywhere um and it's just very thorough very in-depth and really goes uh to goes to the level of breaking down into like really concrete principles like earlier i mentioned um you want to aggregate harms and disaggregate benefits so um it's more enjoyable for you to win $50 one day and $25 the next than it is to win $75 one day. Whereas if that was a cost, then you'd prefer to just lose $75 at once
Starting point is 01:41:35 rather than have two distinct losses. So again, it goes to the level of very specific recommendations. And then also has amazing case studies of, it does go to the best stories of working with the very best salespeople in all different areas of life. So it's the best book that I read on that topic. I'll have to check it out. Your morning ritual,
Starting point is 01:42:06 you mentioned working out first thing in the morning. What is the first 60 to 90 minutes of your ideal day look like? Yeah. So in terms of morning routine, I think the biggest, I think maybe the single piece of productivity advice or, you know, productivity improvement I made was sleeping enough. People's need for sleep just varies massively from person to person. Some people can just sleep
Starting point is 01:42:33 four hours a night and they're very lucky. I'm not one of those people. And coming to accept that was very important. So I aim to sleep nine hours a night. And then when I wake up, it's really about getting up and going. When do you wake up? What's your normal range? Yeah, I typically about 9 a.m. So I'm not a super early riser either. I'm really not much of a morning person. So typically about 9 a.m.
Starting point is 01:43:02 And then, yeah, it's just about getting up and going. So I eat things that I can hold in my hand. I really hate cereal. I hate things where I have to spend a lot of my time. Because also the morning is my peak time in terms of mental performance. And so I typically eat breakfast bars, which I'm sure you're going to chastise me for, for being unhealthy. But that's what I do in the morning, go to the gym. Again, then that's probably the second most important piece of productivity advice is just
Starting point is 01:43:38 regular exercise. Again, because I feel like in terms of my life and what I've contributed, almost all of it is in terms of the highest quality work I'm producing rather than how many hours I'm producing. So the thought of like, oh, I can sleep less and then produce more hours is just completely false economy. Instead, it's just how can I produce the highest quality work? And for that, again, just exercising in the morning is the most important thing. What type of exercise? What does your routine look like? Yeah, so now I've suffered from fairly severe back pain over the last year and a half. So that's changed things quite a lot.
Starting point is 01:44:20 And now, so my favorite exercise, which I was able to do when I was in Cambridge, but, uh, not here because I don't have the machine is called the Jacob's ladder. Um, do you know it? I do. Yeah. Maybe you could describe it for folks. I think describe it. So, um, uh, it's wooden bars, uh, forming a ladder that are on, um, kind of conveyor
Starting point is 01:44:42 belt. So they're constantly going down and you're constantly climbing up. So it's about 45 degrees, so it's not vertical. But if you know, if you've ever just tried to climb up a ladder, you get pretty tired pretty quickly. And you're attached to the machine so that you're able to set the pace as well. So it's kind of like a treadmill, except you're just climbing up these bars. And it's great for me because it's low impact because i can't do high impact stuff at the moment
Starting point is 01:45:07 uh but it's incredibly tiring i remember when i first did it i could do about two minutes and then i would be completely conked out um and then i would build up i built that up over time uh and it's the most you feel like your entire entire body is just completely spent by the end of it. Yeah, so it's my favorite exercise. And then how long does that workout last? About an hour. I initially, if my back's bad, then I try and focus it more like an hour and a half. But a lot of that time is spent doing physio exercises.
Starting point is 01:45:48 How did you hurt your back? Yeah, I don't actually know. I think most cases of back pain actually don't have a clear problem, but I think it was bad posture. So my best guess as to what's going on is anterior pelvic tilt. So where your pelvis just tilts forward too much and you're going to stick out your belly like kind of beer belly style. And so for there, the key is to really strengthen your glutes
Starting point is 01:46:17 and your abs, stretch out your hip flexors and your lower back so that you're strengthening the muscles that are pulling your pelvis back and stretching out those that are pulling it forward. And you get all these problems from sitting all day. I would have terrible posture as well, so I don't do that a lot using an ergonomic kneeling chair, experimented with a standing desk, and in general learning a lot about posture
Starting point is 01:46:43 because really I think the kind of common conceptions of what good posture consists in were just completely wrong actually um uh yeah people think it's about sitting up very straight and very rigidly whereas actually it's more about getting the curve of your spine right so um having your hips tilted too far forward so that your lower spine makes your belly stick out that's a very common problem and then also having your shoulders kind of hunched forward and then your neck and head up kind of like a duck another very common problem and so actually if you want to test your posture you can just stand against a wall and you should only have two inches between.
Starting point is 01:47:26 So just standing up straight against the wall, you should only have two inches between the spine, your spine and the wall and between your the kind of curve of your neck and the wall. And for almost everyone who does that, they'll find that, especially at the neck, there's just much bigger gaps than there should be. Yeah. It's also, that's also hard hard if you have a horse ass like I do, like a Kim Kardashian ass. But for the back, a couple of things that might be helpful.
Starting point is 01:47:57 Where do you feel the pain in your back? And this could be referral pain and not the location. The sensation locate, the sensation of pain could, might not be the location of the pathology, but where do you feel the pain the most? Yes. It's very lower back. Very lower back. So a couple of things that you might find interesting to play with there, if it's the low back, uh, would be number one. And you can check out Kelly Starrett.
Starting point is 01:48:26 Maybe you've seen his stuff, Mobility WOD. He's been on the podcast as well. But trying to get the head of your femur to seat at the back of your pelvis. So by sitting constantly, it tends to get pushed to the front of the hip capsule and it causes all sorts of issues and soft tissue changes and whatnot. So if you look at exercises, they're pretty easy to do just like on your hands and knees and you lift one leg up and move your weight around as you apply it to one leg. But if you were to look up sort of seating the femur in the pelvis and Kelly Starrett, I think that could be very helpful.
Starting point is 01:49:11 And then secondly, have you considered using or used inversion tables or gravity boots so that you can? No. So I found this to just be tremendously valuable where I'm decompressing my spine and putting myself into a state of traction. I try to do it at least once per day. And most frequently, I'll do that at night. So I'll hang from my – try this out because I've talked about this before on the podcast. And I've had literally dozens of, uh, men specifically,
Starting point is 01:49:45 but my audience is, uh, or the people who listen to the podcast are about 80% male, uh, come back and say that they've eliminated years of back pain doing this. And I'm not a doctor, of course, this isn't medical advice, but if you hang from your hands in the morning and then invert yourself at night for a short period of time. And if you can't invert, there are other options. But you could look at Teeter hang-ups, T-E-E-T-E-R. They're different models, but Teeter hang-ups. I use the boots. There are risks involved with hang-ups
Starting point is 01:50:15 that I'm like Batman from a bar with boots on, obviously. You could use the inversion table. I just use the boots. And if you can't do either of those due to space constraints or travel or whatever there's also a device called the links l i l y and x which allows you to put your lower back into traction on the ground and it's very small i have one about 15 feet from me behind my couch uh and making it a habit just as an experiment to hang twice a day once from your hands and then again from your feet if possible uh particularly for that like
Starting point is 01:50:53 iliopsoas pelvis low back complex i think you could find um yeah yeah that would be a worthwhile experiment um this is great i thought I knew all of the back pain remedy tricks, but you've educated me. I'm going to try this as well. You could also, if you want to, enjoy some masochism, but potentially
Starting point is 01:51:18 reverse some of the soft tissue issues that you have, no doubt, in your pelvis from the sitting and the sort of kyphosis lordosis that upper backgrounding and then the anterior pelvic tilt that like sway back position is you can find an art practitioner i'm sure there are i don't know if there's probably one at the very furthest from you or maybe maybe it's farthest. I always screw those up, uh, London, but ART is active release technique and they'll basically take, they'll like form their hand into it,
Starting point is 01:51:51 their fingers into like a ridge hand and, and dig it three or four inches into your pelvis and have you move your leg around. It's extremely uncomfortable. Um, you might need a safety word, but it's, uh, that, that can have a tremendous effect on mobility and, uh, the gliding of adjacent tissues and things like that. So those would be worth checking out, but I don't want to make this about me spouting off, but since I know a lot of people who've suffered from low back pain and I previously suffered from low back pain, which I do not suffer from anymore.
Starting point is 01:52:26 That would be a suggestion. And the other thing is what I found is standing at a standing desk all day long is very challenging, particularly if you're moving from location to location. If I ensure that I walk an hour a day, which we're really evolved to do. We've made a lot of compromises from an evolutionary standpoint to be able to walk for long distances. That also helps to keep that hip complex functioning normally.
Starting point is 01:52:59 And you're getting, at that point, sort of a high volume of low intensity stretching yeah which which is very valuable so this would be my two cents but uh well it's a big deal i mean so many people it's like a plague or something um number of people who suffer from back pain and then it just can be completely debilitating i've lost months of productivity as a result. So what I would, you can speak to your PT about this, but I think that oftentimes the reason people have low back pain and then cannot squat is because they don't squat enough in the first place. So the hanging, if you were to do that for a week, see how you feel,
Starting point is 01:53:43 and then find a good Olympic weightlifting coach, not a powerlifting coach, a good Olympic weightlifting coach who can train you to do overhead squats. And it may take a long time for you to get to the point where you can do proper overhead squats where you're not losing stability. But that can be a complete game changer. I mean, I'm 38 and my hips and knees are better than they've been in probably 15 years. And I directly attribute that to regular deloading and decompressing of the spine, as well as a regular squatting practice where I'm squatting every day, even if it's just for five repetitions with 45 pounds on my back, or in front of me, or overhead. So that's a rather massive digression,
Starting point is 01:54:34 but that's okay. Well, you may have, yeah. Let me know how it goes. Yeah, you may have given me months of extra work, so I hope it helps. I know how debilitating it can be. Do you have any evening rituals, any evening routines for winding down? Yeah, actually, I really don't have an evening routine, except insofar as I always take an hour or two off before going to bed, because I used to work until I wanted to fall asleep. But that's just, again, a false economy, because it means you just wake up much less energized.
Starting point is 01:55:16 And again, it's just eating into peak productivity time. I travel a lot, so if I need to reset my sleep, then I take melatonin, and then I focus a lot on getting high-quality sleep if I can. The most important thing there is just completely shutting out light so that you're not waking up in the middle of the night at all. But in terms of something to decompress, normally just the regular things of seeing friends or reading. Generally trying to avoid watching TV or anything that's kind of bright, unnatural light.
Starting point is 01:55:55 Speaking of bright, unnatural light, what are your favorite documentaries or movies? So I think by far my favorite documentary maker is Louis Theroux. I don't know how popular he is in the U.S. He's a bit of a U.K. institution. Have you heard of him before? I've heard of him, but I couldn't name any of his work. Okay.
Starting point is 01:56:16 So he, I mean, the most interesting is Louis Theroux's Weird Weekends. He tends to go to places in the U.S., to these weird subcultures, and he does exceptionally well at becoming involved in those subcultures. So examples are neo-Nazis, survivalists, the Westboro Baptist Church, swingers; he goes to prisons, porn, cosmetic surgeons, and the Black Power movement. And he comes across as just so bumbling and naive that the people he's filming then completely reveal everything about their own crazy lives. And that's Louis Theroux, go ahead, T-H-E-R-O-U-X. That's right. And it's incredibly powerful because
Starting point is 01:57:10 you see him interacting with these neo-Nazis, and they're grilling him on whether he's Jewish, and he just keeps telling them he doesn't want to answer, and you see the parents getting their children to dance around the swastika as kind of morning playtime.
Starting point is 01:57:28 And you think, wow, I'm so happy I'm not one of these people, and, you know, so enlightened, because they're so mind-killed, they're so completely captured by their own ideology. But it makes you think, well, what are maybe the things I believe just because of the people that I'm surrounded by, all these kind of cultural things? Because they're just completely convinced of this worldview, the people who are looking for UFOs all day, or people
Starting point is 01:57:54 who are certain that the government is going to come and crack down, or the Westboro Baptist Church that thinks literally everyone is going to hell apart from them. And it makes you think, yeah, maybe the things that I believe myself will in the future be looked back upon as just as crazy as I'm looking at these communities and thinking that they're crazy. Oh, I think it's no maybe at all. I think it's 100% certain.
Starting point is 01:58:26 I mean, I think everybody should take the approach of good doctors, or I should say the sort of perspective of good doctors, which is: 50% of what we know is wrong, we just don't know which 50%. Yeah, that's exactly right. And I think most people don't tend to act that way. They're much too accepting of the status quo. Oh, yeah.
Starting point is 01:58:47 There's just a quote that I always use, and no doubt should implement more in my own life, although I try quite hard: when you find yourself on the side of the majority, it's time to pause and reflect. That's Mark Twain. But most people interpret that to mean the majority of, say, the US, whatever their nationality happens to be. And I would just say, no, no, no. Even the majority of your friends: if you have a narrative that you're telling yourself, and it's within a peer group, even if it's 10 people, 20 people, you should really examine that. Have a regular check-in.
Starting point is 01:59:23 I think politics is the one that's the key, that's the biggest influence here, where, I don't know, people will identify very strongly as very left-wing or very right-wing, but it always strikes me as a very strange thing to do because there's this package of very different ideas associated with the left or associated with the right that don't have any resemblance to each other.
Starting point is 01:59:46 Like, why on earth should your views on abortion be related to your views on optimal taxation policy? They're completely distinct issues, yet they come in these packages. And I think it's because, you know, we're all monkeys walking around wearing suits. We want to form tribes, and then we start forming tribes based around, say, political identities. But then that means you'll just
Starting point is 02:00:11 start to buy a package of views rather than just looking at each one on its own merit. Oh, I think, yeah, that may be a whole separate conversation. I think humans can learn a lot about themselves and their biases by reading at least one book on chimpanzee behavior. There's one in particular that's popped up a lot in my reading about animal training and evolutionary biology and whatnot, because I have a new puppy, adopted a rescue puppy. And it's called Chimpanzee Politics: Power and Sex Among Apes, written by Frans de Waal, W-A-A-L. And this book apparently was used by, and I think he's mentioned this publicly several times, Newt Gingrich in amassing power and overcoming opponents in his political career. So I think the parallels are fascinating,
Starting point is 02:01:09 and it's easy to convince ourselves that we are passionate about a particular position because the position has merit, whereas in reality I think a lot of it is just a hardwired desire to fight and dominate and be right and so on, which you can trace back to chimpanzee behavior or find parallels. And it's depressing, but I think also helpful at the same time. Yeah, it's a really useful lens, I think. If you could have one billboard anywhere with anything on it, what would it say?
Starting point is 02:01:50 That's a good question. So I think it would be outside the Gates Foundation, or maybe outside Bill Gates' house. I don't know where that is, but in Seattle, where ultimately he's going to donate $100 billion. You know, you've spoken about the risks and potential upside from the long-run development of artificial general intelligence, yet you're not doing anything about it yet. You haven't got involved. You have the power to make a massive difference here.
Starting point is 02:02:37 You should, like, do something about it. I think that's what I would say. So artificial intelligence, generalized artificial intelligence. Human level and greater than human level artificial intelligence. I did not see that coming at all. You did not see that coming. No, this is...
Starting point is 02:02:55 Late on into the interview, it's a whole other... Yeah, yeah. This is act three. Yeah. That's... A big debate in the media, but yeah. And there's another
Starting point is 02:03:09 Oxford professor, Nick Bostrom, writing about this in a book called Superintelligence. Oh yeah, a very, very famous book. Yeah, a very important book. Where, you know, sometimes I think it's very hard to predict the future, but sometimes I think you have a bit of an inkling, you're able to make really pretty educated guesses about what are going to be really big transformative technologies in the future. The sort of thing like
Starting point is 02:03:36 the development of fission, that has huge potential for power and also huge potential for harm through the use of nuclear weapons. I think in the case of the development of artificial intelligence, you know, it's not going to happen tomorrow. We're thinking about like 30 years or 50 years, or by the end of the century.
Starting point is 02:04:00 It's clearly, it's really pretty likely, going to be one of the most important, or the most important, developments essentially, when it does happen. Like, you know, if we could have known about nuclear weapons or developing fission much earlier, we could have had policies in place so we were really prepared for that, and we maybe wouldn't have had a nuclear arms race. The world would have been a much better place. I think that's the situation we're potentially in with the development of artificial intelligence as well. So I was planning on wrapping up after another two minutes, but I can't let this one go. So you're hanging out with, you mentioned Nick, Superintelligence. You're surrounded by, or you have access to, some very smart people
Starting point is 02:04:45 who have thought a lot about this; it seems like you have as well. What percentage of those who are most educated about the potential implications and ramifications of AI are strongly concerned that it's summoning the demon, or something? The demon? Summoning the demon. Oh, summoning the demon. Yeah, I mean, summoning the demon is quite an extreme way of putting it. It is. Elon Musk. Right, that's Elon Musk. So it depends exactly on who the reference
Starting point is 02:05:18 class is, but on some accounts, it's the large majority actually, where then the media just completely distorts the debate because the media loves to distort debates. Where if you're framing it as this is a really important issue, it's not something that's going to happen tomorrow. It's something that's like a long-term speculative issue. But obviously we need to have a sensible, rational approach to this and a proactive approach such that we're aware of what's coming and have taken precautions so that we use this new technology
Starting point is 02:05:59 in a way that is going to lead to good outcomes and avoid bad outcomes. Then the rate of agreement is just kind of very high indeed. If instead you were saying something narrower, which is like, well, AI is going to happen in 20 years and then it's going to be Terminator scenario and we're all gone for sure, that's a much smaller percentage of people. So let me rephrase my question, which is a totally different question, so I'm kind of cheating.
Starting point is 02:06:28 Okay. So you're in a very interesting position because you've had the perspective and experience of watching people behave in what could be considered a very irrational way. In other words, they would rescue the drowning child, but they won't donate that amount of money for a similar, nearly guaranteed outcome. And then you have,
Starting point is 02:06:59 for instance, this is more from my experience, but I've had a lot of exposure to lawyers and attorneys in the legal world over the last decade or so. And you'll find people who are, genuinely, I would say, defending child molesters in the Catholic Church, and their job is to find holes in the depositions of these victims. I mean, it sounds fucking terrible, and it is. Or helping oil companies avoid lawsuits and problematic legislation when there are violations of EPA regulations, right? I mean, from my perspective, just horrific, like evil shit.
Starting point is 02:07:55 And they're able to rationalize doing it, right? Like everyone is entitled to due process, right? That's kind of the catch-all brush aside that you hear. But they go from that to a point where, this is maybe a separate podcast, but I'm all fired up now. They go from being a hesitant participant in that to maybe now they're a senior partner
Starting point is 02:08:20 and they're like, oh, there's an oil spill. Fantastic. Can you imagine how much work we're going to have now, right? But these are people who, outside of that compartmentalization, act in very good ways. So I guess part of my concern, or it's not really a concern,
Starting point is 02:08:36 my question for you is, given how you've observed these quirks of human nature, do you think people who are at the forefront of AI, who have the possibility of changing the world in such a fundamental way, not only for other people but to generate wealth that is almost beyond comprehension for themselves, do you think that drive and greed and arms race, because there are competing teams, right? Right. Trying to get to this point, in many countries, where you have a generalized artificial intelligence. Do you
Starting point is 02:09:11 think that that competitive drive and desire to win and generate wealth, et cetera, will override the voice in the back of their head saying, you need to figure out the safety precautions and the safety net before we get anywhere close to this technology taking off, in the same way that you talked about the nuclear arms race, right? Yeah. What's your perspective? Yeah, so, yeah, sadly, I mean, this is just, it's a classic tragedy of the commons where
Starting point is 02:09:45 if you're going to have multiple people trying to build the same thing, and where whoever gets there first wins, basically, just, you know, has much more power. And then if some people think, oh yeah, we should be doing this more cautiously, that would, you know, slow progress, but we'd be having a greater chance of positive upside and fewer risks, then they're just going to kind of lose the race. And yeah, sadly, I think, and maybe you can even have that even if everyone's acting altruistically: maybe they disagree slightly on how things should be done, and that's enough for them kind of not to trust each other
Starting point is 02:10:31 in the absence of coordination. And there, again, it's not even a matter of people getting corrupted, perhaps, as you talk about the lawyers being happy about an oil spill; just as a matter of economic incentives, you can get these race dynamics. And, you know, that was the case. It was exactly the case between the U.S. and Russia with nuclear weapons. I'm definitely not saying it's a perfect analogy at all,
Starting point is 02:11:05 but there was the Baruch Plan, which was a proposal after the Second World War for the complete abandonment of all nuclear weapons, and all fissile material would be
Starting point is 02:11:22 kept in check by the United Nations. And basically all parties were in favor of this because it's the best outcome for everyone, but it still wasn't able to happen just because there wasn't sufficient trust between the two countries. And, yep, then we get this incentive, this kind of arms race. And that's, I think, why we kind of want, you know, and not just for AI, for other sorts of risky technologies as well, you know, the ability to develop pathogens, the ability to do geoengineering,
Starting point is 02:11:58 we're on the frontier of developing many different technologies that have very large potential upsides and very large potential costs, and in each case we want to have, you know, a coordinated approach, so that we can ensure that we don't get those sorts of race dynamics, I think. What existential threat to mankind worries you the most, or is most underrated? Those are two different questions, but I'll make it two questions anyway. Yeah, so I guess, until recently I would have said, actually, yeah, okay, I have an answer for most underrated. For, you know, what worries me the most? So, yeah, the development of new pathogens.
Starting point is 02:12:48 So once we start being able to build viruses and bacteria, then it will become very easy to potentially build pathogens that could kill billions of people, or the entire world, or just almost everyone in the world. You know, that's very worrying, as is AI. Those are the two big ones, I think. In terms of most underrated, I think it's the ones we don't even know about. Right.
Starting point is 02:13:18 We, you know, predicting future technology is extremely difficult to do. Everyone basically agrees with this. And many of the developments that have happened over the last 50, 100 years would have been completely unpredictable 50 or 100 years before that. And we should expect, again, there are going to be developments that happen over the next 50 or 100 years that, you know, no one's even thought of at the moment. But I think you can still make some sort of progress on mitigating those risks, because there are some things you can do, like greater political
Starting point is 02:13:57 coordination across the world, which is just going to be really good across a very wide range of scenarios, or, you know, having research institutes working on the frontiers of technological development, doing horizon scanning to try and identify risks like this. Those are some of the things that
Starting point is 02:14:17 we could be doing to try and mitigate these unknown unknowns. But that's exactly the sort of thing we're going to be biased against because it's like you're spending money doing something that you don't even know what it's going to help with. It's like quite an abstract kind of sell. So I suspect the biggest risks are ones we haven't even thought of.
Starting point is 02:14:37 Right. Yeah. The black swans. Just a couple more questions. What advice would you give to your, you're only 28, so what advice would you give to your 20-year-old self? To my 20-year-old self, I think there are, yeah, let's see, two things. So one is emphasizing, yeah, you have 80,000 working
Starting point is 02:14:59 hours in the course of your life; it's incredibly important to work out how best to spend them. And what you're doing at the moment, 20-year-old Will, is just kind of drifting and not spending very much time thinking about this kind of macro-optimization. You might be thinking about, you know, how can I do my coursework as well as possible, kind of micro-optimization, but not really thinking about, okay, what are actually my ultimate goals in life and how can I optimize towards them? The analogy I use is,
Starting point is 02:15:33 if you're going out for dinner, it's going to take you a couple of hours. You might spend five minutes working out where to go for dinner. It seems reasonable to spend 5% of your time on how to spend the remaining 95%. If you did that with your career, that
Starting point is 02:15:45 would be 4,000 hours, or two working years. And actually, I think that's pretty legitimate as a thing to do, spending that length of time to work out how you should be spending the rest of your life. Now, do you spend, are those two contiguous years, or is that four years of total time, divided? I think four years of total time, weighted towards the front of your career, I think. I think we should be spending a lot of time, and I do this: any sort of big decision I make, I spend a very large amount of time thinking about, is this the best thing I could be doing? What other things could I be doing instead? Are there ways I can change my plans?
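A quick back-of-the-envelope sketch of the arithmetic Will just ran through, in Python. The 40-hour week, 50-week year, and 40-year career used to reach roughly 80,000 hours are assumptions for illustration, not figures stated in the episode.

```python
# Rough arithmetic behind the "spend ~5% of your time deciding how to spend the rest" heuristic.
# Assumes a 40-hour week, 50-week year, 40-year career: one common way to arrive at ~80,000 hours.

HOURS_PER_WORKING_YEAR = 40 * 50            # 2,000 hours per working year
CAREER_HOURS = HOURS_PER_WORKING_YEAR * 40  # 80,000 hours over a career
PLANNING_FRACTION = 0.05                    # the "5 minutes for a 2-hour dinner" ratio

planning_hours = CAREER_HOURS * PLANNING_FRACTION
planning_years = planning_hours / HOURS_PER_WORKING_YEAR

print(f"{planning_hours:,.0f} hours ≈ {planning_years:.0f} working years of deciding")
# -> 4,000 hours ≈ 2 working years of deciding
```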
Starting point is 02:16:21 What's your process for thinking that through? Do you sit down with a particular pad of paper and go through a particular set of questions? What is the thinking process for big decisions? Yeah, so I'll create a Google Doc that I share with friends or people I particularly respect. They'll then provide comments, and there will often be several iterations of this.
Starting point is 02:16:48 There's a framework, which is what 80,000 Hours promotes as well, where, because I'm thinking about the impact I can ultimately make, you can break that down into three components. So this is for job decisions, but it actually applies quite widely. First, the impact you'll have on the job, where you can think about the impact you'll have through your direct labor, through your ability to advocate for important
Starting point is 02:17:17 causes, and through your donations as well. But then also the impact later on in life, where that's skills, credentials, network. Then also, how does this keep my options open? So academia is a great example of this. If you leave academia, it's very hard to come back, whereas if you go and do something before going into academia or doing a PhD, it's easy to transition back in. Similarly, if you go into a for-profit,
Starting point is 02:17:46 then you can transition to non-profits quite easily, much harder to do it vice versa. And then also how much do you learn about yourself in the course of this work? And then the third aspect is personal fit. So, you know, how uniquely good am I at doing this compared to other people? And so that's the kind of framework,
Starting point is 02:18:06 basically just like a big checklist that I'll use if I'm evaluating different sorts of large-scale pieces of work I could be doing. Can people find this framework on the 80,000 Hours website? Yeah, so on the 80,000 Hours website, you'll get it kind of as soon as you go in. What is the website? The career guide.
Starting point is 02:18:29 So just 80000hours.org. 80,000 is the number, then hours.org. And then there's a career guide, and actually we've kind of built an interactive tool to help you apply this framework in your own career decisions. It takes about 30 minutes to do, and it'll also kind of recommend ideas to you as you go through. Because I think we don't often think about this in a very structured way at all.
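For anyone who wants that checklist in a more structured form, here is a minimal sketch of the three components Will describes (impact on the job, career capital and option value, personal fit) as a small scoring helper in Python. The field names, weights, and example options are illustrative assumptions, not the actual 80,000 Hours tool.

```python
from dataclasses import dataclass

@dataclass
class CareerOption:
    """One option scored on the three components of the framework described above.

    Scores are subjective 0-10 ratings; the structure and equal weighting here
    are illustrative only, not how the 80,000 Hours tool works."""
    name: str
    direct_impact: float    # direct labor, advocacy, and donations the role enables
    career_capital: float   # skills, credentials, network, options kept open
    personal_fit: float     # how uniquely good you are at this relative to others

    def score(self) -> float:
        # Equal weighting is a placeholder; adjust to your own priorities.
        return (self.direct_impact + self.career_capital + self.personal_fit) / 3

# Hypothetical example options, not recommendations from the episode.
options = [
    CareerOption("Stay in academia", direct_impact=6, career_capital=4, personal_fit=8),
    CareerOption("Join a for-profit startup", direct_impact=5, career_capital=8, personal_fit=6),
]

for option in sorted(options, key=CareerOption.score, reverse=True):
    print(f"{option.name}: {option.score():.1f}")
```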
Starting point is 02:18:54 That's the most important decision of our lives. No, I think I'm going to do that in the next 24 to 48 hours, because I have a habit of, just at random moments, you know, I'll have two glasses of wine and ask my girlfriend, what should I do with my life? What do you think? You know, I mean, I'm playing a little bit. I mean, it's not the only way that I approach trying to make these decisions, but I haven't had a structured way of assessing the impact of some of these larger options. And I think the...
Starting point is 02:19:24 If you want any career advice as well, we do specialize in that, so happy to give you a one-on-one. I appreciate it. I'm not sure if you could consider anything I've done a career. But that's the thing, actually, that's one of the mistakes we think people make: thinking about careers.
Starting point is 02:19:39 Really. You should just be thinking about stages in your life. Yeah. Because very few people nowadays just do one thing and then stick at it for the rest of their life. Yeah, definitely. You want to be much more flexible than that.
Starting point is 02:19:51 Yeah. And the keeping-the-options-open point is a really interesting one. We're going to wrap up in a minute, so we won't get into it in this particular conversation, but I spoke a lot with Scott Adams,
Starting point is 02:20:02 the creator of Dilbert, about this and how he approaches his life from what he calls a systems perspective as opposed to a goals perspective. And the system is always, in effect, ensuring that even if a given project or stage fails, the skills and relationships and so on that he develops, in addition to the way in which he sequenced things, like you mentioned, allow him to be as well off or better off afterwards, even if it's a strikeout on some other levels. The last question:
Starting point is 02:20:43 What ask or request do you have of people listening? I mean, I'm going to throw one out there just because it addresses a pet peeve of mine. If you're a founder who claims to be building something to change the world, and you're not able or not willing to contribute to any causes right now, then sign the Founders Pledge. 2% is nothing.
Starting point is 02:21:10 It's $20,000 out of a million dollars. It's nothing. It's trivial. So I would just say, if that's your line, to ensure you're not lying to yourself and other people, just sign the pledge. I don't see any downside to it. So that would be one ask of mine.
Starting point is 02:21:27 But what would your ask or request be of the audience? And where can they learn more about what you're up to and find your work online? And you, for that matter. Great. So the key ask is: go to effectivealtruism.com, and you can sign up for the Effective Altruism newsletter there. Also, if you're interested, you can buy my book there as well, Doing Good Better.
Starting point is 02:21:53 That's all about the ideas we've talked about, at least some of the ideas we've talked about, about doing the most good. Beyond that, if you want to do good with charity, see givewell.org for top recommended charities. If you want to do good for your career choice, 80000hours.org. Again, sign up for the newsletter.
Starting point is 02:22:13 And if you're really feeling inspired and you want to make an even bigger commitment than that founders pledge, Giving What We Can is a pledge of 10% or more. And you can join the community, and it's a really kind of worthwhile thing to do that will make your life more meaningful and also have a huge impact at the same time. But the key one of those is effectivealtruism.com. And Will, how do you pronounce your last name correctly? MacAskill.
Starting point is 02:22:43 Okay. And so for people who are going to misspell this, if you want to say hi to Will on Twitter, it's @willmacaskill, M-A-C-A-S-K-I-L-L. So kind of like "Maca skill," I guess, if you wanted to try to split those up. But Will MacAskill. And then Facebook is facebook.com forward slash WD Crouch. That's a whole separate question that I want to get into.
Starting point is 02:23:08 And then LinkedIn and so on. And for everybody listening, of course, the links that we discussed, the links that we just mentioned, those will all be in the show notes: the books, the movies,
Starting point is 02:23:20 the Weird Weekends with Louis Theroux, will all be found at fourhourworkweek.com. Spell it all out, fourhourworkweek.com forward slash podcast. And Will, this has been great fun. I really appreciate you taking the time. Yeah, I've really enjoyed it. Thank you. Thanks. And everybody listening, thank you for listening.
Starting point is 02:23:40 And until next time, please experiment often. Consider the impact of what you're doing. Don't misspend your 80,000 hours, and check out 80000hours.org and everything else that Will mentioned. Thanks so much. Thank you for listening.
