The Knowledge Project with Shane Parrish - #8 Julia Galef: The Art of Changing Minds

Episode Date: February 20, 2016

On this episode of the Knowledge Project, I discuss rationality, changing minds (our own and others), filtering information, and a lot more with Julia Galef. Go Premium: Members get early access, ad-free episodes, hand-edited transcripts, searchable transcripts, member-only episodes, and more. Sign up at: https://fs.blog/membership/ Every Sunday our newsletter shares timeless insights and ideas that you can use at work and home. Add it to your inbox: https://fs.blog/newsletter/ Follow Shane on Twitter at: https://twitter.com/ShaneAParrish Learn more about your ad choices. Visit megaphone.fm/adchoices

Transcript
Starting point is 00:00:00 Welcome to the Knowledge Project. I'm your host, Shane Parrish, curator behind Farnam Street blog, an intellectual hub of interestingness that covers topics like human misjudgment, decision-making, strategy, and philosophy. The Knowledge Project allows me to interview amazing people from around the world
Starting point is 00:00:26 to deconstruct why they're good at what they do. It's more conversation than prescription. On this episode, I have Julia Galef. She's the president and co-founder of the Center for Applied Rationality, a nonprofit organization based in Berkeley, California, devoted to developing, testing, and training people in strategies for reasoning and decision-making. She also hosts the Rationally Speaking podcast,
Starting point is 00:00:50 a bi-weekly show featuring conversations about science and philosophy. On this episode, we talk about a host of fascinating subjects, including rationality, of course, changing minds, our own and others, filtering information, and a ton more. This was a fascinating conversation and one you don't want to miss. I hope you enjoy it as much as I did. Where did the interest, you're the co-founder of the Center for Applied Rationality, where did the interest in rationality come from? Well, people ask me that sometimes, and I don't have a great answer for it. I've sort of been interested in, I mean, before I knew the word rationality, I've been interested in what that word refers to for as long as I can remember, basically.
Starting point is 00:01:37 One of my guesses as to the origins of my interest is my parents. They're both very smart, for sure, but also intellectually curious people who like to inquire about the world and who encouraged me and my brother to be curious about the world. And more than that, they were also unusually good at changing their minds when presented with a good argument or when they, you know, encountered new facts that should change their mind. So I just have these vivid memories from when I was, I couldn't have been more than six or seven, I think, and I would have these arguments with my parents, you know, the way that kids do about rules or what was fair or, you know, what's good behavior, et cetera.
Starting point is 00:02:27 And sometimes, not usually, but some of the time, my parents would come back later and say, you know, Julia, we thought about it and we think you're right. We think this was unfair or, you know, whatever. And I just, I remember feeling so, well, grateful for one that, you know, they had listened to me and taken me seriously, but also I felt admiration for them, you know, because they didn't have to change their minds. There was no, you know, no one was going to hold them accountable. I couldn't do anything about it.
Starting point is 00:02:54 but they clearly wanted to, like, if I was right, they wanted to realize that I was right, you know. So that kind of stuck with me, and I got pretty interested in good argument, what counts as good evidence. I ended up majoring in statistics in college, which I was interested in because it asks questions like, how can we know things with any confidence about the world, how much evidence is enough to sort of change someone's position on something or should be enough. And through it all, I carried this memory of my parents as sort of an ideal in the
Starting point is 00:03:31 back of my mind in terms of how to react when you encounter new evidence or good arguments. Did your parents do anything else to kind of encourage that in you? Or like when you changed your mind, did they, were they responding positively to that? I think so. That's less of a vivid memory in my mind. But if I had to guess, I would say yes. They, I mean, my dad's, well, both of my parents, but my dad has more of a background in science. My mom's background is in math and statistics, actually. So my dad was the one I would more often ask questions about, you know, why is the sky blue, that kind of question about the world. And he rarely just gave me the answer.
Starting point is 00:04:13 He would more often sort of lead me through a reasoning process in kind of a Socratic type way. And, you know, usually I couldn't actually reason my way from first principles as a seven-year-old to figuring out why the sky is blue. But I could sort of come up with some guesses and maybe I could sort of notice, oh, maybe that guess doesn't make sense, et cetera. So when I finally got the answer, it was sort of more satisfying because I had tried, you know how when you try to solve a puzzle and someone tells you the answer it's much more satisfying and it sticks with you more than if they just tell you the puzzle and then tell you the answer straight away? Right. It was like that. So I think that was part of it. Well, the last thing I can think of in that space is, I don't know if I'm joking about this,
Starting point is 00:04:53 but my dad, well, sorry, the facts are true. I don't know. I can't tell if I'm serious about them contributing to my interest in rationality. But the facts are that my dad has kind of a trickster spirit at the core of him. And sometimes, especially if he didn't know the answer to a question, he would just make up an answer. You know, if you've ever read Calvin and Hobbes, Calvin's dad is, like, the poster child for this. So, you know, why, how does the radio play music?
Starting point is 00:05:26 Well, little people inside the radio that have been, you know, shrunk and hired to play whenever you turn on the switch, that sort of thing. And he would say it with this perfectly straight face, this very serious, you know, professorly tone of voice. And eventually, you know, he would make me realize he was fibbing, but it was much more satisfying if I could figure it out myself, if I could learn to notice those little bells in my head
Starting point is 00:05:53 of, wait a minute, that doesn't make any sense, before he, you know, gave it up. Your dad sounds like he'd be awesome at our dinner party. Awesome is one way to put it. So yeah, I mean, there's a neat story there about how that helped me be skeptical, but I don't know. It's really hard to draw a causal connection.
Starting point is 00:06:16 So now you're the co-founder for the Center for Applied Rationality. What does that do? I mean, what are you guys trying to accomplish? Yeah. So our mission is to develop strategies to improve on the default human processes for reasoning and making decisions and to educate people in those improved reasoning and decision-making strategies and particularly with an eye towards having a positive impact on the world. So the timing of founding the Center for Applied Rationality, which we call C-FAR, is basically we had noticed that the science on human rationality or more precisely human irrationality had kind of hit this peak over the last few decades. If you've heard of Daniel Kahneman, who wrote the bestseller Thinking Fast and Slow, he won a Nobel Prize for his work on human irrationality and inspired a lot more work on similar topics.
Starting point is 00:07:17 And there was just, you know, the academic research had sort of been building up for the last 50 years, kind of culminating in Daniel Kahneman's research and Nobel Prize. And then the public interest in the topic had been growing over the past 10 years or so with books like Thinking Fast and Slow,
Starting point is 00:07:35 but before that, Dan Ariely's Predictably Irrational, Nudge, lots of other books on behavioral economics and cognitive science. And the thing was, there was all this research on how the brain is irrational, what kinds of systematic reasoning and decision-making errors humans make, leading them to sort of systematically get the wrong answer to questions or systematically make decisions that they end up regretting, that kind of thing. And there was tons of public interest in this topic,
Starting point is 00:08:06 but there wasn't yet a lot of research on, okay, what do we do about this? There's a little research examining interventions to try to overcome some of these cognitive biases, but not a lot. And most of it was pretty shallow research. And I don't mean that in a disparaging way. I just mean the interventions consisted of things like the experimenters would tell the treatment group about the bias and then see if they committed the bias on a quiz or something. And that's not the kind of thing that I would expect to really change these ingrained habits of thought and behavior. I would expect that to change ingrained habits of thought and behavior, you need longer-term practice on real-life issues, not on toy
Starting point is 00:08:52 problems, you know, in a lab. I definitely want to get into how we go about changing some of those processes, but maybe before we continue, do we have a common understanding of what rationality means? Oof, how many hours do you have? It's hard. There's a sort of simple definition you could give that doesn't capture that much, and then there are increasingly complex definitions you can give that, you know, sacrifice parsimony for depth. Is there more than one type of rationality? Yeah. So sometimes we talk about two types of rationality. On the one hand, we have epistemic rationality. And sorry, these aren't just CFAR's terms. These are terms in cognitive science and
Starting point is 00:09:35 philosophy. Right. So epistemic rationality is about using processes for reasoning or processing information that systematically get you closer to an accurate model of how the world works. And, you know, we can never be 100% confident and get the perfectly correct answer about everything about how the world works because, you know, we just have limited information and time and there's a lot of uncertainty, but some reasoning processes are just more reliable than others. So, I mean, to take a dumb example, making stuff up, or believing what a random person on the street tells you, will be a less reliable process for having an accurate model of the world than synthesizing the opinions of top experts on
Starting point is 00:10:21 a topic or looking at randomized controlled trials or something like that. And that's kind of an oversimplified description, but I'm just trying to sketch out what the spectrum of epistemic rationality looks like. Should I go on to the second type of rationality? Yes, please. So the second type is instrumental rationality. And that's about
Starting point is 00:10:56 making choices that, given the best information you have at your disposal, are most likely to achieve your goals. And I know the word goals kind of has associations for most people with, like, career goals and being productive and, I don't know, maybe getting an award or getting something published. But in this context it could really mean anything, anything you want or value. It could mean having friends. It could mean being happy. It could mean, you know, helping the world, whatever it is you care about. Instrumental rationality is making choices that you can sort of systematically expect are most likely to achieve those goals as efficiently as possible. Is there only one optimum path for doing that, or is it more broad than that?
Starting point is 00:11:45 Well, it's an interesting theoretical question. I suppose if you had access to perfect information about the world, there might be one optimal choice that was slightly better than all the other choices in terms of its probability. But this is all very abstract. In practice, it's not helpful to think about there being one very best choice
Starting point is 00:12:08 that you can know for certain. Okay. But you can, you know, with some careful reasoning and using some good heuristics or rules of thumb, you can sort of rule out some obviously bad choices. and try to make an educated guess among the set of plausible best choices.
Starting point is 00:12:26 Do you think that that's a good strategy if you don't know what's likely to lead to success, but you do know what's likely to lead to failure is just eliminating the stupidity? It's low-hanging fruit, I think. I mean, so often in my work at C-FAR, I guess I didn't get to fully finish explaining what we do, but the short rest of the explanation is that we, in addition to sort of doing some of our own research on these topics,
Starting point is 00:12:53 we run workshops where we run training sessions, teaching people some of the more promising techniques that we've been developing to help overcome some of these biases and make better decisions. And so often I find myself in this situation where someone is kind of, they're at some crossroads, you know, they're at their job and they don't really like it and they're not motivated, they're not really doing good work at all, but they don't really know what they should be doing instead, and they're having a lot of trouble figuring out what the best choice is. And so usually my answer in that case is, okay, I don't know what the best choice is. I don't think you have any way to know what the best choice is. But clearly the thing you're doing now is not the best choice. You're, you know, if the best choice is staying at your job, then it means staying at your job
Starting point is 00:13:39 and finding a way to make it actually fulfilling and to make yourself actually succeed at it. The best choice is not doing, you know, a half-ass job at your current job. So we can kind of eliminate your current strategy. That's a good point. That's a simple example, or it sounds silly, but that is, in fact, the way a lot of us make our decisions by default. So in your view of rationality, then what's the role of intuition? Well, so I know a common conception of rationality in the public, in the media, is that rationality
Starting point is 00:14:13 means dismissing or suppressing your intuition. There's often this dichotomy set up of, you know, reason and rationality on the one hand and then intuition and emotion on the other hand. And so, you know, this sort of perceived dichotomy gives rise to characters in movies or TV shows that are sort of explicitly the rational one or the logical one. And what that means in practice, you know, for the character's actions, is that they're the one who goes around, you know, poo-pooing other people's thoughts and feelings and looking down their nose at people for, you know, having fun or falling in love or finding something beautiful. And that is not what we or cognitive scientists deem rationality. the actual scientific model, which I think is also pretty common sense when you think about it,
Starting point is 00:15:18 is that our brains, our minds, are divided. We do have a more intuitive side of our mind, which cognitive scientists call System 1. It's kind of a dull name. I apologize for that. And then we also have this more recently evolved part of our mind that allows us to do logical reasoning, weigh abstract tradeoffs against each other, do math, long-term planning, that kind of thing. Very roughly speaking, it's our prefrontal cortex, and the kind of reasoning it does is what cognitive scientists call system two. And system one is indispensable. There's no way we could
Starting point is 00:15:54 actually survive as a species or as individuals if we ignored the output of our intuitive system one, right? So if I were to throw a pen at your head, and you had, well, we're on opposite sides of the continent right now, I think so you're safe, but if we meet in person someday and I throw a pen at your head and you were only able to choose what to do using your system two, it would be absurd. Your system two would have to sort of reason through all the possible ways to react. It might say to itself, okay, well, Well, first let's calculate the trajectory of the pen approaching my head and its velocity, and I can estimate, okay, it'll probably approach, hit my forehead in a half second.
Starting point is 00:16:42 Well, first let's calculate the trajectory of the pen approaching my head and its velocity, and I can estimate, okay, it'll probably approach, hit my forehead in a half second. What are my choices? I could stay still. I could try to dodge. Maybe I could try to catch the pen. Let's weigh the pros and cons of each option. And it would quickly be a moot point, right? The pen would have already hit your head. And this is sort of the drawback of your system two.
Starting point is 00:16:56 It's sort of slow and laborious and effortful, although it can do things that your system one can't do. Your system one is very fast and it's very good at processing lots of information that you've sort of picked up often unconsciously over the course of your life. So when you're, you know, forget about pens hitting foreheads. Let's take a real example. If you're in a social situation, you can sort of pick up things. You can pick up social cues to varying extent. If you can detect whether someone is, you know, frustrated with you, or bored, or flirting with you, or threatening you, you usually can't fully articulate what it is about their tone of voice or their facial expressions that's giving you that impression,
Starting point is 00:17:40 but you're still pretty confident it's happening. And the reason you can't articulate it is that it's your system one, your intuition, that's doing all of that processing. And it's doing it in this very black box kind of way, but this is what it's had practice doing over the many years of interacting in social situations, and so it's learned over time to be able to do this well and to just spit out a quick answer about, you know, how does this person feel about me?
Starting point is 00:18:05 So it's a long-winded way to say that system one intuition is indispensable. The catch is just that it's often fallible. And so rationality is less about ignoring or suppressing system one intuition and more about understanding the sort of respective strengths and weaknesses of the two systems and learning how to sort of get them to communicate with each other, so to speak, get your system two to listen to your system one and get your system one to listen to your system two and sort of update your emotional or intuitive impressions if they seem flawed in that situation. You mentioned art and love and passion and, you know, all of these
Starting point is 00:18:46 other things that we, you know, give us some sort of emotional response. What's the connection to rationality there? Is there one? The connection between art and rationality? Yeah, is there one? I mean, And you brought it up in a way that made me think that there was one. Well, oh, you mean in my example of the supposedly logical character in movies? Can it be rational to appreciate art? Can it be, I mean... Oh, yeah, yeah. Oh, by the way, before I answer your question, I realized I forgot to give the name of that character.
Starting point is 00:19:15 The name for that trope, that archetype, is the Straw Vulcan. Okay. It's a play on the expression, the Straw Man. Do you know the expression to Straw Man someone? Yes, yeah. Right. So it just means to... to present this weak caricature of what your opponent is arguing
Starting point is 00:19:30 to present this weak caricature of what your opponent is arguing and then knock that down because it's easier to knock down. And so the Straw Vulcan is like a weak caricature of rationality. It's not actual rationality, but it's, like, easier to sort of make fun of and knock down than real rationality. So I just wanted to get that in. So, right, in answer to your question about art and rationality, I mean, I'd say the only obvious connection to me, I kind of think of art as being orthogonal,
Starting point is 00:19:56 independent of rationality. The kinds of things that the artist is trying to do are mostly not trying to get the right answer to a question or achieve a goal. It's more of an expressive process. But in terms of appreciating art, I'd say for many people, maybe for most people,
Starting point is 00:20:14 some form of art is very important for their goals. And the kinds of goals I'm thinking of are indeed the kinds of goals that people usually don't think of when I say the word goals, but are still very important goals.
Starting point is 00:20:50 And these are, in fact, very important goals that people miss when they don't have them. but somehow they don't really, they don't seem to come to mind when people think about, okay, now I'm going to try to be rational. You're interested in changing minds. How do we go about doing that? Are you talking about changing other people's minds or changing your own mind? Oh, let's do both. Let's start with our own mind and then explore how we change other people's minds.
Starting point is 00:21:13 but somehow they don't really, they don't seem to come to mind when people think about, okay, now I'm going to try to be rational. You're interested in changing minds. How do we go about doing that? Are you talking about changing other people's minds or changing your own mind? Oh, let's do both. Let's start with our own mind and then explore how we change other people's minds. So that is the order in which I'm interested in them. The order of questions you asked is the order in which I'm interested in them. It's difficult, because usually for other people the order is reversed, and when I want to talk about, okay, here's how to be better at changing your own mind, people often interrupt me and they're like, no, no, no, I want to change other people's minds. How do I convince my, you know, co-workers, my partner, that they're wrong? Yes, if only they saw the world through my eyes, right? Yeah, it's less interesting to me, but I'll also give my best answer to it. So to change your own mind, I mean, the first step I would say is just believing that that's a desirable thing to do, which many or most people
Starting point is 00:21:56 don't really believe. They either would explicitly say it's a bad thing to do, because changing your mind makes you kind of wishy-washy or weak or stupid, or they would explicitly claim it's a good thing to do, but they don't really believe it on a sort of gut level. And so they don't have the motivation to do it, and therefore they're not going to do it. So step one is actually believing, yes, I believe on a gut level that there's probably a bunch of things I'm wrong about that I'm not yet aware of, just because that's true of everyone. And also that whatever it is I'm wrong about, I would like to know about that, because I'll probably be able to make better decisions for myself and for other people. I'll probably be able to avoid hurting other people
Starting point is 00:22:41 if I have a more accurate model of the world. And so the implication of that is I would like to change my mind when I encounter new evidence. In fact, I want to seek out new evidence that might cause me to change my mind, not just sort of passively accept it when it comes. How do we get to a point where we do that? Like, how do you take somebody who's naturally closed-minded and develop over time, I would imagine it's probably not a light switch, this open-mindedness to the point where you can give up your, you know, beliefs or cherished thoughts about something? Yeah. Well, I mean, even the way you phrased that question, how do you take someone who's closed-minded, is already kind of begging a question.
Starting point is 00:23:23 We're all closed-minded in some ways, right? We're blind to certain things. So myself, I'm blind to a lot. Yeah, I thought you meant closed-minded in the sense of not wanting to change their mind. Oh, yeah. That's what I mean, right? So I just, like, you could show any amount of evidence. And if it's something that we hold dear, that we have an emotional connection to,
Starting point is 00:23:44 we're less likely to change our mind. That's an extreme example, but I mean, there's many things in organizations or the workplace where any amount of evidence or rationality, I mean, the proverbial story would kind of be, you know, you present evidence about cigarette smoking, and then one person in the meeting chimes up and says their grandmother, you know, has been smoking for 99 years and doesn't have cancer, and then it just gets dismissed and washed under the table, right? So how do you get to a point where people themselves are motivated, I guess, some way to change their minds or be open, more open-minded?
Starting point is 00:24:21 Right. So I guess I have this mental model, it's sort of a two-layer mental model where the bottom, the more fundamental layer is just abstractly but sincerely wanting to be able to change your mind. And that doesn't mean that in any given situation, like if you read an article saying that caffeine, well, that's a bad example because nutrition data is not very reliable, but let's say you read a reliable article about a reliable study showing that caffeine was really bad for you, but you've cherished your
Starting point is 00:24:54 morning cup of coffee or, you know, your three cups of coffee a day. That is going to be hard to hear and the natural reaction is to find a reason to dismiss the study. And that's often going to happen even if you have that fundamental layer in place of believing in general that you would like to be the kind of person who changes your mind. The distinction I was trying to make in my response to the closed-minded phrase is I think a lot of people just don't have that fundamental bottom layer. They don't even want to be able to change their mind in particular situations. I think it's like I'm pretty pessimistic about the ability to get it to work in a particular situation. So I tend to focus on that bottom layer first. And to get that in place, again, I don't think
Starting point is 00:25:35 it's the kind of thing you can sort of do to someone else. And when people, so we have people go through an admissions process to come to our workshops, just to make sure it's going to be a good fit for them, because it's kind of an investment of time and money. We want to make sure it's a good fit. And so one of the questions we ask them in the admissions interview is, why are you interested in coming? What are you hoping to get out of it? And sometimes, occasionally, we get someone who says, well, I want to like learn how to explain rationality to the people around me so that they realize how irrational they are. And those people, we gently turn away, saying we don't think it's a good fit for them, because the motivation really
Starting point is 00:26:15 has to be, I want to improve my own reasoning and notice my own blind spots. So my guess as to the things that cause someone to not be in that group of other-focused people is that it's partly social, like the influence of the people around you, because, you know, humans are primates and social creatures, and the things that we're motivated to try to acquire or be tend to be very influenced by the culture in which we live. So, you know, if your parents valued and rewarded changing your mind, or if the people in your social circle value and reward changing your mind, I think that makes a huge difference, and that isn't just sort of abstract a priori reasoning. I see that pattern just looking around me
Starting point is 00:27:04 at the people I know. The other thing that I think can sometimes work, even if you don't have the general motivation to change your mind, is there's a particular goal that you need to achieve, like your startup is going to fail if you can't look at the data in a clear-eyed way and make the best decision you can about, you know, whether to pivot or not, that kind of thing. So sometimes these kinds of immediate, like, high-stakes motivations can really make you want to see the world as close to the way it really is as possible, and not to see sort of a wishful thinking version of the world that you've created in your mind. It doesn't always work. Sometimes we still kind of try to distort
Starting point is 00:27:46 or deny our perception of the world. But having that motivation can help. Did I answer your question? I can't remember what your question was. Yeah, I like that. I mean, how do we go about changing our own minds, right? And then how do we go about changing the minds of others? Oh, yeah, I guess I didn't quite answer the how do we change our own minds question. I just answered the question, how do you become the kind of person who could go about changing their minds? Right, yeah. So, I mean, that was a parenthesis around that. So if we can go back to, like, how do we go about changing our own minds?
Starting point is 00:28:15 The first step you had mentioned, and I derailed you after that, was how do we, you know, we need to be open-minded about processing new information. And then I think you had more to that answer. Yeah. So the problem that I found when I read the pre-existing advice, in, I don't know, skeptic blogs or, I don't know, do your listeners know what the skeptic movement is? Maybe I should just explain that very briefly. Yeah, please do.
Starting point is 00:28:45 The skeptic movement, or you could call it a community, I don't know. It's a group of people who are promoting scientific and critical reasoning and go around sort of testing and sometimes debunking things that turn out to be pseudoscience or, I don't know, paranormal or magical claims. So anyway, I used to read a lot of skeptic blogs and listen to some skeptic podcasts. And I think skeptics are pretty good at giving advice about the importance of changing your mind. But it tends to be at a very sort of high abstract level, like be open-minded. That's good advice.
Starting point is 00:29:28 It's important. But it's not clear how to concretely implement that. It's sort of like telling someone, you know, eat healthy. That's unlikely to actually change their behavior because their dietary choices are made up of all of these little moment-to-moment choices about what to eat and how much to eat, you know, when to stop eating, and just having this abstract commandment eat healthy in the back of their mind usually doesn't translate into changing their actions on a moment-to-moment basis. So a lot of what CFAR ended up doing was just taking this sort of abstract advice, like be open-minded, and translating it into these sort of very concrete, almost algorithms, that start with a cue, with a trigger. So a cue might be, I read an article that I disagree with. That's like a trigger. And then the action that I can take then to try to manifest the principle of
Starting point is 00:30:28 open-mindedness is, look for evidence that actually agrees with the article. So this is sort of a counter to our general tendency to only look for evidence that supports what we believe. So in this sort of trigger action plan, as we call these things, what we're trying to get ourselves to do is take the trigger of a moment where my natural response would be to try to find reasons to reject an article and instead install the habit of looking for reasons not to reject the article. And that doesn't mean I'm usually going to end up thinking the article is correct, but I will be giving it a much fairer shake than I would be if I didn't
Starting point is 00:31:07 have this trigger action plan installed. So that's just one example, but hopefully it's representative of the kind of concrete habit-based approach that we take to helping people change their minds. So that's really interesting. So that as you mentioned that, it seems like the cost of information processing would go up. And in a world where, you know, we're consuming so much more information, if you had a trigger action plan where you were evaluating each piece of information that you're reading and seeking out reasons for not disagreeing or agreeing or whatever it happens to be, you would be spending more time on that and have more of an investment in it. Do you think that that, like I guess how do we process information in a world where
Starting point is 00:31:53 the costs of doing that rationally become so high that maybe they're outweighing the benefits? I don't know. I mean, there is a cost. You're absolutely right. There's a reason that system one and system two are called respectively fast and slow thinking systems. We can't consume everything slow, can we in like today's age? Yeah, no, you can't.
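As a rough illustration only, and not something that appears in the episode, the trigger-action plan idea described above can be thought of as an if-this-trigger-then-that-action pairing that you rehearse until it fires automatically. The sketch below is a minimal, hypothetical Python rendering; the class name, the example triggers, and the wording of the actions are paraphrased or invented for illustration.

```python
# Minimal, hypothetical sketch of a "trigger-action plan": a cue paired with a
# concrete debiasing step to take instead of the default reaction.
from dataclasses import dataclass


@dataclass
class TriggerActionPlan:
    trigger: str  # the cue that should set the habit off
    action: str   # the concrete substitute response to rehearse


# Two plans paraphrased from the examples discussed in the conversation.
PLANS = [
    TriggerActionPlan(
        trigger="I notice I'm reading an article I disagree with",
        action="Look for evidence that agrees with the article before rejecting it",
    ),
    TriggerActionPlan(
        trigger="Someone I dislike makes a claim",
        action="Imagine someone I like said the same thing and check whether my reaction changes",
    ),
]


def rehearse(plans):
    """Print each if-trigger-then-action pair, the way one might rehearse the habit."""
    for plan in plans:
        print(f"IF   {plan.trigger}")
        print(f"THEN {plan.action}")
        print()


if __name__ == "__main__":
    rehearse(PLANS)
```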
Starting point is 00:32:15 But, you know, no reason to let the perfect be the enemy of the good, right? Of course, yeah. And the other thing I would say is that the whole, the goal with habits is that they become automatic over time. So there are a lot of things that I find myself automatically doing now. Like when someone I dislike says something, I now sort of automatically well, actually not all the time, but I sort of have an intuitive sense for when this would be a useful thing to do. I will automatically imagine that someone I like said the exact same thing. and I notice, would my reaction be different? In other words, was I sort of unfairly dismissing this person's argument because I don't like them? That kind of thing. You know, just block them out of your mind. Right. And that was something I sort of started trying to do intentionally and now it happens automatically. And to be fair, it is still, there's still some extra steps there happening in my mind that I wouldn't be spending the effort on if I didn't do this at all. But at least the process happens kind of quickly.
Starting point is 00:33:19 and automatically. So that is the goal to try to get some of these explicit processes to eventually lodge themselves in your system one and your intuitive thinking. But I agree, it's, you know, there's a bit of an investment there. It's not as, it's not quite as effortless and easy as just using your default thinking systems. Are there any other tips you would offer in terms of how we process information that helps us become more rational as we're consuming it? Oh, man, that's a big question. Well, I guess one thing we haven't really touched on is being attuned to your emotional reactions, which was a big thing that I, I don't think it was really on my radar so much before I started C-FAR.
Starting point is 00:34:06 But there's, so, you know, your emotions kind of are often in the background of your thinking. And so you, you know, you become defensive, you don't even really notice that you're defensive. You just start having thoughts like, oh, this person is out to get me. or like, oh, this isn't fair, or you start looking for things to criticize about them instead of, you know, trying to evaluate their criticism of you. And you're actually getting defensive. You know, your body's tensing up or your shoulders are sort of turning in. You can't see my shoulders. I'm doing it now. But you just don't notice that that emotion is influencing the way that you're processing the person's argument. Right. And so a fair amount of what we teach ends up being
Starting point is 00:34:49 about just being much more self-aware of those emotional reactions. And they don't always have to be, you know, defensiveness or aggressiveness or anything like that. They can often be sort of the subtle anxiety or concern in the background when your mind starts to go to a topic and then flinches away from it, because it might turn out, if I start going down that road, that I might have to conclude, shoot, I shouldn't have entered the PhD program in the first place, or, like, I really am going to have to break up with my partner. And these are unpleasant to think about. And so we automatically flinch away from them. But if you, if you become more attuned to your emotional reactions, they can just be important clues as to where you're, what are the
Starting point is 00:35:28 blind spots you're creating for yourself and where are they? It's interesting. I mean, yeah, I never pause and kind of reflect on my emotional state when I'm consuming information. I just assume that I process it all the same way. Yeah, I think there's a lot of interesting variation in the texture of your reactions that you can start to notice when you look for them. And different people have different ways that work well for them to develop this kind of self-awareness. One of my co-founders at CFAR has a background in Aikido. He's been doing it and teaching it for like two decades now. And he does a lot of meditation. And he's just very embodied, I guess is the right word, in a way that I totally am not. I really live in my
Starting point is 00:36:09 head. And so Val, that's his name, is just really good at helping people detect the sort of physical, just what's going on with their body. Like notice the tension, notice that, you know, they're like leaning forward with sort of aggression or they're like leaning backwards in anxiety or something like that. I just am not good at noticing these signs of my body. I have other ways of noticing my emotions, but they're more cognitive and less
Starting point is 00:36:54 considered that there might be a link between or some overlap between meditation or martial arts and rationality. That's interesting. Have you started doing martial arts now? No, I'm really lazy. So to kind of come back to it, how do we change other people's minds then? Yeah, so there are different strategies to take depending on how intellectually honest you want to be, right? So, you know, a lot of the research that's come out, the books that I've been referring to on irrationality that describe, you know, all of these hidden forces that affect our decisions that we're not even aware of, like, you know, how tall was the person making the claim? Or was I holding a hot cup of coffee when the person asked, you know, made a request of me, that sort of thing? Right.
Starting point is 00:37:43 All these insidious aspects of our psychology that we're not even aware of, there's a real opportunity there for people to exploit those things and use them to change other people's minds without actually making any good argument or presenting, you know, any evidence. And in fact, there's a good book about this I would recommend, called Influence by Cialdini, who's a psychologist. I can't remember his university now. It's been out for a few decades. It's been a bestseller for a long time.
Starting point is 00:38:15 But basically what he did is he went undercover at various companies who are in the business of persuading people, of influencing people. So I think that included a telemarketer, some door-to-door sales. This was, you know, back in the 70s or 80s when that was more of a thing. Some, excuse me, some lobbyists or maybe not lobbyists, but activists who were trying to get people to sign petitions, that kind of thing. And, you know, these industries have, there's a real profit incentive to try to find new and better ways of persuading people to do things.
Starting point is 00:38:50 And so they've developed all these techniques, but they're not going to publish them because they're a competitive advantage for them. Of course, yeah. So he basically did all this research for years and then wrote a book, classifying the different kinds of techniques that companies like these use
Starting point is 00:39:05 to persuade people. And it's a very clear and compelling and easy to read book, but very information dense. So there are five categories. I'm going to forget all of them. But they include things like scarcity, trying to make people feel like something is a rare opportunity. Social proof, trying to make them feel like other sort of high status people are doing this or believing this. Reciprocity, when you give someone something, like the Hare Krishnas giving people a flower at the airport.
Starting point is 00:39:39 They're much more likely to feel on some level, not consciously, like they owe you something. And so they're more likely to agree to a request or, you know, change their mind about something. So these are all kind of effective but insidious ways of changing people's minds. I personally am uncomfortable using methods of changing people's minds that don't ground out in sort of good argument or logic. So, you know, changing people's minds with facts is harder than changing their minds with a smile or a gift or a nice haircut.
Starting point is 00:40:15 See, you're like a diehard rationalist then in the sense of, if I present a good argument, you should therefore adopt it, or? No, it's, I definitely don't expect. So I wouldn't agree with that sentence if should means I expect people to adopt it. That's in fact one of the, one of the pillars of what I consider Straw Vulcanism to be. Because if you look at the quintessential Straw Vulcan, Spock himself, he is supposed to be the logical, rational one, but he keeps making miscalculations because he expects other people to behave rationally, and they don't. And he should know this because he's lived among humans for ages, and the fact that he still expects people to behave rationally is frankly quite
Starting point is 00:40:58 irrational of him. So I don't expect people to change their minds based on facts and evidence, because I know that's not a thing. And I don't know that I would endorse the moral should either. I don't think people are sort of morally wrong for not changing their minds in response to good arguments. It's just an unfortunate fact of the world and how our brains work. What I was trying to say was more like, I personally feel kind of morally uncomfortable with changing people's minds without facts and evidence. So it's kind of a constraint on my ability to change people's minds that I have to work, that I force myself to work within, essentially. And that doesn't mean I expect to succeed. It just means I don't want to succeed if I'm not
Starting point is 00:41:43 following that constraint. Does that make sense? Yeah, totally. So one thing that struck me as you were saying how it's possible, or the avenues to pursue, about changing minds is I just had in the back of my mind the whole climate change thing, where we have, or even the cigarette companies, who were trying to discredit, I guess, the growing evidence as it was coming. And it seemed like their playbook was more creating uncertainty, and how effective is that when you're trying to change the mind of a group, or make them at least, or create some sort of uncertainty? How would you go about doing that? You're asking me to put on my evil hat, is that what you want me to do? Well, that presumes that you
Starting point is 00:42:31 didn't have it on to begin with, so I don't want to make any assumptions. How would I create uncertainty? I mean, well, I think it's quite easy to present what looks like a very compelling case for whatever you want to convince people of. Using studies, using quotes from experts, using even randomized controlled studies, which are sort of the gold standard, allegedly the gold standard in scientific research. The problem, like the reason that this is possible, and the reason this is all such a problem, is that there's so much scientific research that's done, that it's quite possible to end up with a few studies that seem to support the claim that, I don't know, cigarettes aren't linked to cancer. I actually haven't looked that up to see if there are studies showing that, but, you know, you can, if you do enough studies, some of them, even just by chance, and often by experimental bias, will turn out to show, you know, whatever you want. And so you can easily sort of selectively present the studies that support what you're claiming. And so even if you have a pretty skeptical
Starting point is 00:43:45 well-educated audience who knows, okay, I shouldn't believe things without a study, you can still kind of pull one over on them. And I think we do this to ourselves inadvertently all the time. You know, we kind of suspect that something is true. Like, we suspect that, you know, the raw food diet is good for health. And so we Google around and we find a few papers that seem to show that and we're like, ah, I found it, scientific evidence. But, you know, if you were Googling for the opposite, you could also have found probably a lot more studies showing, you know, no effect on health. So, oh, I don't know. I haven't actually looked at the studies on the raw food diet, but that's just an example. That's interesting. And so what is your take on how you would refute that?
Starting point is 00:44:25 So in the context of climate change where you have, you know, possibly people creating doubt or uncertainty, or we can use cigarettes to be more tangible, where I think the evidence is now overwhelming that, you know, there's a definite linkage to cancer. How would you have combated that if you were the government who's trying to kind of crack down and prove this, and then you have these companies that are well-heeled, you know, and well-resourced to fight that, trying to create some sort of uncertainty to maybe possibly not combat what people believe, but delay, you you know, it seems like an inevitability to everybody else. Oh, so hard.
Starting point is 00:45:08 I mean, I might relax my constraints a little bit. What would I do? I don't know. It's really hard. I don't have a good answer to this. Okay, no problem. I mean, that's a good answer, right? Yeah.
Starting point is 00:45:23 Yeah, no, I really, I think it's really hard. And I think I'm not confident that there is a good way to do it while adhering, you know, perfectly to the standards of, you know, just using the best arguments and evidence. I think you probably have to also, you know, get charismatic spokespeople, and you probably also have to, you know, buy lots of ad time to, like, really, to use the principle of familiarity, where if people hear something a lot, they're, like, more likely to believe it's true. And it's not, I don't feel great about that. But, you know, as long as you're staying relatively close to the end of the spectrum of integrity of persuasion, it might just,
Starting point is 00:46:04 you know, be necessary to try to get the right message out there. Better answer than I was going to give. Before we get going, are there any books that you've read that have had an incredibly meaningful impact on your life, or what has had the most kind of impact or changed your direction or your thoughts to a profound degree? So the set of books that has had an impact on me is a little different than the set of books I'd recommend to other people. Just, you know, the particular path that I followed and the particular books that happened to be, you know, influential to me given the place I was at is, you know, somewhat idiosyncratic. But one book that really influenced me in college was by a philosopher named A.J. Ayer.
Starting point is 00:46:50 It's called Language, Truth, and Logic. Okay. And this is kind of, I was going to say it's a philosophy book for people who are kind of skeptical of philosophy, but I also would want people who aren't skeptical of philosophy to read it, because I think it would be good for them. It's kind of, it's this short, clear, down-to-earth, almost manifesto, talking about the importance of clear thinking in philosophy would be a sort of vague way to put it. And it really had an influence on the way I think. I think I'm more likely now to do things like, if I believe something,
Starting point is 00:47:28 like let's say I believe so-and-so is unfair or biased or let's say I believe rationality works or whatever belief I have, I'm more likely to ask myself to make myself cash out that belief in terms of a concrete prediction.
Starting point is 00:47:44 What do I expect to see differently in the world because this claim is true? So forcing myself to get really concrete to the point where my belief could potentially be disproven by evidence or at least made less likely by evidence. And this, I think, came out of reading Ayer's Language, Truth, and Logic. He was basically, he was annoyed at the way philosophers would debate these kinds of empty, meaningless questions.
Starting point is 00:48:09 And he was pointing out, like, a lot of these questions could not be cashed out in terms of concrete predictions. Right. You can't refute them or... Yeah, they're sort of too empty. There's this expression, not even wrong. Like, I'm not calling your paper, your claim, wrong. It's not even wrong. It doesn't even make enough sense to be wrong.
Starting point is 00:48:30 And so a lot of people think that Ayer kind of went a little too far in his condemnation of philosophy, and I would agree. But I think it's still a very useful sort of bracing splash of ice water in your face and like a good sort of habit to have in the back of your mind when you're thinking about things and evaluating claims. And it's an easy read, as I said, very short. I think you can just get it online. Awesome. I'll check that out. Yeah, so that's one. I would also, so obviously the work about human irrationality has been very influential to me. It's hard to pick a single book about that. I've already named Daniel Kahneman's
Starting point is 00:49:09 book Thinking Fast and Slow, which is... And Cialdini's? Oh yeah, and Cialdini's. Ah, I've already kind of answered your question. But I'll give you a couple more briefly. Oh, the other book about rationality that I would recommend is a new book by Phil Tetlock. It's called Superforecasting. And, you know, the thing that I said towards the beginning of the podcast, that the motivation for founding CFAR was that there really wasn't much research at all on how do we overcome these biases. Well, Superforecasting is an exception to the rule. Phil Tetlock has been doing amazing work, studying interventions to try to make people better at making predictions about things, better at evaluating arguments, that kind of thing.
Starting point is 00:49:52 And this book explains what he did and gives some really pretty impressive results showing the effects of these interventions. Basically, Phil Tetlock's team of forecasters outperformed by a huge margin, all the other teams of forecasters and experts in politics and economics who are trying to make predictions about world events.
Starting point is 00:50:16 And that's because Phil Tetlock's team was using these techniques. So it's a pretty inspiring story, plus being full of good data and examples. And then I guess the last thing I want to recommend is it didn't really change my thinking just because I was already kind of immersed in this subject, but it's the kind of thing that would have changed my thinking and really inspired me if I'd read it a little earlier. It's a book by a friend of mine, Will MacAskill. It's called Doing Good Better. And it's basically making, it's basically applied philosophy.
Starting point is 00:50:50 So the premise of the book is that the way that people instinctively, kind of by default, try to help the world is really ineffective and inefficient. And it's driven by these kinds of system one sort of intuitive heuristics that aren't always that epistemically rational. So, for example, people are more likely to give to a charity if they see a picture of, you know, a starving child on the cover of the brochure. And that's very understandable. It's very, it like really taps into our emotions and motivates us, but that often has no connection to whether the charity is actually effective at helping people and saving lives and improving the quality of life. So, effective altruism is, I mean, simply put, it's just the idea that for whatever level of money or time or
Starting point is 00:51:41 effort or sacrifice you want to put into trying to help the world. And we all have different levels that we're comfortable with. But for whatever level, there are vastly better and worse ways to do it. There are literally, like, orders of magnitude of difference in the effectiveness of different charities trying to do the same thing. So just by kind of stepping back from your automatic emotional reactions and asking yourself, okay, what does the evidence show? You can just, you can save a hundred times more lives with the same amount of money that you donate to charities over your lifetime if you choose your charities well. That's sort of the tip of the iceberg. But it's, I like it. It's inspiring. It's sort of a great example of rationality
Starting point is 00:52:22 being used to make the world a better place. That's an awesome way to think about this as we head into the end. Listen, I want to thank you so much for your time today. And this conversation was fascinating. Thanks so much for having me on. Hey guys, this is Shane again. Just a few more things before we wrap up. You can find show notes at Farnamstreetblog.com slash podcast. That's F-A-R-N-A-M-S-T-R-E-E-T-B-L-O-G dot com slash podcast. You can also find information there on how to get a transcript.
Starting point is 00:52:58 And if you'd like to receive a weekly email from me filled with all sorts of brain food, go to Farnamstreetblog.com slash newsletter. This is all the good stuff I've found on the web that week that I've read and shared with close friends, books I'm reading, and so much more. Thank you for listening.
