Life Kit - How to counteract common thinking traps

Episode Date: September 15, 2022

Humans have a tendency to make snap judgments and assumptions due to our cognitive biases, says Woo-kyoung Ahn in her book 'Thinking 101.' So how do we fight them?

Transcript
Starting point is 00:00:00 This is NPR's Life Kit. I'm Elise Hu. Most of us have probably never taken a class that's just about thinking, or more specifically, thinking traps. But Woo-kyoung Ahn teaches one in her role as a psychology professor at Yale. Her class focuses on the common cognitive biases we fall into, sometimes without realizing it. She likes to start out one of her lectures by showing her students a BTS dance clip. All right, what I'm going to do is I will play the original version first, okay? And then we'll get to the slowed-down version, okay?
Starting point is 00:00:38 See whether you can do it. So her students watch this short clip over and over again, trying to remember all the basic steps. The one that I used is supposed to be the easiest one. And I only used six seconds of that video. And I played that like 10 times for the students. And I warned them that they can come out and show the dance to the whole class, and they will get a prize for it. You feel like you can do it.
Starting point is 00:01:07 Even I feel like I could do it. I'm not a dancer at all. But after that, they come out and don't look at the screen anymore. I only play the music. They face the audience. And then, of course, none of them could do it. The experiment with her students illustrates the fluency effect, one of the many cognitive traps Woo-kyoung Ahn warns us about.
Starting point is 00:01:28 It's that almost anything that looks very fluent and skillfully executed creates an illusion that it's easy to do. Of course, it's not just about fooling ourselves into thinking we're better dancers than we actually are. Our cognitive biases can seriously cloud our judgment and sometimes lead to bigger societal issues, like thinking hydroxychloroquine is a COVID cure, for example. So one question I have is, does knowing about these biases and the tendencies that we have to fall into these errors do anything to prevent them?
Starting point is 00:02:12 Well, I don't think that's a sufficient solution, but it is actually a necessary step, right? So awareness is a first step. Yes, exactly. We should be aware of it, but I don't think it's a magic wand. Of course not. Okay, step one, learn your biases. On this episode of Life Kit, Thinking 101, Professor Ahn walks us through some common thinking traps she writes about in her new book and offers helpful advice on how to avoid them. Hey, Life Kit listeners. Andee Tagle here to spread the word about our new special series, Dear Life Kit. It's an advice
Starting point is 00:03:05 column for your ears, and we're getting personal. Every episode will enlist expert advice for one of your most pressing and intimate anonymous questions about life, love, and how to keep it together, all in about 10 minutes. New episodes every Saturday until October 8th. Listen to Dear Life Kits from NPR. Okay, let's get into some of the biases that really have major consequences, not only for us individually, but also as a society. There are two particular thinking traps that I want to focus on because they could have severe effects. The first is negativity bias, which you write makes us act totally irrationally sometimes or just leads us to make the wrong choices. So what is it? And give me an example. So the negative bias is the loss, the negative information actually weighs a lot more than the equal
Starting point is 00:04:09 amount of positive information. Okay. So give me an example in my daily life. I remember you had one about kind of when we are shopping and looking at ratings. Right. Right. It's like loss aversion, too. So let's say you order something online, and it said free returns, no risk at all. So you just order it. You're not sure about the length or the fit. It arrives, and now you try it on. It kind of belongs to your house now. It's kind of yours, and you can't return it, just because it is yours now. You own it at this point.
Starting point is 00:04:49 Returning it would mean losing it. And that feels like a huge, huge loss to you. Why is it that we cling to things that we already have and are kind of scared to lose them? So some people make an evolutionary argument about this. They say we evolved to be a lot more sensitive to negativity than positivity because our ancestors lived in an environment where resources were very scarce. It was just a matter of life or death all the time. So losing something at that point, you know, should be something that we care a lot about.
Starting point is 00:05:33 Gaining something, it's kind of a luxury at that point, right? So that's why we are a lot more sensitive to losses. So it does make sense. But in the current environment, that's not the case for most people. So because we tend to have loss aversion and filter for the negative, how do we counteract that? I try reframing the questions. So there's a study. It involves a custody decision between two parents. Parent A is like a
Starting point is 00:06:07 parent with all B grades, just average on everything. Parent B has very good features and very bad features. And if we ask participants, who would you offer the custody to, then they choose the second one because they focus only on the good features. And then if a different group of subjects were asked who would deny the custody from, then they'd also choose the second one because they focus on the negative ones. So one of the ways of avoiding the negativity bias would be to reframe your question. Instead of just saying who would you deny the custody, maybe you might be to reframe your question. Instead of just saying, who would you deny the custody? Maybe you might want to reframe it as who would you give the custody to,
Starting point is 00:06:51 and then kind of average things out. So it's like ground beef, right? We can say that, oh, it's 15% pure fat, but it's 85% lean, right? So that's one way of getting over it. Yeah, so when I'm shopping and I see 85% lean, I'm like, oh, that's good enough. That's lean enough. But you're saying that if it was advertised
Starting point is 00:07:14 as 10% pure fat or 15% pure fat, that really reframes things. Exactly, yeah, yep. The next big bias or thinking trap I want to focus on is confirmation bias. And this came up a lot, obviously, during election years especially. You argue that confirmation bias is probably the worst of all the biases. What is it, real quick? So there are two kinds of confirmation bias. One is a tendency to seek evidence that would confirm your hypothesis. The other is a tendency to interpret evidence to fit with what you believe.
Starting point is 00:07:55 So it's the same kind of bias, right? Trying to show you're right. And it matters at two levels. One is about being fair to yourself. So here's one study that I did. I gave participants some fake genetic information about themselves. This is all IRB approved. It's ethically conducted.
Starting point is 00:08:20 So they performed a saliva test where they were asked to rinse their mouth with a mouthwash. And that mouthwash actually contained a large amount of sugar. And then they had to put this test strip under their tongue, and the color changes because that test strip is just a diabetes strip. And when the color changes, we randomly assign them into one of two conditions. In one condition, we tell them that it means you don't have genetic risk for major depression. In the other, we tell them the color change means that you have elevated levels of genetic risk for depression. There were hundreds of participants.
Starting point is 00:09:03 There's no reason why one group would be particularly more depressed than the other group. And then I administered the Beck Depression Inventory, which is a measure of depression. And we asked them how depressed they were in the past two weeks, how pessimistic they were, how well they slept, and so on. And then the group who were told that they don't have a genetic risk were way below the cutoff for depression. But the people who received the positive test result were way higher than the cutoff for major depression. That is, this three-minute fake saliva test. Made them think. Made them depressed. Created a major depression.
Starting point is 00:09:46 Yes, exactly. That's what I mean by how confirmation bias can be unfair to you. Because once you learn that you have a genetic risk for depression, when you think about your past two weeks, you may retrieve only the evidence that fits with that hypothesis. Yeah. So you're shortchanging yourself. Exactly. Exactly. And once you start believing that, then moving on, I mean, of course, we had to debrief them right away, explain that this is all fake and so on. But if this weren't a study, if in real life they actually got that kind of genetic feedback, they might move on. They might go on thinking that I am actually a depressed
Starting point is 00:10:32 person. I can see how a teacher telling me in elementary school that I wasn't good at sports then made me filter and make me shortchange myself and think that, oh gosh, I'm never going to be an athlete, right? Or this can happen in a bunch of different permutations. Exactly, exactly. So that is what I mean by how the confirmation bias can be unfair at the individual level. And how could better thinking be fairer at a more collective level? Right. So if you hire, for instance, only the male people for
Starting point is 00:11:08 top-level scientist jobs, you know, if people believe that only men can be great scientists, only men have a better, you know, intrinsic aptitude to do science, then you don't end up hiring female scientists. And that's exactly how prejudice and stereotypes get formed in society. But look who discovered all these COVID vaccines: female scientists. There's no guarantee. We should be open-minded that there can be other possible causes. So what would you say that we should remind ourselves when trying to explain the why of something, or when we make quick judgments or decisions? What could we do to counteract some of the most common traps? For confirmation bias, you should always think about the other possible counterfactual cases, right? Always think, yeah, what if? Would it have been the case if the person was female, and so on.
Starting point is 00:12:14 I mean, confirmation bias is a really difficult one to get rid of because it is actually a very adaptive mechanism too. So for instance, when our ancestors were hunters and gatherers, they would go to a forest and find very good berries. Would you try a different forest to try to falsify your hypothesis? Or would you stick with the same hypothesis and go back to the same forest, right? It's confirmation bias, but it's actually also an adaptive system. What is interesting about confirmation bias as it's described in the popular media is that it feels like something devious, something that, you know, only very bad people do just to prove that they're right. No, that's not the case. It's
Starting point is 00:13:03 just ingrained in our cognitive system. It's a very cognitively efficient system, right? Because we're not born to be scientists. We don't need to find out the truth about everything. But medically, I don't want to try to take all the drugs in order to figure out which one works for me. Exactly, exactly. And my husband, you know, he has worked well
Starting point is 00:13:28 in my life for 27 years. I'm not going to try another husband to falsify the possibility. Very costly. Yes, yes. But in other cases, it can go wrong, right? Like, you know, you might also limit the possibilities in your life. For instance, you might go to the same restaurant and eat the same food. Yeah, yeah, yeah. Same thing over and over again.
Starting point is 00:13:51 So for the things that are not as risky, you might want to just kind of do a random search, just for fun. So you can just take a different route
Starting point is 00:14:02 to go to work. Why not, right? Before we let you go, Woo-kyoung, are there any basic questions that we could ask ourselves if we feel like, hey, I might be falling into a cognitive trap and I want to counteract it? One of the things that I sometimes do when I get overly anxious or overly stressed out,
Starting point is 00:14:22 I just take the drone perspective. I pretend that I'm watching myself from a drone. And it really kind of puts everything into perspective. I kind of see, oh yeah, there's the other side of the story too. Or, this is actually nothing, you know, what I'm worried about is really nothing compared to everything else. We could also think about alternative scenarios as well. In many cases, that helps with, you know, overly narrow judgments and being overly confident in your judgments. Because if you broaden your perspective, then yes, you realize that, oh yeah, maybe there was a different side of the story as well. Yeah. And for the bias that
Starting point is 00:15:11 we think that we might be good at an easy BTS move, how do we counteract that? You have to just try it. You have to dance it. That's easy. That's the easiest one to start out with. Do it. Woo-kyoung Ahn is a professor at Yale University and the author of Thinking 101, which is available now. Thanks so much, professor. Thank you. For more Life Kit, we invite you to check out our other episodes. I did one on dealing with regret, which has a lot of tie-ins to this episode, and another on the basics of caring for your skin. You can find those at npr.org slash life kit.
Starting point is 00:15:53 And if you love Life Kit and want more, please subscribe to our newsletter at npr.org slash life kit newsletter. And now, a random tip from one of our listeners. Hey, this is Hannah from Austin, Texas. Something I always do if I have a scratchy throat is, after I'm done steeping my tea, when I would normally mix honey in or something of that nature, I also drop in my favorite cough drop. And I'll just stir that in. It'll melt. You just get that nice soothing feeling as you finish your cup of tea. If you've got a good tip, please leave us a voicemail at 202-216-9823. That's 202-216-9823. Or just email us a voice memo at lifekit at npr.org. We love your tips.
Starting point is 00:16:38 This episode of Life Kit was produced by Michelle Aslam. Our visuals editor is Beck Harlan. Our digital editor is Malaka Gharib. Meghan Keane is the supervising editor. Beth Donovan is the executive producer. Our production team also includes Andee Tagle, Audrey Nguyen, Claire Marie Schneider, and Sylvie Douglas.
Starting point is 00:16:57 Julia Carney is our podcast coordinator. Engineering support comes from Trey Watson, Stu Rushfield, and Patrick Murray. I'm Elise Hu. Thanks for listening.
