Ideas - What you should do when accused of being biased

Episode Date: April 24, 2026

All of us are biased. We have individual biases, momentary biases, morning biases and evening biases. Our institutions are biased. Our constitutions are biased. So what to do about it? IDEAS producer Tom Howell explores the art of naming your most important biases — and deciding which to keep, as he continues his investigation into what the field of ‘bias studies’ has to offer us. *This episode originally aired on Feb. 3, 2022.

Guests in this episode:

Olivier Sibony is the author of Noise: A Flaw in Human Judgment and You're About to Make a Terrible Mistake.
Jessica Nordell is the author of The End of Bias: A Beginning.
Jimmy Calanchini is assistant professor of psychology at University of California, Riverside.
Jack Nagler was the ombudsman at the Canadian Broadcasting Corporation.

Transcript
Starting point is 00:00:01 The Powering Politics Podcast is available six times a week, but you might not be. If you want to catch up on what happened this week in politics, join me, Laura Dangelow and some of Canada's most tuned-in political strategists to break down the week that was. Short on time? The weekly wrap has you covered with a new episode every Saturday. This is a CBC podcast. Welcome to Ideas. I'm Nahlah Ayed. However hard we try to be objective, our interpretations of facts and figures are always subject to our biases. Bias is by far issue number one in terms of what people complain about.
Starting point is 00:00:43 How can we change something that we don't even know we're doing? Over-precision. I overestimate the degree of confidence I can have in my estimates and forecasts. What good does that do you at the time you need to make the decision? I still think there's a very strong argument to be made to de-bias your perspective so that you get things right more often than not. There's no time. There isn't time to actually find out everything. On today's episode, what a free-minded individual can do when accused of being biased.
Starting point is 00:01:18 To clothe anybody who disagrees as a moral pervert and a racist while confirming the view that benefits economically the liberal elites, right? It's just appalling. The self-serving bias. I genuinely believe in viewpoints that happen to coincide with my interests. Ideas producer Tom Howell continues his investigation into what the field of bias studies has to offer us. Maybe there are practical steps a biased person should take to become less biased. Or maybe spotting one's own prejudices is actually the key here,
Starting point is 00:01:56 even if one makes no effort whatsoever to change them. Status quo bias. I tend to avoid making a new decision and maintain the status quo by default. Is that a problem? In the first episode of this series, Tom encountered myriad ways of defining, analyzing types of bias. He now turns to considering which ones he should try to combat. This episode is called The Bias List. The point of a bias list is it's personal.
Starting point is 00:02:31 My bias list is different from your bias list, is different from my colleague's bias list, is different from my wife's bias list. All right. So. I recently found a book called You're About to Make a Terrible Mistake, How Bias Distort Decision Making and What You Can Do to Fight Them. Confirmation bias slash storytelling. I pay more attention to facts that support my hypotheses and neglect those that would disconform. them, particularly when my hypotheses are organized in a coherent narrative. The book contains an appendix with all the major species of bias labeled and grouped into five families. I rewrote their definitions in the first person, so they take the form of a confession, and then got my wife Linda to read them out loud.
Starting point is 00:03:21 Experience bias. I reasoned by analogy with situations from my own experience that easily come to mind. I mean, actually, I'm not sure that I spend that much time reasoning by analogy, because I know analogies are a bit stupid, like two things are not going to be perfect equivalence. I also gave this list of confessions to my colleague, Nahit. Okay, confirmation bias, storytelling. I pay more attention to facts. I wanted to see if the people in my life could, through sheer introspection, identify the biases they are most prone to.
Starting point is 00:03:57 coherent narrative. I hate to admit it, but that's probably true. I probably do do that. My hypothesis, which on a good day can form part of a coherent narrative, is that each of us ought to be capable, with help from others, of naming our own most important biases. Champion bias. I give too much weight to the reputation of the messenger versus the value of the information he or she bears. I mean, that I think, is unfortunately very true. Because I don't know. Like, how am I supposed to know whether things are true? Within the universe of possible biases, some surely affect one individual more than another.
Starting point is 00:04:40 Experience bias. I reasoned by analogy with situations from my own experience that easily come to mind. Definitely do that. Definitely. Would it not be useful to have on hand a list of just your very own, your personal top 20? or 25. Not so many is to be unmanageable, not so few as to be unambitious. This list could be a great help in one's endeavors to be fair-minded to others. And bonus, it'll save some work in that endeavor because if a particular bias doesn't make my list, it means I don't need to worry about
Starting point is 00:05:10 that one. Champion bias. I give too much weight to the reputation of the messenger versus the value of the information he or she bears. No. I am too cynical for that. And I actually have the opposite problem that if somebody has a really stellar reputation, I tend to disregard what they're saying more easily. Well, as it so happens, I spoke to somebody with a stellar reputation just the other day. Hi, I'm Olivier Sybony. I'm a professor of strategy at HEC Paris, which is a business school, and I'm the co-author with Danielle Canerman and Kassamstein of Noise, a Flow in Human judgment. Olivier is also the author of the aforementioned you're about to make a terrible mistake with its taxonomy of the 24 key biases.
Starting point is 00:05:58 And as it turns out, he thinks my idea of trying to perform a sort of bias MRI on myself is a terrible one. Yes, I have a big word of caution for you. It doesn't work. You can try. I mean, it's intellectually interesting. But it's awfully difficult to be aware of your own biases. that's the whole point of biases, if you think of it, right?
Starting point is 00:06:25 I mean, we all know that we shouldn't make the same mistake twice, and we all learn from our mistakes. We were three years old, we go to the oven, we touch the door, we get burned, we remember not to touch the oven door. And that's how we learn. We make a mistake, we understand the mistake, we don't make the mistake again. What makes biases different from plain old mistakes is that In fact, it's very difficult to realize that the reason we are making a mistake is because of a bias.
Starting point is 00:07:00 Halo effect. I form a general impression of a person, company, etc. Based on a few salient features and let that impression. Halo influenced my assessment of unrelated features. Yeah, probably. No, because I do know enough people, good people that have done terrible things. I mean, there's no time. There isn't time to actually find out everything and make informed decisions about what you think about stuff.
Starting point is 00:07:33 Like it is basically your general sense of. Survivorship bias. I draw conclusions from samples that include success. You draw conclusions from samples that include successes but exclude failures. That's probably true. I overestimate my relative abilities, i.e. how much better than others I am. Oh, definitely not. Definitely not.
Starting point is 00:07:56 Too much self-esteem. The problem is that at the time you're making your decisions, if you look at your list of 25 biases, some are going to tell you, oh, that's a bias that would push you in the direction of doing it. Be careful. Others are going to tell you, that's a bias that will push you in the direction of not doing it.
Starting point is 00:08:16 Be bold. Are you a victim of overconfidence? Or are you a victim of status quo bias? The two pull you in opposite direction. how would you know? So this whole idea of, you know, coming up with a list of biases and trying to use it whenever you have a decision to make seems to me like a bit of a fool's errand, to be honest.
Starting point is 00:08:39 Thanks. Maybe that's just your... Sorry. Okay, but, are you then saying... Like, there are some people, you know, I'm accused of bias from all sides of the political spectrum, for instance. Do I just say to them all, well, you never know. There's no way of knowing.
Starting point is 00:08:55 which bias I'm under at any given time. There's no point saying I have a, I need to be more cautious about my, I don't know, my bias towards English majors and my bias against statisticians. Like, are there not persistent biases that characterize certain people as opposed to other people? It seems to me that there must be. There could be. And you could become aware of them, probably more from the feedback of others than you would from introspection.
Starting point is 00:09:21 But if you, you know, if your colleagues told you, after dozens and dozens of recruiting conversations, that, gee, Tom, you always seem to be biased in favor of English majors. And whenever we have an engineer, you seem to have a really negative stereotype of engineers and never want to hire them. You might want to reflect upon that. It's very hard to do this yourself, because to do this yourself, you would have to realize that you've made a mistake by being biased in favor of English major.
Starting point is 00:09:55 in our example. How would you realize that? How many English majors do you have to hire? How many of them have got to underperform? How many reasons do they have that can lead them to underperform that have nothing to do with being an English major? How would you become aware of that? It's very difficult.
Starting point is 00:10:15 So I'm less excited about this approach than I used to be. I used to be a believer that you could do this. I'm now much more in favor of what we call in this book, No, is decision hygiene, which is to try to make your decision process less susceptible to your biases whatever they are. It is alleged to be futile, this project of building a pan-opticon for the biases operating within oneself. It is alleged that one should give it up in favor of better decision hygiene. of commitment, sunk cost fallacy. I double down on a failing course of action, notably because I do not treat the resources previously invested as sunk costs. I mean, yes, this describes my entire career as a writer. Resource inertia. I am timid in reallocating resources to reflect my stated priorities, especially when these priorities change suddenly. I actually think I'm pretty good at that. That's because I've been raising kids my whole life.
Starting point is 00:11:24 Um, escalation of commitment, sunk cost fallacy. Oh, no, poor action. It's failing. A nailing course of action, notably because I do not treat the resources previously. But I put so much time and energy into it. No, I don't think that one's a problem for me. I mean, are those resources really sunk? Is this irrecoverable effort? I think I'm pretty quick at abandoning something if it's not working the way I think it needs.
Starting point is 00:11:52 It's like I'm pretty quick at pulling the plug on stuff. Okay, quickly. Help me do it. How do I? I want decision hygiene stats. How do I do it? Well, a couple of things. First of all, don't decide alone. Whenever you have a decision that you can make with other people, make it with other people, try for the judgment of these different people to be as independent as possible of one another and not to influence one another. Multiple independent opinions is one. piece of decision hygiene. Another piece of decision hygiene would be to break down the problem
Starting point is 00:12:29 into discrete components and to try to evaluate those discrete components as independently of each other as possible. There are other ways to have decision hygiene. One way would be to have guidelines, to have rules, to have models and formulas that tell you how to make some specific decisions that are frequent recurring decisions. But, you know, multiple judgments... This is reaching for a yardstick, right? This is the problem that we don't have a yardstick, so we're going to try and conjure one up using various mechanisms and measurements. Exactly, exactly. To come up with the scale that you agree about,
Starting point is 00:13:03 the yardstick, as you say, I like the phrase. My name's Jack Nagler, and I'm the ombudsman at CBC. Meet Mr. Yardstick. At its core, the ombudsman is someone who represents the public. And in this case, when you're talking about being a news ombudsman, I'm that intersection between journalism and the people. So I'm there to represent and advocate on behalf of the public to make sure that in this case, CBC does journalism that's fair and balanced and accurate.
Starting point is 00:13:32 And I'm also an advocate for the crafted journalism trying to explain how it works sometimes to the audience, because there's not always an easy understanding between the two sides. Right. So you also have to decide whether the audience person who's complaining to you is wrong. Well, so when someone files a complaint, there are certain cases where I end up doing a public review, and I assess the journalism against the complaint and through what CBC calls the journalistic standards and practices, and ultimately come down and say the journalism did or didn't adhere to the kind of standards that were ethical and appropriate that we want. So, you know, most of the time, as in many disputes, it's not about deciding that one person's right and the other person's wrong. It's somewhere on that spectrum in between, and you're trying to figure out where it fits. Right.
Starting point is 00:14:19 Sometimes the journalists are mostly right and the complainant is mostly wrong. Is it a fun job? Look, when you're dealing with people who are unhappy about something and feel passionately about something, fun is not the first word that comes to mind, perhaps not even the first F word that comes to mind. But it's an important job. It's a job I think is incredibly important. I wish that more media organizations had this kind of role because journalism is a big part of people's lives. They rely on it. They count on it. And they're frustrated as heck by it many, many times.
Starting point is 00:14:53 And there isn't an outlet for them to help understand why their frustrations either are or are not valid. And so if I can be an honest broker and help people understand things better, that's a really valuable service. Does anyone complain about bias at this organization? Does anyone not complain about bias? No, honestly, you know, bias is by far issue number one in terms of what people complain about. And particularly now, the question of how much people trust the media and whether they feel that mainstream media in particular have a particular agenda when what they should really be is and choose your word of neutral or objective or impartial, all of those different terms come up. But they're frustrated when they hear stories that don't align with their views on the world or the frame. through which they see things. And so they often accuse media organizations of bias. And that's
Starting point is 00:15:48 hardly unique to CBC. That's commonplace everywhere. And so what is your philosophy on allegations of bias? You have a general sort of attitude to it. You feel that some of the points are legitimate. You feel that you would say that there are biases that are inevitable? or... Look, in journalism, you're dealing with people, right? It's not to this point, with few exceptions, it's not robots who are conducting journalism.
Starting point is 00:16:19 And people have biases based not just on their own beliefs, but on their own experiences, their upbringing, their ethnicity, their cultural frames of reference. Everybody brings biases to the table. I think fundamentally, the first thing is that journalists need to be aware of those biases and then figure out which ones are benign and which ones actually need to be confronted. I love that you think that's an important job because every expert on bias I've talked to has told me this is impossible.
Starting point is 00:16:45 You cannot become aware of all the biases that are affecting you and detect which ones to hold on to and which ones to get rid of. No, well, I'm not saying to get rid of them. And I'm also not saying that anyone has insight into all of their own biases. So I'm not contradicting anything that you've heard. But what I'm saying is that it's important for journalists to recognize that they inevitably have biases. So they need to do something to make sure that those biases don't distort their ultimate objective, which is essentially to pursue the best obtainable version of the truth, as Watergate's Carl Bernstein calls journalism. If the mission is to try to really understand a circumstance and to explain it to their audience,
Starting point is 00:17:33 you want to be able to make sure that whatever your biases are, that they don't take you off the road toward truth. It's the existence of this road, and it's the step from, sure, I have biases to I have these biases and these biases are the problematic ones. It's that step that I find the most intellectually challenging, because I haven't met anyone yet who professes to have no biases, but neither have I met anyone who can say, and here they all are. Right.
Starting point is 00:18:07 But in a sense, if you are open to it, it's not an imperative that you need to have insight into every single one of your biases in order to confront them, right? Because you take the advice and the input and the feedback from people around you who can help you confront them. So if you're a journalist, you want to make sure that you speak to a variety of people with different perspectives on the topic you're covering. You want a newsroom that's diverse in all the different ways from, you know, race and age and gender to ideological partisan affinities
Starting point is 00:18:41 and get those different kind of inputs to help you realize where maybe you're a little bit short-sighted and you don't realize that you have a bias. But, you know, when I spoke about the biases that you should worry about in the ones that you don't, I mean that in some ways, you know, bias is ultra, it can also be a synonym for core values. Those aren't necessarily bad things. You know, we have a bias in the media, you know, in this country, whether it's CBC or elsewhere, that democracy is better than tyranny.
Starting point is 00:19:14 That's a bias, even though there's nobody out there, or very few people out there, I should say, that would actually question that assumption. So, you know, there are certain biases that are actually important to kind of define what you do and set a framework for what you do. But it's the biases that, you know, where you don't realize you're putting your finger on the scale of how the audience might judge something that does have a variety of different perspectives on it. And you need other people to sometimes give you a shake. And you need, but it starts with that recognition that, yeah, I probably have biases. And I need to be open to this stuff to make sure that I'm not letting my preconceived notions influence the work too much.
Starting point is 00:19:54 When I write an opinion resolving a case, I read every word from the perspective of the losing party. I ask myself how I would view the decision if one of my children was the party that I was ruling against. Even though I would not like the result, would I understand that the decision was fairly reasoned and grounded in law? That is the standard that I set for myself in every case, and it is the standard that I will feel. follow so long as I am a judge on any court. The U.S. Supreme Court Justice Amy Coney-Barritt recently said that she thinks justices must be hyper-vigilant to make sure they're not letting their personal biases creep into their decisions since judges are people too, which sounds like one of those things who go, well, yeah, sure.
Starting point is 00:20:47 But I'm just wondering, if you switch that judges to journalists, would you support that statement that journalists need to be hyper-vigilant to make sure they're not letting their personal biases creep into their decisions since journalists are people too. Is that fair statement? I think that journalists need to be hyper-aware that they have biases and take the steps that they can to make sure that those biases don't divert them from their objective to try to actually tell a story that is fair and balanced and accurate. You know, to be vigilant and be questioning yourself every step of the way, you know, is not the only solution. But you need to be aware of something and hopefully your news organization has methods in place. You know, there is a methodological
Starting point is 00:21:34 approach that you can take to make sure that there's a rigorous vetting process, you know, and at CBC, as I'm sure in many other places, people don't just go on the air and say whatever the heck they want. They actually talk with their producers and say, well, this is the story we're going to tell this, they'll write a script before they go on television or on the radio, and say, okay, this is what I'm going to say. And people have really vigorous debates about every word. You know, I would also draw a distinction, of course, between judges and journalists. They're similar. We expect them to be fair-minded. We expect them to be accurate and we expect them to examine a situation carefully. But ultimately, the judge is trying to figure out how does
Starting point is 00:22:15 the law, which is there in writing, apply to the circumstance that you're in. And the journalist's task is to try to figure out what actually is happening in this circumstance and explain it to the audience in a way that will help them understand their world a little bit better. You know, those are those are not completely dissimilar but they're not the same thing. Yeah, journalists have even less of a book to work from than a judge says, I guess. Last question. Sure. Is media bias a big problem in Canada? Yeah, sure. Throw me the easy one at the end. Look, I think that there is such a thing as advocacy journalism.
Starting point is 00:22:57 It's not the kind of journalism that most people think of right away when they just think of the J-word. And it's not the kind of journalism that should be practiced by the public broadcaster. So advocacy journalism is not the thing here. And bias, if it exists, is problematic. I think it would be naive to say you can squelch media bias. it's, you know, there's a way to eliminate it altogether. But I think by acting in good faith and having the proper methodology and being aware and responsible and trying, you know, holding yourself up against values such as impartiality
Starting point is 00:23:31 and fairness and balance, that you can mitigate it to the point that people will trust you. All right. Thanks. There's lots to chew on. And it does sound like, yes, that's quite a position to be in to decide whether someone has done that or not. When you yourself, you have, you know, quote-unquote biases such ways of judging things. So that must be tricky to factor that in and work out if you're favoring journalists because you know their job better or then. Right. And yeah, I mean, that's the art of the job. I try to, well, people will always keep me humble, but I try to be humble about it to begin with, recognizing that, you know, there probably isn't a clean answer for complaint X. Someone's got to decide, was this okay or not? And so I will use my best judgment. people can judge me or not however they choose.
Starting point is 00:24:20 There's no ombudsman, ombudsman. I mean, that would be a pretty funny job type. Especially if you were the ombudsman ombudsman. Well, that's true. They remind the ombuds person. Yeah, okay. So I think we have a WNA episode ready to go. All right. Well, thank you very much for dog, Nemea, Jay.
Starting point is 00:24:37 You're welcome. I hope this was helpful, Tom. You're listening to a documentary called The Bias List on CBC Radio 1 in Canada, across North America on Sirius XM, in Australia, on ABC Radio National, and around the world at cbc.ca.ca slash ideas. You can also hear ideas on the CBC Listen app or wherever you get your podcasts. I'm Nala Ayad. On Big Lives, we take a single cultural icon. People like Jane Fonda, George Michael, Little Richard.
Starting point is 00:25:14 And we pull apart the story behind the image. And we do this by digging through the BBC. sees vast archives. Discovering forgotten interviews that change exactly how we see these giants of our culture. We're here for the messy, the brilliant, the human version of our heroes. I'm Immanuel Jochi and Kai Wright. And this is Big Lives. Listen to Big Lives wherever you get your podcasts. There is always a well-known solution to every human problem, neat, plausible, and wrong. The line comes from Henry Lewis Mencken's essay collection, prejudices. Mencken was an American journalist who loved the arts, but despised democracy and Jews.
Starting point is 00:25:59 So we must approach his words with caution. But his warning about the neat answer sums up a challenge when identifying your own bias or prejudice. Here's author and decision-making expert, Olivier Sybony. It's actually very hard in practice to have a person. true statistical base of which to determine that you're biased in a particular direction. If three of your friends tell you you're politically biased, you always see things in a particular way, well, maybe it's actually these three friends of yours who have a different political leaning, right? For a statistical base of judgments to be wide enough that you can
Starting point is 00:26:42 actually learn something for it, it's actually quite true. rare in practice, which is why trying to learn from your own mistakes and trying to determine the biases that have led you to make mistakes in the past and trying to learn from that is quite hard. Today's episode is the second attempt by Ideas producer Tom Howell to solve the problem of how to check his own biases. And I want my solution to be neat, plausible, and correct. My name is Jessica Nordell.
Starting point is 00:27:29 I'm a science and culture journalist, and I'm the author of the new book, The End of Bias, A Beginning. I wanted to indicate that I was interested in solving the problem and not just sort of discussing the problem. I mean, it's kind of, it sounds a little bit like a paradox sometimes. Like, well, if it's unconscious, how can it be changed? How could, you know, how can we change something that we don't even know we're doing? Jessica Nordell encountered the problems anyone runs into when trying to, quote-unquote, tackle bias. If you want to tackle an opponent in a game of rugby, you need to start on solid ground and you need to identify which opponent you're going after, including how big they are and their approximate speed.
Starting point is 00:28:14 When tackling bias, it's exactly the same situation. Analogies are a bit stupid. Like two things are not going to be perfect equivalent. Okay, well, it's a somewhat similar situation. You know, this was a real challenge for me, as I was working on this project, about trying to tackle the problem and understand what solutions there could be for unconscious bias. I really wanted to understand if there was a way to quantify the impact of these kind of everyday biases.
Starting point is 00:28:43 What I ended up doing was partnering with a computer scientist to develop a computer simulation of gender bias in the workplace. And so what we did was we developed this fake workplace, a virtual workplace we call Norm Corp. And in Norm Corp. Yes, exactly. And so in Norm Corp, we have individual employees who carry out individual projects and then they're evaluated and they get a score that determines how promotable they're seen as being. Into this computer simulation, we factor in a number of the really common kinds of biases that women face at work. So there are these really common patterns like women's work being valued slightly less than men's work. Or when women and men work together on projects, women get a little bit less credit than men for those
Starting point is 00:29:42 project. So there are about a half a dozen different biases that we factored in. Then we ran the simulation as though it were going for years. In our simulation, we had individuals who did projects, got evaluated, their score was adjusted based on how they were evaluated, and then every so often in the simulation, the top scorers would get bumped to the next level up. So it was kind of like a real-world sort of promotion scenario. And so the way that we evaluated how the biases were being accumulated over time was to look in the computer simulation at what the gender ratio was at different levels of the corporate hierarchy. In our simulation, our workplace had eight levels of hierarchy.
Starting point is 00:30:42 Women were just evaluated with a 3% bias. Their work was devalued by 3%. Or they were given 3% less credit, for instance. 3% sounds quite conservative. Yes, I think it was pretty conservative. After we ran the simulation for, I believe it was 10 years, we ended up with a gender ratio at the top of 87% men. The simulation really just,
Starting point is 00:31:15 just looked at the iterative, cumulative effect of all of these different evaluations. And as I mentioned, I think there were a lot of different kinds of biases we factored in. We know that in the real world, when people experience unfairness, they sometimes speak out against it, but also that women are more penalized when they're seen as being self-promoting. So one of the biases that we factored in was that 10% of the women who experienced discrimination spoke out, and then they were penalized. a little bit for being self-promoting. You know, another bias that we factored in
Starting point is 00:31:51 that also has a lot of empirical data behind it is that as the ratio of women got smaller, the bias increased slightly against them. So we see that that happens in the real world. In environments where there are very, very few women, they actually face more stereotyping. What was your reaction when you saw the results? Were you thrilled that you're produced such a,
Starting point is 00:32:15 a picture or did it affect you in any way? It was, you know, it wasn't particularly surprising because I thought, you know, I've been a woman in the workplace and I've been in environments where, you know, it feels like there's something not quite right because there are a lot of women at the entry level. But then when you look at the leadership, at least in my experience, in a lot of places, the leadership is really mostly men because those small biases really. affect, you know, things like who gets really good assignments, who gets the most visibility, who gets sort of that extra nudge from someone at the top to say, you know, I believe you can take
Starting point is 00:32:58 on this new responsibility that's like maybe a little bit beyond your skill at the moment. So, yeah, I mean, these small biases, I think, affect so much. I spoke in the previous episode of this inquiry to the chair of the scientific advisory board for the world's biggest and longest running attempt to measure people's unconscious biases. What we do is just collect more and more data on different types of implicit biases, not just on the basis of race, but on the basis of stereotypes about gender, nationality, religion. That's Calvin Lai, and he's talking about Project Implicit. It's been running for about 25 years, and its centerpiece is the Implicit Association Test, or IAT. It's a rudimentary video game that
Starting point is 00:33:44 anyone can play online, and the computer spits out a result, telling you whether you've shown an unconscious preference for white people, black people, old people, young people, and so on. The IAT kind of gave bias this moment in the spotlight when it came out. Everyone was very excited about it and felt very sort of nervous to take the IAT test and reveal their own hidden, you know, sort of machinations of their mind. I think what the IET does really well is it kind of gives you a broad snapshot of the associations and stereotypes that are present in a culture. So if you look at thousands, or I think the IET now has been taken by like 25 million people. So if you look at thousands or even millions of these tests, you see patterns being repeated.
Starting point is 00:34:38 Hi, I'm Jimmy Calentini. I am an assistant professor of social psychology. at the University of California, Riverside, and the USA. I'm the director of the Riverside Social and Spatial Cognition Lab, where my students and I study how people think about the people and groups around them. In other words, we study inner group biases. Jimmy Kalincini's team released a big study in January 2022. We mapped out visually how a couple of different intergroup biases vary across regions of the United States. And the maps color-coded. The blue areas are more egalitarian, and then the more orange regions are more biased.
Starting point is 00:35:27 And biased in this context means that the aggregate of people's test results in a region indicate a strong, unconscious preference for one group versus another. In intergroup bias theory, to nerd out for a minute, we usually think about uses and thems. And so the way I evaluate my national identity or my ethnic identity or my gender identity is usually in contrast to something else, men versus women, Americans versus Asians, gay versus straight. And so it's a relative judgment that we're assessing here. And so, yeah, racial attitudes that we've mapped are preferences for white versus black people. Sexuality attitudes are gay versus straight people.
Starting point is 00:36:12 When it comes to unconsciously preferring white or black people, US counties show up as more blue, meaning no real preference, along the west coast of the country. Regions in the middle and south of the country are generally more orange. But when it comes to preferring people based on their visible body weight, it's the central and southern counties that show up more blue, California, glows bright orange on that one. According to its IAT results, the region strongly prefers thin people.
Starting point is 00:36:41 You asked about the two ends of the continuum. So in the race map, it's preference for white people versus preference for black people. So you would think that maybe there's some parts of America that are pro-white and some parts of America that are pro-black. And that's generally not how it shakes out. Instead, our maps are constrained from pro-white to about egalitarian. So there's not the other end of the spectrum. And this persists across most of the domains. You know, there's pro-strait to egalitarian.
Starting point is 00:37:10 There aren't regions in our map that are significantly pro-gay. And we find this a lot that it's the high status group. It's the dominant group that are generally more favorably evaluated to more egalitarian, equality-minded attitudes. Even among people who are members of the lower status groups, oftentimes they, themselves show biases in favor of the high status group. And that makes a lot of sense. You know, you grow up in a culture where every action hero looks like this. And every politician and every leader and every, you know, awesome person in your culture looks like this or is a part of that group.
Starting point is 00:37:47 So even if you're not a part of that group, you grow up looking up to those folks. You know those are the folks with the power. Those are the beautiful people. And so the map also shows that. We don't see big pockets of America that are pro the lower status group. It's either the pro-high status group or neutral or something in between. Thanks so much for telling me about your research, Jimmy. It's been my pleasure. Thank you so much for having me today. If what's true in the States holds true in my region of Canada,
Starting point is 00:38:23 it might sound like Jimmy's research into measuring and aggregating biases has thrown up a useful indicator for me as an individual. At a regional level, when you lump together people's test results, it seems as though biases in favor of high status groups are worth worrying about, and biases in favor of low status groups are not worth worrying about. That insight could be very useful for choosing which biases one should pay most attention to. It might solve the problem of what to put on one's own bias list. Or that solution could be neat, plausible, and wrong, because noise. When we look only for the average error, we miss this very important source of error, which is noise. You could have a judicial system that on average sentences people to what you think is a reasonable sentence.
Starting point is 00:39:13 But if half of the people get one year and half of the people get seven years, when on average every should get four years in jail, you can't say that justice has been done. You don't want it to be a lottery. And noise essentially turns every human judgment into a bit of a lottery, because whenever there is human judgment, there is noise, and usually a lot more of it than we assume there is going to be. What is the distinction between noise and bias? It's actually fairly simple. If you think of a problem of measurement, it's going to be simpler than if we think of a problem of judgment to start with. suppose you have a scale in your bathroom and you happen to know that your scale is a bit generous.
Starting point is 00:39:57 It's always giving you a weight that is on average a pound less than you really wait. That's the bias of the scale, is the average error of the scale. Now, suppose that you step on the scale three times in quick succession, you are not going to have exactly the same reading. There is a random error in the reading of your scale.
Starting point is 00:40:16 That's the noise. Your scale probably has both bias and noise. And our judgment is just like your bathroom scale. Our mind is a measuring instrument when it makes judgment. It's got biases. It's got errors that are predictable, that are directional, that we all tend to make. But it also makes errors that are random and largely unpredictable, which are what we call noise. Do I have to worry about noise within my own decision-making process? You do. And I'm sorry to add to your neuroticism here in adding one more source of concern in addition to your biases,
Starting point is 00:40:55 you also have to worry about your noise. If you take the example of judges, for instance, you can be the lenient judge that we talked about or the severe judge that we talked about. Regardless, you are probably going to be tougher just before lunch than you are just after lunch. You are probably going to be tougher when you're in a bad mood because your football team has just lost the game on the weekend and it's Monday morning, then you would be on Friday afternoon. You are probably going to be tougher when it's a hot day out there than when it's a cool
Starting point is 00:41:26 and breezy day. Polarization. Groups tend to reach a conclusion that is more extreme than the average viewpoint of their members and to be more confident in it. Group think. In a group, I silence my doubts and side with the prevailing opinion instead of dissenting. I mean, I'm sure I do it from time to time, but I don't think that's generally a problem that I have. Yes, you probably have all those biases. Exposed after the fact, you can explain any failure by cherry picking from the very long list of biases that you know who are well, one that explains the mistake you've made. What good does that do you at the time you need to make the decision? Think of, again, a hiring process. What mistakes are you making in your hiring decisions? Are you biased in favor
Starting point is 00:42:18 of people who look like you in terms of gender and ethnicity and, and, you? you were giving the example of academic background. And so tomorrow someone comes in who happens to be like you, white and male and I suppose an English major. Should you say, oh, no, no, we shouldn't hire this person because it's only my biases who are speaking and telling me to hire this person. But maybe this person is actually great. What are you going to do? I don't see the practical learning that you can have from this when you're talking about a failure. small number of individual cases.
Starting point is 00:43:01 What decisions are you going to make differently? Now, contrast this with having a good decision process, where the candidates are going to be evaluated by a number of different people, not just you, but also people who have different views and preferences and biases. You have defined in advance what you're looking for in these candidates. If the English major turns out not to be the best candidate, you can pick someone else because someone else happens to be the best candidate, not because you've decided to fight your biases.
Starting point is 00:43:30 I think that's a lot easier and a lot healthier. It's actually good news. This has been fascinating. Thank you so much for your help and guidance and some advice that I may or may not take, depending on how I'm feeling on that day. Thank you so much, Tom. This was fun. If you and I are having a conversation right now,
Starting point is 00:43:54 there might be some element of our interaction that's being influenced by implicit associations about one another. But then there are so many. other things that are also influencing our behavior. We're influenced by what other people are doing around us. We're influenced by norms. We're influenced by even how fatigued we are. The more tired we are, the more we're likely to be influenced by stereotypes, how much cognitive control we have. One thing that's kind of interesting about the IAT is it, you know, it shows your instant reaction to a particular group of people of a particular social identity. But,
Starting point is 00:44:33 it's possible that what it shows is actually how much or little cognitive control you have over. So there might be someone who has a lot of biases about a particular group of people, but they might be really good at overriding those biases. So their IAT scores might not be that bad looking because they're really good at like overriding their associations. For that reason, the IAT is not necessarily a great predictor of an individual's behavior at a particular moment in time. But I still think it's really interesting and valuable. The value, and I guess therefore the interest of knowing about other people's unconscious biases measured at a group level, is that you can use that information to manipulate small populations of people in ways that serve your own goals, interests, and desires,
Starting point is 00:45:17 which are, of course, entirely honorable. There's some really interesting research about changing people's behavior. So one piece of research took place in France. Anti-Arab sentiment is a real problem in France. And what the researchers were interested in looking at was whether there might be a way to change people's perception of what's known as outgroup homogeneity. One of the factors that contributes to our bias is that we tend to think of our own group as really diverse and various. And we tend to think of the groups that we're not part of as being homogenous and monolithic. And you can imagine that if you think of a group as being really homogenous, it's easy to stereotype that group because, well, they're probably all pretty similar to one another.
Starting point is 00:46:06 And terrifying. And, yes. In this particular experiment, what the researchers did was they created a poster and they put faces of individuals of Arab background on the poster along with a description of the people. So it would have a person's name and then something like optimistic or stingy or friendly. Right. So they weren't just positive things. They could be this person's grouchy or something. Exactly.
Starting point is 00:46:36 And what they found was that when they put this poster up with these individuals and then descriptions that were positive and negative, people behaved differently toward an individual of Arab origin afterwards. So for instance, they did one iteration where they showed the. poster in physical therapy offices. And after being exposed to this poster, individuals and the patients in the physical therapy office would sit closer to a person of Arab origin in the waiting room. So they had another where they had a woman of Arab descent spill her bag. This was another partner of the researchers. A woman of Arab descent spilled her bag in front of subjects.
Starting point is 00:47:15 And of those who had seen this poster that depicted people of Arab origin as diverse, a lot more helped her pick up the belongings than those who had not seen this poster. So, you know, the researcher's interpretation is that if you start to imagine a group of people as really diverse from one another, it becomes much harder to stereotype. and as a result, harder to discriminate on the basis of stereotypes. This sounds like great propaganda for forcing people to read novels. I mean, obviously, meeting people is better. But if you can't, then... Yeah, you know, and one thing I think that's really interesting is we kind of think, well, maybe one of the ways to get over stereotyping is to show all the positive aspects of a group
Starting point is 00:48:03 that's maybe an outgroup from the mainstream or from the majority. And it's, yeah, not just reading novels, but any kind of. depiction of a group really needs to show the full complexity and variety of the people in that group. I have a project which nobody thinks is a good idea. I want to hear what it is now. Which is to make a manageable list. And notionally, I'm thinking like a list of 25, because I like 25 letter alphabets, of the biases I
Starting point is 00:48:35 really need to worry about. You're sort of creating a checklist for yourself, a biased checklist. Yeah, does that make sense to you? It's, I mean, it's kind of an interesting idea. I think it might not be as complicated as you think, because in a particular culture, there are, if you're talking about social identities, there are sort of like a handful of social identities that are the most salient, gender, race, age. Like, we tend to categorize those in microseconds, maybe milliseconds. So those, I think, are probably the most salient for everybody. then there might be, you know, an additional handful of other social identities that might also be important weight, disability, religion.
Starting point is 00:49:16 Do those, are those sort of on your list? Or am I totally off base? Maybe there are completely other biases that you're worried about that I don't even know about. I haven't written my list yet, but I mean, I think those will all be strong candidates for sure. I mean, I guess I'm curious, though, because you've thought about this for so long and gone into so many studies. Have any biases kind of come up to? the surface for you that you feel like are kind of unique to your, not necessarily unique to yourself, but like defining of yourself, ones you really struggle with and which deeply influenced the way you tend to think about people? I think I struggle with all of them, honestly. I mean, I think I went into this project thinking that I was a little bit less biased than other people. And I was like rapidly disabused of that notion. I struggle with gender bias, which is alarming to me. Like I struggle with making snares.
Starting point is 00:50:06 assumptions, snap judgments about women's competence. As in when I realized... You see them as less competent instinctively. I have noticed that there are times when I, that I make an assumption about a woman where I am sort of devaluing her expertise, for instance. And that is really disturbing as a woman. It's very disturbing to see how deep sexism
Starting point is 00:50:36 is in our culture that it influences me as a woman. I struggle with all of the other kinds of biases that are part of our culture. I mean, I think that we don't, I think it's very hard to opt out of one's culture's toxic inheritance. So I haven't created like a list or a, you know, sort of an ordered list yet, but I would say that I struggle with them. And I also realized that they were harming me. You know, it wasn't just a matter of my biases that I was projecting on other people was unfair to them or maybe impeding their ability to do something or to be seen in a certain way. I found that my biases toward other people were actually hurting me. They were making me less able to see reality.
Starting point is 00:51:35 They were like sort of separating me from other people. I think that that's one thing that maybe is less widely talked about in terms of bias. I think we think about it as overcoming our biases is something that we do kind of for the benefit of others. But I think what's maybe less recognized is the way that overcoming these biases can also help us. What meaning of freedom from bias do you say? as plausible and worth fighting for. Hmm. I think all people of all social identities
Starting point is 00:52:14 should be able to be seen and heard and evaluated based on who they are, who they actually are, not what the culture says about their group or what the headlines say about their group or what, you know, what are cognitive biases say about their group. And I think that's absolutely worth fighting for because without being able
Starting point is 00:52:40 to be seen truly for oneself, one can't really make one's way in the world. You know, I think all people of all social identities should be able to be free of the kinds of harmful biases that our culture unfortunately encourages. All right. Great. Thanks so much for talking to me, Jessica. You're welcome, Tom. It was really great talking to you. Jessica Nordell's book is called The End of Bias, a Beginner. Seems like a good moment to pause for a think. You were listening to The Bias List by Ideas producer Tom Howell. Thank you to Linda Besner and Mahid Mustafa for recording their bias confessions.
Starting point is 00:53:43 The Bias list is part two of a three-part series. theories on biases. Part one, called B is for bias, separates out the different meanings of the word from the oldest to the newest. Part three is called Beyond the Pale. It's the story of how people draw a line beyond which no reasonable discussion is possible. You can find those episodes on the Ideas podcast feed. Technical production Danielle Duval, web producer Lisa Ayuso, Senior producer Nicola Luxchich. Greg Kelly is the executive producer of ideas, and I'm Nala Ayyed. For more CBC podcasts, go to cBC.ca.ca slash podcasts.
