Ideas - How to harness your own biases
Episode Date: April 17, 2026
It's easy to admit to having biases, but much harder to pin down what they are, let alone figure out what to do about them. Nevertheless, IDEAS producer Tom Howell gives it his best shot. He looks into what the rewards might be if we could name and identify our own most important biases. This episode is part one of a three-part series exploring the meaning of 'bias.' It originally aired on Sept. 7, 2021.
Transcript
The Power & Politics podcast is available six times a week, but you might not be.
If you want to catch up on what happened this week in politics, join me, Laura Dangelow and some of Canada's most tuned-in political strategists to break down the week that was.
Short on time? The weekly wrap has you covered with a new episode every Saturday.
This is a CBC podcast.
I'm Nahlah Ayed. Welcome to Ideas.
Today, pinning down what we can do about bias.
Have you ever seen someone make a judgment before all the facts are in?
It may have been bias.
My brain contains the thumbprint of the culture in which I live,
and that's now in my head.
No, I'm saying most Canadians, in fact, are not racist.
I don't know that for a fact.
Well, I know that for a fact because...
The word bias is many centuries old, but what people mean by it changes.
How do I pick a bias?
And it's important that you understand these different approaches because they're biases.
And I have my bias, and I'll talk about that.
We see this often in social media comments when accusations of bias and fallacies are indiscriminately dished out.
Depending on who you're talking to, a bias might be a good thing, an essential part of an ancient game.
And the bowl will always curve towards the small insignia.
Or it's a quick way to identify yourself,
a way to glimpse your deepest characteristics and desires.
Today I'm doing a video about how I choose my biases.
So, onto my bias list.
So I stan a lot of groups now and I have a lot of biases.
Listing all of one's biases is a popular way for fans of K-pop music to interact online.
In that context, the word bias refers to a person's favorite member of a pop group.
Hi, today I am going to be sharing my bias list with you.
This is just a short list and does not cover all of our biases.
Enjoy and comment your bias list down below.
This new meaning of bias has now been added to the Oxford English Dictionary.
That's where you find the complete record of how the word has journeyed through our language, beginning in the 1500s.
Ideas producer Tom Howell takes inspiration from Oxford's latest update to their definition of bias.
He looks into what the rewards might be if we could name and identify our own most important biases.
In a messy world of contested facts and few objective measures, some experts dispute that such a goal could ever be achieved.
Seems to me like a bit of a fool's errand, to be honest.
Sorry.
This episode is called B is for Bias.
Martin?
Good. Good to meet you.
Are you shaking hands or I don't know?
I went in for that.
I guess what we normally do is we start by sizing you for bowls.
This is Martin Zeebauer.
We're at Cosburn Park Lawn Bowling Club.
Tonight is Toronto Rainbowlers.
Rainbowlers.
Toronto's LGBT Lawn Bowling Society.
Thanks for coming out, everybody.
Just a reminder, we have our social with Riverdale curling
coming up on the 22nd.
The draw is up.
Any questions?
Talk to the drawmaster.
Here is a whole lot of lawn...
Do you just call them balls or bowls?
These are bowls.
We never call them balls, and we'll get to that.
Martin's introducing me to what's very close
to the original meaning of bias.
your hands. So can I see your hands? Yeah, there's one of them. They're not huge. What do you mean?
So you're probably, probably a one. We'll start with a one and see how that works.
When you look at a lawn bowl up close, it's obvious why you don't call it a ball. It's roundish,
but it's not perfectly globular. No. So they're not spheres. They have kind of a flat,
a flat ring, and we call that the running surface. And they're also very slightly egg-shaped.
So one side is just a little bit thicker than the other, and that's what gives the bowls their bias.
I was going to joke that I was here to investigate allegations of bias in lawn bowling.
I didn't manage to get the joke out in time.
You can try it again if you want.
No, no, no, no.
Okay, so show me how it's done.
Yes, the bowls, as we mentioned, have the bias.
That's indicated by the two insignias that are on the side.
So there's one insignia that's big and one insignia that's little.
And the bowl will always curve towards the small insignia.
So some people use a memory aid.
They'll say little to the middle or the bowl will always fall to the small.
Okay, little to the middle, fall to the small.
Little to the middle or fall to the small.
I like it.
But everyone still gets it mixed up.
Even experienced bowlers will occasionally...
Fall to the middle, little to the...
Yeah, we'll occasionally do it wrong.
By orienting the insignia in the direction that you want the bowl to curve,
you can use that curve to sort of get around obstacles on the green.
Once we have a bunch of bowls in the game,
often you'll want to get around something that's in front of the jack.
The jack is the target.
Interestingly, like the word bias,
the word jack has an extraordinary number of different meanings in English.
There are 39 separate definitions for it in the Oxford English dictionary,
and that's not counting all the little nuanced sub-meanings.
Jack was once just a working-class man's name, and then it came to stand for the common man in general,
and soon it referred to all kinds of things that are small or insignificant.
But I digress.
So you can sort of make the bowl turn left or right?
Yeah.
So the bowl will all...
The ball and bowl, sorry.
Different bowls have different biases, so one set of bowls will have a really wide bias,
so it'll make a big sweeping curve.
Other bowls, my bowls aren't too wide, so it's...
they're a little bit of a narrower curve.
Sometimes the curve can be, we call it a hockey stick curve, so it'll go almost straight
and then kind of suddenly curve at the end.
And it's all to do with the shape of the bowl.
Wow, okay, so can you show me yourself manipulating this feature?
Okay, so to deliver the bowl, so first I'm going to stand on the mat.
Lawn bowling doesn't really lend itself to live-action commentary.
So then I'm just going to bend down a little.
You have to be there to get the full excitement.
And I'm just going to step forward and roll. And Martin rolls the bowl down the green grass
lawn on this hot summer day, the bowl on a trajectory that at first appears pretty straight.
Straight, straight, straight, straight, straight, oh no, it's turning, now it's turning, now it's turning,
and it's going. So you can see that it's coming back to the center. So in this case, I'm going to
make sure that my bias is pointed towards the center. Very CBC. I suppose it depends who you have in the game.
Lawn bowling jargon is a far reach from today's bitter feuds over truth-telling and
the dimensions of the Overton window, and yet one finds constant echoes of meaning between
this 500-year-old definition of bias and today's more common ones.
For instance, note that one of the game pieces is conspicuously unbiased.
So the jack is actually a sphere, so it doesn't have a bias.
This is objective jack.
Yeah.
Yeah, so it's a white ball. It's about the size of a billiard ball.
And that's our target.
So the unbiased ball in the center is always white. Okay?
You have some fairly old bowls. Older bowls tend to have the more pronounced curve,
so you'll probably see that it has a really wide draw.
The older they get, the more biased and fixed in their ways.
Well, it doesn't change over time. It's just sort of a more traditional bowl has a heavier bias, let's say.
Yeah, I'm sure we can do something with that.
The bowls we're playing with are a heavy plastic composite with a set-defined bias that's put there by the manufacturer and isn't supposed to change.
But once upon a time, lawn bowls were made of wood.
Because the bowls were made of wood, eventually old bowls would start to warp.
And if you were playing with the same bowls, you would get to know how it behaved, how it curved,
and you could use that to your advantage.
And so people would hang on to their old bowls because they knew exactly what they would do.
Well, that's nice. That's a little bit like how an old violin, you know, takes on its life experience and that sort of affects the music.
For sure. You know, especially experienced bowlers will have their own bowls, and you get to know how your bowls behave.
Because my set of bowls are going to be different than your set of bowls.
Wow, that just sounds like it's asking for a comparison to current events in some way.
I'm not sure I can look at that.
We'll add the segue in post.
because my set of bowls are going to be different than your set of bowls.
All these theorists about critical race theory and white supremacy and all the rest.
Eventually, old bowls would start to warp.
This new wokeness, this critical race theory, the imposition of anti-bias, the hyper, and I think affected,
sensitivity of business, university, and even the health community to the more fashionable virtue contest.
The blind spot is what you don't see in the mirrors.
That's why it's called the blind spot.
You don't pick a bias.
Your bias picks you.
So anyway, in lawn bowling, as in K-pop, a bias is a good thing.
But it's also essential that you can identify it
and learn its predictable characteristics.
That's true whether your bias is named something like narrow number two in lawn bowling,
or if we're talking K-pop, maybe...
But between the clarity of the ancient meaning of bias and the new supermodern one lies a battlefield
shrouded in mist and full of unmeasurable depths.
On to this battlefield, step the brave adventurers of cognitive science.
Okay, yeah.
Hi, everyone.
I'm Tali Sharot.
I'm a professor of cognitive neuroscience at University College London, and I study human behavior
and the mind and the brain.
Professor Sharot directs the London-based Affective Brain Lab.
Affective with an A comes from the word affect, which is just a fancy way to say emotion, really.
So we study decision-making and how emotion changes decision-making and how motivation changes decision-making.
We also study related concepts like how people form beliefs, how they process information, how they search for information,
and a lot of how motives bias all of these processes.
Tali Sharot first caught the world's attention 10 years ago with her book, The Optimism
Bias. Her TED Talk explaining the term has been watched 2.6 million times.
I'm going to talk to you about optimism, or more precisely, the optimism bias.
It's a cognitive illusion that we've been studying in my lab for the past few years,
and 80% of us have it.
It's our tendency to expect the future to be better than the past and the present,
to underestimate the likelihood of having negative events in our life, such as divorce or illness,
and overestimating the likelihood of positive events in our lives,
such as a successful professional career, having talented kids, and so on.
Tali Sharot didn't invent the term optimism bias.
It's been around for 40 years, but her studies and books have raised it up to join the likes of
confirmation bias.
That's to say the famous names of biases that escape cognitive science and enter our popular
vocabulary. Confirmation bias is our tendency to believe information that confirms what we already
believe. So if I believe in climate change and I read an article suggesting climate change is
happening and scientists believe it's true, then I become more confident that climate change is
true. I'm less likely to be influenced by this article if I come in from a skeptical point of view.
Now, the relationship between confirmation bias and optimism bias is the fact that many times,
if we believe that, you know, the likelihood of something positive happening is higher than negative,
then we will be more likely to take in information that confirms that positive view.
So this is very useful if you're trying to change somebody's mind, right?
Yeah. So anytime your current belief is an optimistic one,
then confirmation bias and optimism bias kind of converge to some extent. What we found, right,
one of the main mechanisms that we found
of how you generate this optimism bias is the fact that you're more likely to believe information
that's positive over information that's negative about yourself, and specifically your future self.
So if I tell you, you know, there's going to be millions of people listening to this radio show
and they're going to love it and maybe you thought, oh, maybe it'll be only 100,000 and you hear that
I'm saying, millions and say, oh, yeah, well, maybe there'll be more than I thought and you
update your estimate and you think it's going to be a great show. If I tell you, you know,
I don't think anyone's going to listen to this. You will say, well, she doesn't know what she's
talking about, right? So you're more likely to take my opinion or any kind of information if it supports
a desirable position versus an undesirable. So that is a mechanism that generates an optimism bias.
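A rough way to picture the mechanism Tali Sharot describes is as an update rule that weights good news more heavily than bad news. The sketch below is an illustration only: the events echo her examples, but the update weights and numbers are invented, not figures from her lab.

def update_estimate(current, evidence, good_weight=0.8, bad_weight=0.2):
    """Move a personal risk estimate toward new evidence.

    "Good news" means the evidence says the risk is lower than currently feared.
    The two weights are illustrative placeholders, not values measured in
    Sharot's lab; the asymmetry (good_weight > bad_weight) is the point.
    """
    weight = good_weight if evidence < current else bad_weight
    return current + weight * (evidence - current)

# I fear a 40% chance of divorce; the statistic for people like me is 25% (good news).
print(round(update_estimate(0.40, 0.25), 2))  # about 0.28: a big move toward the good news

# I assume a 10% chance of serious illness; the statistic is 30% (bad news).
print(round(update_estimate(0.10, 0.30), 2))  # about 0.14: it barely budges

Run that asymmetry over enough judgments about your own future and, on average, the estimates drift toward the rosy side, which is the optimism bias in miniature.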
So you could be in a situation where these two are at war with each other, where perhaps
your confirmation bias would make you willing to believe something you already think, but a
desirability bias might lead you to change your mind. Exactly. So these are interesting examples and
there's been research on it. So for example, one study was conducted just before the 2016 presidential
election where people who were supporting either Trump or Hillary were asked about what they
thought the probability was that Trump and Hillary will win. So despite the fact that people were
supporting Trump back in August 2016, even the supporters didn't think that Trump was going to win, right?
So their prior belief was Trump is probably not going to win, but they really wanted Trump to win.
So then these individuals were given information suggesting that Trump will win. They were shown a poll that said Trump would win.
So this information doesn't confirm what they believe, but it's what they want to believe.
And they were much more likely to change their predictions when they learned that the poll showed that Trump would win,
but not when it showed he would not win. Okay, so that's an example where confirmation bias came head to head
with desirability bias or optimism bias, and the desirability bias won.
Desirability bias is an umbrella term that includes the optimism bias. It refers to any
bias towards something you want or consider positive. Professor Sharot's lab has shown
how a desirability bias may also fail to beat a confirmation bias.
We've done a study where we showed that information related to climate
change is more likely to be taken into consideration if it confirms what you believe;
it's not as important whether it's good or bad.
Oh, really?
So if I'm a climate change believer and you tell me that the scientists are saying things
are much worse than you originally thought, right?
It's going to get worse than even, you know, you're thinking.
I update my belief towards this, you know, dire extreme.
But when I give you information telling you that scientists are saying it's not as bad,
it's going to be, you know, much better than we thought,
new data shows, then I don't change my belief as much.
So here, a confirmation bias actually outperformed the desirability bias.
Tali speculates that the more likely a piece of information is to affect someone personally,
the greater the pulling power of the desirability bias versus the confirmation bias.
But she also points out that these particular biases are just two among myriad pushes and pulls,
countless, possibly immeasurable variables, spurring our decisions and reactions and sudden flights of fancy.
Her lab is just beginning to uncover how a few of these processes might interact.
We know from our own research that if you induce stress in someone, if you get people to be stressed,
the mechanism that generates optimism bias goes away.
So the mechanism that I told you about, that you learn more from positive than negative, if I stress you out, that doesn't happen anymore.
This sounds like a really fun experiment.
How do you stress people out?
So we did two things.
The first study that we did is we brought undergraduates into our lab,
and we told them that they're going to have to give a talk in front of everyone else
on a surprise topic that we gave them for five minutes.
They're going to be rated by everyone else.
We're going to videotape them and put it on YouTube.
And so they got quite stressed.
They told us that they were stressed.
We also checked their cortisol level in their saliva.
So when you're stressed, cortisol goes up.
We looked at their skin conductance to make sure that went up.
Again, when you're stressed, you start sweating.
Skin conductance goes up.
And then we did the experiments that we usually do.
So we have this task that we give people where they tell us about the predictions, for example,
how likely they are to get divorced, to get Alzheimer's, to be in an accident, and so forth.
And we give them information about the average likelihood for someone like them, their age and gender living in their city,
to have this event happening to them.
And we often see that if people are not stressed, they learn more from unexpected positive information,
oh, you're less likely to get cancer than you thought, than from negative,
you're more likely.
But under stress, it disappears.
So under stress, they start learning much better from negative information than they did before.
So now, if you're very stressed, say I get you very stressed.
And I tell you, no, no one's going to listen to your radio show.
You'll be like, oh, no, no one's going to listen to my radio show.
So you change your beliefs in the negative direction much easier, much more easily than if you're not stressed.
This is all, I mean, it's just fascinating intellectually to sort of think about these things.
But of course, part of why you do this work is for practical applications as well.
And is there any way as an individual that I can in any way assess, like, am I more influenced
by my optimism bias right now?
Or is my stress, you know, killing that optimism bias and therefore I need to be more alert
to my stress bias?
Like, what am I supposed to do with that information?
I guess is my question.
I think the question is, what is your goal, right?
So is your goal to make the best financial decision, for example?
Is your goal to decide whether you need to, you know, get a second opinion from a doctor?
Like, based on that goal, the path to action would be very different.
So I can tell you, let's say, I mean, I can measure it and I'll tell you, okay, now you're under optimism bias or not.
But so what?
What are you going to do?
I mean, right?
That alone, I don't see how that will impact you.
I think the question that people want to ask is, when is this good for me?
when is this bad for me, right? And in the cases when it's bad for me, what can I do about it?
So having an optimism bias is not something that you necessarily want to change. People who are
optimistic tend to be happier. They tend to be more motivated. I think a lot of times the question is,
like, you know, how do we change people's mind about something about climate change, about vaccines?
How do we get them to get vaccinated to, I don't know, recycle, whatever it is? And that's when
knowing what is forming their beliefs is helpful. Sure. Yeah. No, I mean, I can see that it's extremely
valuable, you know, if I were a propagandist or if I were trying to manipulate other populations of people.
But I am still interested in how to use this in a personal self-improvement kind of way. I mean,
do you personally, have you found that your expertise is something that you are able to apply to
while you're in the middle of making a decision? There are kind of two ways to answer this.
So first of all is that most of the information is most valuable for, you know, just a single person in terms of how they interact with others and how they help others make better decisions.
So I do this with my kids a lot.
I use this information to say things in a way that I think is most effective.
So let's take the optimism bias.
So let's say, you know, it's raining out and I want them to wear a coat.
you can say, well, if you don't wear a coat, you're going to get sick.
This is highlighting the negative, and they think, oh, I'll be fine, right?
Or you can say you wear a coat, you'll be nice, warm, and cozy, and, you know, you'll be in great shape to go through the birthday party next week.
So you're kind of highlighting the positive things in the future, rather than highlighting the negative,
because we know that the brain is more likely to take in the positive, unless, of course, they're stressed, right?
I imagine being an international neuroscientist can be a somewhat, or, sorry, internationally renowned neuroscientist
can be occasionally stressful.
Do you ever find it useful to apply to yourself in those contexts when you notice that you're stressed?
You mean in my decisions?
Mm-hmm.
You know, the way that I deal with stress is I go out for a run and then I make a decision.
Right.
I have to say that this bias, again, even the bias under stress is not necessarily a bad thing, right?
Okay.
So if you're in distress, there's probably something, you know, bad, threatening going on.
This kind of mechanism is there because it could be adaptive,
right? If you're in a threatening environment, there's lions everywhere, right? You should be focused on
the negative. You should be, like, overcautious. And so, you know, COVID is
happening and people are very stressed, and they're maybe overreacting, but, you know,
it could potentially save your life. And in terms of the decisions, I think the best thing
is to think about, given this optimism bias, you know, what are the potential negative consequences for me.
So, for example, for me, it is that I don't put a helmet on when I bike, right?
So I identified a possible negative outcome.
Well, the negative outcome is just that something will happen and I won't have a helmet.
And then I was able to do this because I understand that, you know, I'm over-optimistic and so on.
But the solution is then to find ways to correct for my behavior.
So in my case, I tell myself, anytime I bike to the office and I'm not wearing a helmet,
I then need to, for example, put money into a charity that I dislike.
or any time that I am wearing a helmet,
I get like a chocolate treat.
You know, I get to go to the nearby cafe
and get a little chocolate treat, right?
So you're using other mechanisms to correct your behavior.
So you identify, you're saying, what the optimism bias is causing me to...
Or let's take another very personal everyday kind of decision or expectation.
So we tend to underestimate how long things will take us.
That's also due to optimism bias, right?
How long will it take me to get where I need to go, right?
Or how long will it take me to finish this?
project and so on. So given that you know this, then you think, okay, I know that this is the case,
I'm going to correct for it. Right. So some people, I don't know, change the clock in their home
to five minutes ahead, or set an alarm way early, or you, for example, believe that you
will finish the project in two weeks, but then you always tell yourself,
anything that I actually think, I'm going to add two days, right, or three days or whatever. And so
you tell your boss, you know, the project will be done in two and a half weeks.
Right, right. One thing I really take away from our conversation is just the diversity of biases that we have in us and the fact that they're always presumably interplaying and so on. Has it been a big part of your study to look at how the biases create interference with each other?
It hasn't, in fact. No, not at all. You know, the way that we usually do things is we try to isolate everything, right? We go the opposite direction. Like, I try to, like, control everything and just change one little thing to look at, like,
one little kind of type of behavior while controlling for everything else.
And so we could do this nicely in the lab, right?
And of course, once you get to the real world, things become more complicated.
Right.
If you must get to the real world.
I suppose we must.
This is really fascinating.
Thank you so much for taking some time.
Thank you.
My pleasure.
Tali Sharot is professor of cognitive neuroscience at University College London,
and she's the author of The Optimism Bias, as well as The Influential Mind.
You're listening to B is for Bias on Ideas, on CBC Radio One in Canada, across North America on SiriusXM, in Australia on ABC Radio National, and around the world at cbc.ca slash ideas.
You can also hear ideas on the CBC Listen app or wherever you get your podcasts.
I'm Nahlah Ayed.
On Big Lives, we take a single cultural icon.
People like Jane Fonda, George Michael, Little Richard.
And we pull apart the story behind the image.
And we do this by digging through the BBC's vast archives.
Discovering forgotten interviews that change exactly how we see these giants of our culture.
We're here for the messy, the brilliant, the human version of our heroes.
I'm Immanuel Jochi.
I'm Kai Right.
And this is Big Lives.
Listen to Big Lives, wherever you get your podcasts.
The word bias came.
into the English language around 500 years ago.
Back then, it was a French term for the diagonal direction across a piece of cloth,
a line at an angle to the warp and weft.
That's still what it means to tailors and fashion designers today.
Most likely, the word has its roots in geometry,
the relationship between a diagonal and the vertical and horizontal axes.
Long ago, the word bias was perhaps pronounced more like bi-axis, referring to those two axes.
In the five centuries since English absorbed this word, it's become a much more troublesome term.
In politics, psychology, in philosophy, or in finance, a bias is usually a bad thing, a distortion of truth, a recipe for mistakes.
A huge amount of research goes into identifying and understanding
the mechanisms of bias.
But as ideas producer Tom Howell is learning,
that doesn't necessarily give the individual much instruction
on what to do when contemplating the bundle of biases at work
behind every thought and belief.
Originally in school, I think we covered about 10 or 12 biases, big biases.
And I kept thinking there's got to be more than this.
It's got to be more.
So I started digging.
And I think I came up with about 270
of them. More than you could ever possibly use, more than you could ever want.
This is Harvey Norris. He's a licensed clinical social worker in Louisiana. He's also the
creator of a handy booklet of 194 flashcards. I'll pick out a couple of examples here.
On the front, this one says belief bias: an effect where someone's evaluation of the logical
strength of an argument is based on the believability of the conclusion. And here's another one,
contrast effect.
The enhancement or diminishing of a weight or other measurement when compared to recently observed
contrasting objects.
Hyperbolic discounting, hindsight bias.
Here's a good one.
The hostile media effect.
That's, quote, the tendency to see a media report as being biased due to one's own strong
partisan views.
I think knowing and being able to label something gives you the power to recognize it.
you know, if you can label it, if you can understand it.
Some people look up in the sky and see clouds.
Other people see cumulus, stratus, nimbus, whatever.
I would say those people who can name what they see are more able to understand it.
Is that true or is that itself a comforting thought that doesn't always pan out?
Because I know that I am probably vulnerable to a certain amount of pessimism
and catastrophizing.
But I have also read, persuasively, that I am likely to be vulnerable to a certain amount
of optimism bias and myside bias, things that put a positive spin.
And I'm not going to pretend I think I'm invulnerable to any of these biases.
I'm sure I'm not.
But what I can't always work out is which is the one I need to be more worried about?
And how do I possibly adjust for these kind of unknown amounts of bias I may be under the influence of?
The only time I believe that a bias is a problem is when it makes you uncomfortable or makes your life
difficult. If the use of a bias causes problems in your life, if you realize that your bias,
whatever the bias is or whatever the biases are, are causing you to lose friends, to not want to be
around people, to be uncomfortable, then you need to take a look at them.
Remember, these things are meant to make you more comfortable.
there's nothing wrong with being more comfortable.
We don't have to naturally be uncomfortable,
but if the bias begins to distort the world
to the point where you are not comfortable with the outcome,
then you take a look at the bias.
Do you think that you see bias this way
because your job is to deal with clients
who are suffering in some way
and to help them think in more useful, productive,
healthier ways and that therefore this influences your view of a bias. Whereas if I were to go to
someone, a journalist who is concerned about media bias and say a bias is only a problem for you
if it makes you uncomfortable, you know, that's going to sound wrong in that context.
I can see that, yes. Can you bring the two together at all? Or are we talking about totally
different kinds of biases and totally different kinds of problems? I don't know if you can bring them
together. I think that you're very right. I deal with biases
and cognitive distortions in a very specific framework, a very specific concept.
I don't know if it would go over to journalism or any other place. I am really interested
in what bias makes you feel uncomfortable or not good so we can get rid of it.
Like everybody else in the world, okay, like it or not, I am trapped by my training.
You know, my training has put me in a very specific mindset.
I'm trapped by it.
So I don't know.
I don't know.
I think I probably look at it because it's the way I use them, yes.
Do you ever, when you're watching the news or listening to people in the media,
do you ever get tempted to hold a flash card up with them and be like,
oh, that's a bit of fundamental attribution error that you just did there?
I actually do that, unfortunately.
It's really sad.
I don't do social media because I don't like...
Social media allows for groups of biases to lump together.
So the only social media I do is LinkedIn.
You know, and I have been known to respond to certain things that come up on LinkedIn with,
oh, this is a so-and-so bias.
That's a confirmation bias.
or I think a while back I did one, someone said something and I popped up and put
specious argument alert.
And did that help the situation?
No, not really.
It never does.
You know, it's kind of like telling someone who's mad to sit down and be quiet, you know,
or stop being mad, stop being angry.
It never helps.
But, you know, occasionally, I am human after all.
Harvey's flashcard book offers a different approach to bias compared with Tali Sharot's insights earlier.
She helps us zoom in and examine a small but powerful mechanism that may or may not be governing our minds at any time.
Harvey's flashcards offer a more bird's eye view, which is very satisfying if you're the type of person who likes to see the world arranged into alphabetized lists of items with simple one-sentence definitions next to them.
Maybe we'd call that a dictionary bias or a lexicographer's bias.
Still, flipping through 194 possible biases, each of which could be the secret cause of a wild error,
in any belief one is holding, this doesn't necessarily create a greater sense of control
in terms of knowing what to do about them.
I'm trying to put my finger on what I think we're accomplishing, or you're accomplishing,
with that, given what we've just said.
Well, at the time, I accomplished a feeling of, let me categorize as many biases as I can
to feel good about myself and maybe learn a little bit more.
But this was done in 2012, so I'm not really sure it's helped me now, other than
it makes me very aware of biases in my own life as far as, you know, what I do for a living.
Harvey confesses to having a particular weakness for what he calls the information bias.
And that information bias is the tendency to seek out information even when it can't alter or affect what's happening.
So I'll come to a conclusion and then I'll say, well, let me still gather some more information.
and then I come to the same conclusion, and then I go, hey, let me gather some more information.
And then I still come to the same conclusion.
And I'm like, well, you know, I haven't quite exhausted the social service research on this.
So let me go gather some more information.
Why is that a bias?
The bias is to move yourself away from the belief that you've already achieved what you're trying to achieve.
You figured it out.
But I don't feel comfortable with myself for figuring it out.
So I've got to make myself feel comfortable.
So now I'm going to go find more.
And then I figured it out.
But I don't feel comfortable figuring it out because obviously I can't be right.
It's kind of a negative reinforcing piece, I would guess.
You know, putting yourself down.
And I think it's something that happens to a lot of people, to be honest.
But I know that I'm always overlooking things, you know.
Is that because you did something extremely wrong for lack of information in your early life
and now you're trying to compensate for it?
I don't know if there's one particular thing.
There's probably hundreds.
I remember my first part of my career.
I was a child abuse investigator, and I was pretty convinced I knew everything,
and only to come find out later on that maybe I didn't really know anything.
And yeah, I made some decisions during that time of my life,
which definitely affected people pretty negatively,
and which are uncomfortable to look back upon.
I would have done things differently.
So yes.
Are you able to share a little more detail of what you mean?
I know you wouldn't name anyone or anything,
but was there a particular thing that sticks in your craw?
Well, there were times as a child abuse investigator
when I gathered information and I made a decision based upon the information.
And you were having to gather information quickly.
Well, let me give you a specific one.
Okay, if you want a specific one,
I had a call about
a woman with a young child, and they were saying that she was developmentally disabled and wasn't
able to take care of the baby. It took me about two and a half hours to find her. I located her.
I evaluated the situation. She was about 19. She had a beautiful little four-month-old baby,
and she was, I spent about an hour with her, and she was quite competent at
taking care of the baby, quite competent. Okay. And I have a bias that mothers should be allowed to
take care of their children. That's kind of one of the biases I have. So I made a decision to leave the
child with her and go on about my way. Okay. About three hours later, her boyfriend came over.
She decided to take a shower.
While she was in the shower, the baby started crying,
and the boyfriend, who didn't know how to deal with the baby, shook the baby, causing brain damage.
Had I taken the baby, the baby would not have had brain damage.
But I didn't take the baby because with the information I had at the time,
there was no reason to take the baby.
But if I had more information, like knowing the husband, the boyfriend would come over and do that,
I would have taken the baby.
So my bias, mothers need to raise their babies,
overcame someone else's bias of babies need to be safe regardless.
Maybe someone who had that bias would have made a different decision.
And there's really no, except for hindsight,
there's really no book of the probabilities and the risks that you can rely on.
There's no yardstick exactly in that situation.
There is no yardstick.
But it is something you remember for your entire career.
Yes.
It's like one of the things I do now is I do a lot of suicide risk assessments.
You got to gather a ton of information to make a decision.
And you make a decision.
Sometimes your decision is right and sometimes it's wrong.
And what happens if you hospitalize someone too early,
and the next time they have a suicidal crisis, they don't come to you?
So they don't get help.
And I have a definite bias in that case.
I am biased towards not hospitalizing if at all possible.
Does that mean I've taken some risks?
Absolutely.
But I feel like the risks have outweighed the losses, so to speak.
You might ask how many losses have you had?
Well, I don't know.
I mean, after I deal with you, how would I know?
You know, I mean, it's not like three years later if you were to, you know, suicide.
Someone's going to contact me and go, oh, by the way.
Well, those are truly difficult decisions.
And I wonder, you know, the stakes there are much higher than really any decision I find myself making.
But do you ultimately find yourself going with your gut?
Or do you find yourself turning to a list
of analyzed possible biases or something like that and trying to work through it more analytically?
I think it's kind of a mix of both with another one added in. I have friends who are therapists
and I think very important decisions need to be made by group interaction. For instance,
let's go back to the little girl that was shaken, the baby that was shaken. Had I not been
working alone, which is what the custom was, had I had a partner, or, you know,
someone I could have called on the phone at the time and said, hey, let me tell you this situation.
What do you think?
It might have been a whole different outcome.
But, you know, it's 7.30 at night.
I'm alone.
I got to make the decision.
Sure.
You never know when asking your friend is necessarily going to steer you towards the truth.
You could ask your friend to get steered away from the truth.
That is true.
You've just introduced a new set of biases now.
I mean, maybe you can hope that it averages out somehow.
And I don't think anybody takes a look at it analytically when you're in the middle of it.
But yes, you're right. You could add a whole new set of biases. You could have been looking to
hospitalize and your friend says don't hospitalize and then they hurt themselves and then you're in
trouble. I think the world is just messy.
The world being just messy has long given human beings some serious headaches. We want it
to be messy, probably, or at least we may need it to be up to a point. But if we found
this a satisfying state of affairs, we would never have made all this wonderful progress.
Decades upon decades of scholarship have gone into tidying up the messy business of how people
perceive, make decisions, and build beliefs. That's why every psychology student attending
university must learn a list of important biases to watch out for. Harvey Norris remembers there
used to be about 10.
Catastrophizing, your attentional bias, your availability heuristic bias.
Just the basic stuff people do all the time.
Right.
In terms of being able to identify your biases, is it better to, I mean, it seems that
194 is a lot to hold in one's mind?
I would go back to the original 10.
Yeah.
Okay.
I would go back to the original 10 or so: understanding the perfection bias,
understanding the catastrophizing, where, you know, if something
starts to go bad, you just assume it's as bad as possible. And then you respond as if it's as bad as
possible rather than being able to make a determination as to where it fits on a scale of one to
10. It's either one or it's 10. The world is messy. And I think that just by categorizing and
labeling, you can't make it not as messy, but you can give someone else a chance to gather the
information quickly. And that was the reason I did it in a flash card book. I wanted people to be
able to look it up really quick.
All right, well, Harvey, this has been a pleasure. Thank you so much.
Tom, thank you very much for having me. I really enjoyed it.
Harvey Norris is a licensed clinical social worker and the author of Cognitive Distortions,
Bias and Fallacies, 194 Flashcards.
Returning to the simple and straightforward world of England in the 1500s, back when the
word bias hopped from its home in the tailoring and dressmaking trade,
into the bare-knuckle world of competitive lawn bowling,
it was a relatively short hop.
The extension in meaning is easily understood.
People wanted a word that meant
that bowl never travels in a straight line,
and so they grabbed a familiar word
that already meant roughly a line at a weird angle.
Now, between those early crystal clear meanings of bias...
I'm going to make sure that my bias is pointed towards the center.
And the newest K-pop meaning, very different,
but equally specific and concrete.
In between those two steps, the word's meaning became vastly more abstract, mobile, and even logically incoherent.
Here's an Australian children's television program introducing young learners to the breadth of the concept.
Bias is a preference or prejudice for or against a person, a group, an idea, or anything really.
This is where biases become more than universal mechanisms or logical fallacies. They become personal.
And most often they're also dangerous and bad.
Hi, I'm Calvin Lai.
And I'm an assistant professor of psychological and brain sciences at Washington University in St. Louis
and chair of the Scientific Advisory Board at Project Implicit.
Project Implicit is arguably the world's best-known attempt to categorize and label our biases.
The website officially started in 1998, making it potentially the first or one of the very first online websites devoted to social science research.
Some of the kind of ongoing research has studied things such as understanding how implicit biases vary depending on where you live.
Some places tend to hold higher levels of implicit biases on average.
Some places hold less.
And also understanding what we can actually do about
implicit bias, how to prevent their impact on our behavior when it's not desirable to kind of hold them.
Underlying this research, a controversial and almost fun game known as the Implicit Association Test,
or IAT. In the Implicit Association Test, people play a kind of rudimentary video game of sorts.
And in this video game, they're just told to sort words and images to one of two sides of the
screen. You might have white people and good words on one side of the screen and black people and
bad words on the other side of the screen. And so words and images representing these categories are going
to pop up. And your job as a participant is just to sort them as quickly as you can. And then after a while,
the categories now flip. Now white people are associated with bad, black people are associated with good.
And again, your job is just simply to sort as quickly as you can. And of critical interest for
us social scientists is to try to understand what the nature of your biases are by comparing how
quickly you respond in one of those situations compared to the other situation. And the speed at which
you respond in one situation versus the other tells us something about how closely these ideas
are associated together in memory. If you've never taken an IAT yourself, you can go do it now.
Search Project Implicit or follow the link at cbc.ca slash ideas. They offer demo tests purporting
to measure your implicit associations
concerning race, age,
country, sexual orientation,
body weight, U.S. presidents,
and more.
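In rough terms, the comparison Calvin Lai describes comes down to how much faster you sort under one pairing of categories than under the flipped pairing. Here is a minimal sketch of that comparison, with made-up reaction times; it is not Project Implicit's actual scoring procedure, which involves additional steps such as error penalties and trimming.

from statistics import mean, stdev

def simple_iat_score(rt_pairing_a, rt_pairing_b):
    """Standardized gap between mean reaction times under the two pairings.

    A positive score means pairing A was sorted faster, i.e. those categories
    were easier to group together. Illustration only, not Project Implicit's
    published algorithm.
    """
    gap = mean(rt_pairing_b) - mean(rt_pairing_a)   # slower condition minus faster, in seconds
    pooled_sd = stdev(rt_pairing_a + rt_pairing_b)  # spread of all responses together
    return gap / pooled_sd

# Made-up reaction times (in seconds) from one short, noisy session:
pairing_a = [0.61, 0.58, 0.64, 0.59, 0.62, 0.60]  # e.g. "flowers + good" sharing a response key
pairing_b = [0.72, 0.69, 0.75, 0.70, 0.66, 0.74]  # the flipped pairing
print(round(simple_iat_score(pairing_a, pairing_b), 2))  # a clearly positive score

As Calvin Lai explains next, a single short session like this is noisy, so one score says little on its own.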
Yeah, in terms of the kind of common version
that people take, the five-minute version,
it tells you a little bit about yourself,
but I want to emphasize,
it's not the end-all-be-all.
The five-minute version is similar
to kind of like a dinkier home-administered
version of a blood pressure test,
where it's not super-duper-reliable.
So it will show
what your implicit biases are
according to the test that you just took,
but if you took it again the next
day or a couple days from now, you might
get a somewhat different score. Similar to
how, you know, if you were to go into
the doctor's office and your blood pressure is high,
on one administration, it's not the end-all-be-all.
Maybe you ate something funny that day
or you ran up a flight of stairs,
right? And in a similar fashion,
these implicit biases seem to be
fairly noisy when measured, and so it requires
a bunch of measurements to kind
of get a reliable sense of what a person's overall implicit biases are.
I did read, in one text to do with your work,
I think it was actually someone else who said,
no one wants to admit to being biased.
And I thought, well, that's not my experience.
I find that people in my circle are very willing to admit to being biased in general
and having a bunch of biases.
What people aren't able to do is identify which biases are in control of them
and then to know what to do about it.
I think so I feel like there's a way in which you're right and there's a way in which that's not quite right.
So it is the case that people might be generally accepting of the existence of bias that maybe you're vulnerable to it, right?
But that if you really pin them on it, they might still feel like they are a little bit better than average.
I may be biased, but that everyone else is much worse off than me, right?
Or they might be biased, but then if you ask them to come up with examples, they might be
like, well, I actually can't think of a past time I've been racist, right?
So there's still these kind of defensive mechanisms that we have.
And there are also these types of cognitive errors that we make.
Right.
So one of the things we find in research on the bias blind spot,
which speaks to this phenomenon of us having blinders to ourselves,
when we think of whether or not other people are biased,
we just look at the outcomes of their actions.
But when we ourselves are wondering about whether we ourselves are biased,
we'll probe through our mind to see if there's any kind of alarm bells telling us that we're biased.
And there almost never is, right?
There is no mechanism in our brain that tells us when our thought comes from a biased source.
We just have our thoughts.
They come into our mind.
We don't know where they come from.
From my point of view as a social scientist, really motivated by getting people to, you know, be more honest with themselves and to make more rational decisions,
it's a great success when people catch themselves in this way, understanding they do have bias
and they are motivated to do something about it.
Some of the research that has been going on has been trying to figure out how to then close that gap, right?
A lot of people care about these issues, but they don't know what's the best thing that I can do.
Or they might kind of let their foot off the gas pedal a little bit when it comes to actually taking action, right?
you can see that oftentimes when people are generally pretty environmentalist, but maybe they don't end up recycling, they don't end up doing all these kind of sustainable things.
And you can see it in the context of things like race relations where they might generally be supportive of things like Black Lives Matter.
But when you look at what that amounts to, that amounts to liking an Instagram post or posting something on Twitter as opposed to taking some type of collective political action.
Do you see it as a reasonable project for an individual to attempt to name, catalog, measure one's own biases?
I think that it is definitely useful to be reflective of the fact that we are all vulnerable to bias at any given moment in time.
And most of all, the easiest person to trick is oneself.
So while I wouldn't necessarily recommend, like, trying to quantify in some way how you rank compared to everyone else,
I think it's useful, particularly when you're making the types of important decisions in life,
who to hire, who to date, who to be friends with,
that you reflect on whether or not the decisions that you're making may be swayed by these factors that you really don't value
and that go against what you think are the kind of higher values
that you hold in life.
So I see it less as a thing of, like, that you should quantify exactly how biased you are
and more of like incorporating that in many of the decisions that you're making every single day
about who you're, you know, helping at work, what type of media you consume,
how you're speaking to other people in your life to make sure that you're the type of,
you know, careful and considerate person that lives up to the values that most of us hold.
All right, Calvin, this has been a wonderful conversation.
Thank you so much for telling me all about implicit bias.
Thanks for having me.
It was a pleasure chatting.
Calvin Lai is the chair of the scientific advisory board at Project Implicit.
Pushing aside all the caveats for a moment, you can use the IAT to get a personalized bias list and compare it with your friends, much like the K-pop fans are doing.
How do I pick a bias?
We've become one big family who love and cherish our idols and biases.
Make sure to leave your bias list in the comments so I can see how we compare because I think that that would be an interesting thing to look at.
I think the question is, what is your goal?
Well, let's say the goal is to know what to do next time someone else helpfully points out what they see as one's biases.
It might be handy to somehow have turned the mass of terminology and wisdom around different biases
into a short, manageable, ideally alphabetized list of the biases you personally need to worry about.
the most, and to have decided which of them you need to do something about and which ones you can
just nurture and indulge. The next episode in this inquiry grapples with how to make and use
a bias list of that type. I'm just setting out on doing this. Do you have any overall cautions or
suggestions? Yes, I have a big word of caution for you. It doesn't work. Sorry.
One way or another, we will eventually deliver the bowl to the jack.
You were listening to B is for Bias by Ideas producer Tom Howell.
You can go to our website, cbc.ca slash ideas, for information on the guests,
for resources aimed at helping you identify your own biases, including Project Implicit,
and you can weigh in on the best way to construct a useful personalized list of biases
to watch for. Tom Howell asks that you do this, quote, politely if possible.
Technical production, Danielle Duval. Web producer, Lisa Ayuso. Senior producer, Nicola Luksic.
Greg Kelly is the executive producer of Ideas, and I'm Nahlah Ayed.
For more CBC podcasts, go to cbc.ca slash podcasts.
