Stuff You Should Know - How Cognitive Biases Work
Episode Date: February 10, 2026
Humans have all sorts of weird quirks that cause us to do silly things and make bad decisions. It's not our fault though. Our brains are wired that way. Learn about the psychology of cognitive biases in this episode.
See omnystudio.com/listener for privacy information.
Transcript
This is an iHeart podcast.
Guaranteed Human.
1969, Malcolm and Martin are gone.
America is in crisis.
At Morehouse College, the students make their move.
These students, including a young Samuel L. Jackson,
locked up the members of the Board of Trustees,
including Martin Luther King Sr.
It's the true story of protests and rebellion in black American history
that you'll never forget.
I'm Hans Charles.
I'm Manilic Lamouba.
Listen to The A Building on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.
Welcome to Stuff You Should Know, a production of IHeart Radio.
Hey, and welcome to the podcast. I'm Josh, and there's Chuck and Jerry's here too,
and we are getting down to business, getting right to it, here on Stuff You Should Know,
because we got a lot to cover here.
That's right.
So, Chuck, I got a little bit of an intro.
Let's hear it. Was that it?
That wasn't it?
Yep.
Do you remember how homeostasis used to come up a lot?
Yes.
So for those of you who haven't been listening that long,
homeostasis is what your body and your mind and your brain wants to return to, right?
You just want everything nice and even keel and normal and without exerting too much effort and energy, right?
That's homeostasis?
That's, are you asking me?
Mm-hmm.
Sure.
Okay.
So one of the ways that your brain returns to homeostasis as fast as it can is to use shortcuts in making decisions, right?
Because if you're having to decide something, you're actively being challenged.
You're not in your homeostatic space.
So if you use a shortcut, you can say something like, I've had the red apple in the past and it was delicious.
I've eaten the brown, mushy one before and it was awful.
I'm going to eat this red apple, right?
Rather than going to the trouble of pulling both apples out and like analyzing them with a microscope and all that,
you can just kind of use a little shortcut.
That's a heuristic.
And it makes a lot of sense because your brain is like, great, I didn't use that much energy.
I made the right decision and we're good to go.
The problem that comes about, though, is that with heuristics, you're not always right.
You don't always make the right decision.
You're not always taking all the information into account.
And when that happens, you start stumbling into cognitive biases.
Yeah.
Like, this is a frustrating episode because I feel like the title could be cognitive biases,
everything you think you know is wrong.
Yeah.
Well, that's a great title.
Let's go with that.
It just made me feel like a dummy the whole time.
Oh, don't.
It's, you're not a dummy.
All humans are dummies as far as cognitive biases go.
It's not just you.
And this stuff is hardwired into us because, like I just said, we take mental shortcuts.
And the problem, Chuck, is what we're talking about mostly today are unconscious biases.
Right.
There's conscious biases.
We just usually call those biases, right?
Those are the active challenges that you need to overcome to be a better version of yourself.
These are like unconscious.
So there's not a lot you can do about it, although at the end we're going to kind of give you some tips and pointers.
but it's a challenge for absolutely everybody.
It doesn't make you dumb.
Yeah, I mean, I think the tips and things can help for sure,
but it's just part of being human, you know, the unconscious bias,
and there's not a lot we can do to completely eradicate them.
No.
And if it's a, you know, if it's a real problem, then I'm sorry.
Well, one of the big problems that we all kind of face is that we
are predictably irrational, as was said by Dan Ariely, who is a behavioral economist.
And because of that, corporations, marketers, basically everybody who wants to sell you something
knows about these things, and they can manipulate those things. They can trick you into making
decisions you wouldn't otherwise make. Yeah, for sure. And we wouldn't even be here
probably talking about this had it not been for two kind of revolutionizing
thinkers who ended up being some of the most oft-cited researchers in the history of research,
as we'll learn, especially when it comes to economics. And they were a couple of psychologists,
Israeli psychologists, named Amos Tversky. And I looked up different ways to pronounce this because we always get
guff. And I've heard everything from a hard T, Tversky, to his colleague, Daniel Kahneman,
doing more of one of those, Versky. Oh, I just heard him refer to him as Versky. I just heard him
refer to him as Big T.
The Big T?
Yeah.
Because it's T-V, but those were the two guys working together.
They developed this concept in the 70s at the Hebrew University of Jerusalem and really, like, got down to it pretty quickly as a result of Kahneman, I think, taking some issue with the Big T's research.
And I guess they kind of bonded over that or something.
Yeah, it was pretty cool because Tversky, um,
basically he was a mathematical psychologist, which any time you hear mathematical and it's to do with
something other than math, what that means is you've taken something and you've set it out in a very
standardized way so you can explore it, you can teach it based on certain facets. And the upshot of
mathematical psychology, as far as human behavior goes, these are the people who came up with the
core idea that humans behave as rational actors. We're self-interested. We take all the best
information available to make the best decision for ourselves. And Daniel Kahneman was like,
this is not at all true. And he started challenging Amos Tversky's theories. And Tversky, instead of
saying, like, no, you shut up, he was like, all right, let's go figure it out. Let's get to the bottom of
this. And because of that, yeah, like they formed this partnership that had a huge impact on the
world. Yeah, I think it's kind of heartening that they, as academics, you know, got together. There
were no ruffled feathers, or at least it didn't end up that way, and they worked together. It's
kind of a heartening thing, I think, these days. Yeah. There's got to be at least one. Yeah, that's right.
They came up with a program called the heuristics and biases program to basically, you know,
study how human beings make their decisions, how they go through life making choices when they
don't have, um, like, all the information at hand, all the most perfect information to make that choice,
or if they don't have like all the time in the world to look at the information that they do have to make that choice.
So like how are people making decisions?
How are they making mistakes in their decision making?
And they ended up coming up with a couple of different systems, one which is super quick and one which is much more deliberate.
Yeah, Daniel Kahneman came out with Thinking, Fast and Slow, which was one of those super popular airport books, you know?
Yeah, Thinking, comma.
Fast and Slow.
Yes, thank you.
Eats, Shoots and Leaves.
That's right.
And in it, he basically lays out this kind of shorthand model.
He's very explicit to say, like, this is not how your brain is actually laid out, but it's a good metaphor for it.
And system one is how you think quickly, you think almost unconsciously.
You make rapid decisions.
And that is kind of how we generally navigate life.
System two is much more deliberate.
It's where we take into account, like, different ideas.
It's where we really stop and think about something before making a decision.
And they're essentially competing.
There's something called interference.
And System 1 has a really great tendency to interfere with System 2.
And there was a psychologist working all the way back in 1935 named John Ridley Stroop,
who basically discovered the Stroop effect, which is a way of demonstrating how System 1 interferes
with the slower, more deliberate System 2.
Yeah, oh boy, I bet he patted himself on the back over this one
because it's one of those things that's so simple,
but I bet he winked at everyone like, watch this.
Yeah.
This is going to break your brain.
It's genius.
It really kind of is.
So what he did was he simply wrote down the names of colors,
but he would write the name of each color in a different color of ink.
And then he would just ask the person to say out loud
the color of ink the word is written in, not the word itself. And it is surprisingly
difficult to do that. It's just a little weird brain-breaking thing. Yeah. Yeah. So he's showing that
your system one just wants to hurry up and read it. Yeah. And it's getting it wrong. And that's,
that's interference, right? So that kind of like started to lay the groundwork for this idea that we do
have kind of competing ways of seeing the world and making decisions. And what Kahneman
was saying is that most of the decisions we're walking around making are actually the system one, super fast, shorthand decisions.
But we think that we're using our more rational mind because we make post hoc explanations for why we decided that.
And that's not to say we're all walking around with these creepy little secrets that we know we're like fooling ourselves.
We don't realize we're doing this.
That's why these biases are unconscious.
Even if you stop and think about what you're doing, you may still not come up with the answer like, oh, yeah, I was making up explanations after the fact to explain why I actually used System 2 when I didn't. It's really hard to do that.
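If you want to try the Stroop test they describe for yourself, here's a minimal sketch in Python. This is my own illustration, not something from the episode; it assumes a terminal that understands ANSI color escape codes, and the four-color list is arbitrary.

```python
import random

# Minimal terminal Stroop demo (assumes ANSI-color support).
# Say the INK color of each word out loud, not the word itself.
ANSI = {"red": "31", "green": "32", "yellow": "33", "blue": "34"}

words = list(ANSI)
for _ in range(8):
    word = random.choice(words)
    # Always print the word in a NON-matching ink color, so System 1's
    # urge to read the word fights the slower color-naming task.
    ink = random.choice([c for c in words if c != word])
    print(f"\033[{ANSI[ink]}m{word.upper()}\033[0m")
```

Run it a few times: naming the ink colors quickly is noticeably harder than just reading the words, which is the interference effect they're describing.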
Yeah, for sure. Livia gives a pretty good example of that, as far as, like, hiring somebody. Someone may make an impression in an interview that kind of locks it up from the second they walk in. Maybe they look like their mom or dad or a relative, or maybe they remind them of themselves, or, like, who knows what it could be. Then they end up getting that job. And later, if you ask the person who hired them, they might say, oh, well, it was because of this, this, this, and this. When in fact, that's really just System 2 kind
of confirming after the fact, when really it's because the guy walked in wearing a New York Giants t-shirt.
Yeah. And we'll get into some of the problems with the stuff throughout, but this is a good
example, right? If the opposite happened, if like you didn't hire somebody because they weren't
quite like you, that's an example of a bias too, even if you don't think that that's why you did
it. If you're looking at their CV afterward and you're like, oh, they didn't graduate from college,
that's why. But really, it was not because you're racist, it's not because you're a woman-hater,
it was because you're preserving your own level of comfort, because other groups that are
different than you make you uncomfortable. And that's how groups can become entrenched, right?
You just, once one group kind of dominates an organization, they tend to continue doing that
because people hire other people who they're comfortable around rather than pushing themselves
outside of their comfort zone and probably improving their organization. And that's why diversity
programs exist in the first place, because of that human tendency. Yeah. Or maybe they were just
a Jets fan.
That's possible.
I mean, no Jets fan is going to hire a Giants fan.
Yeah, well, here's a tip.
I don't know a lot about interviewing other than be yourself and try and get someone to like you,
but don't go into any interview wearing any sort of branded sports apparel.
Yeah, especially a jersey.
Yeah.
I think that says quite a bit.
Yeah, you wear that Giants jersey in there.
Well, I guess you're rolling the dice.
You've either got that job.
Right.
Or there's no way you're going to get it.
So.
Maybe it's not a bad idea then.
I don't even know what I'm, I might be wrong.
Yeah.
I mean, I guess if you dressed it up with a bow tie, maybe, you could get away with it.
But yes, it is still a gamble regardless.
Well, not all jobs.
You have to wear a suit and tie, you realize.
I know, but I'm saying, like, you dress for the part you want.
If you're wearing a Giants jersey with a bow tie, I think you're making a good impression out of the gate.
All right.
So I guess we can talk about, we're going to go through a list of about
10 different biases.
They're all pretty interesting, and I know everyone can identify with probably each of these at some point.
But before we do that, we need to point out that, like, these are all mental shortcuts or the
result of a mental shortcut, but not all of them work in the same way in how our brains work.
A lot of, you know, there could be a lot of things at play.
Emotions can come into play.
Maybe, like we were just talking about, like, it's hard to reassess something after you've
gotten a first impression.
people, humans historically, through their life,
tend to make bad guesses at things.
Because if you make great guesses,
then things like gambling would be super easy.
So all of these things come into play.
There's not just like a single way that it's a broken system.
Right.
Yeah, yeah, yeah.
But people generally, it seems like universally in a lot of these cases,
behave in these ways under the same circumstances.
Yeah.
Yeah, like some of their stuff wasn't replicable, but that's sort of standard for studies in psychology.
Like, a lot of this stuff, as we'll see, has checked out across cultures.
Yeah, which is huge, you know, considering the whole weird problem in psychology in particular.
The weird people are...
Western, educated, industrialized, rich.
What's the R? Yeah, rich.
Democratic, I think.
All right, so we'll start.
We'll leave the biggest guy for last, I think, which will be after a break probably,
but we'll start then with maybe hindsight bias.
Yeah.
And this is the idea that after something has occurred,
and we've talked about this one before here and there,
that we tend to think like, oh, well, of course that was going to happen.
In fact, not only should I have seen that coming,
it was probably inevitable that that happened.
And a lot of the time, it may be because you're misremembering your expectation
before it even happened.
Right.
Like we can rearrange our memory of how we felt about the event.
or the outcome of the event afterward to basically match the outcome.
Yeah.
I guess because we have this never-ending need to be right.
Yeah, that probably has something to do with it.
And I knew you were going to say that.
So I've got another one for you, Chuck.
All right.
Self-serving bias combined with the little fundamental attribution error on the side.
Yeah, that's a good one.
It's a good side dish.
So these things basically go hand in hand.
It's basically how we see ourselves in a great light, how we see other people in a more negative light.
Self-serving bias is basically saying if something good happens to you, it's because you are good.
Like you earned it.
It's because of you doing something right.
Something bad happens to you.
It's external forces that made that happen.
Fundamental attribution error is the exact opposite with other people.
If they do something right, it was just luck.
If something bad happens to them, it's their own fault.
So a good example of this is like if a coworker comes in late one day, you're like, they're just lazy and slack.
But then you come in late the next day and you're like, it was traffic.
Right.
That's basically those two things going hand in hand.
And those are both biases.
Yeah.
And I hope people understand that like all of those things can also be true, you know?
Sure.
So if you're thinking like, well, no, but sometimes I did deserve the thing and sometimes it was someone's fault.
Yeah, sure. That can happen.
That's, we're not, these aren't absolutes.
No, it's more just, yeah, your, your tendency to think in certain ways.
Yeah, sometimes you're going to be right.
Sometimes you're going to be wrong for sure.
Like humanity's tendencies.
Yeah, you got to take a big broad view here.
Yeah, but also you specifically.
Right.
You, James Kirkland, listening in Baltimore.
There's one, oh, man, James Kirkland is going to pull over to the side of the road right now and really freak out.
I hope, man, I hope I nailed it.
Yeah, I think you picked a common enough name.
We'll see.
All right.
So anchoring bias is another one.
This one I've fallen prey to.
I'm going to say that probably about all these.
But that is the first piece of info that you get about something can really affect,
and even in a very disproportional way, things that happen after that.
Like once something is kind of locked in, it's hard to unwind that.
Yeah, that first piece of information, it's like, oh, okay,
this is going to basically prime you in your answer, your decision, right?
Yeah.
So a good example I saw is there was a study that said like, okay, the Mississippi River is
less than 2,000 miles long.
How long is it?
And those people would say something like 1,500 miles.
And then other people would say, okay, the Mississippi River is less than 500 miles long.
And people would say, like, it's like 300 miles.
And then another group was the Mississippi River is less than 80 miles long.
Those people would answer like 60.
It's the same thing, the length of the Mississippi River,
but they were presented with this basically this priming number,
a large one, a middle number, or a smaller number,
and their answers were related to that first piece of information that they got,
and that's anchoring bias.
Yeah, and Livia pointed out another little side dish here,
which is called the decoy effect,
and that's when you will go into a restaurant, and they might,
and this is how, just kind of one way this can affect economics,
which will come up a lot,
but you'll go into a restaurant and they might have one super expensive bottle of wine on the menu.
And maybe it's even placed at the top so you see it first.
And then the other bottles of wine might seem like a decent deal after that, even if they're also overpriced.
Yeah.
Exploitation.
But all wine and restaurants is overpriced.
I hope everyone realizes that.
I mean, by a lot, right?
Yeah.
Don't they make most of their margin on that?
Oh, totally.
It's frustrating, but that's the business.
That's also, I think, Chuck, this anchoring bias is why they say you should never lead in a negotiation with your actual price that you want to go higher or lower depending on your position.
Oh, like if they ask what you want to get paid?
Yeah.
Or like you're trying to buy a car or something like that, you know?
Yeah, I hate all that stuff.
Yeah.
Yeah, if somebody, it's like, well, what do you think I should get paid, you know?
What are you looking to make at this job?
But that's not to say what you make now should be the answer.
Exactly.
You just want to add, I don't know, 50% to what you actually want, and then you have room for negotiation.
There's one other thing that's related to this, framing bias, and that's basically the same thing.
But rather than the first piece of information guiding you, this is more directly guiding you.
So, for example, some drug maker says 10% of patients die.
You're like, oh, go on, that's a lot.
Right.
You could say it the opposite way, 90% of patients live.
You're like, oh, that's great.
Right.
Same amount of people dying.
It's just framed differently to exploit your response.
To exploit your aversion to dying.
Pretty much.
And that's a human thing, isn't it?
For sure.
Shall we take a break?
Yeah.
All right.
Josh said human thing, so it's time for a break,
and we're going to come back with more biases right after this.
Welcome to the A building.
I'm Hans Charles.
I'm Alec Lamoma.
It's 1969.
Malcolm X and Martin Luther King Jr.
had both been assassinated, and black America was at a breaking point.
Rioting and protests broke out on an unprecedented scale.
In Atlanta, Georgia, at Martin's alma mater, Morehouse College, the students had their own
protest.
It featured two prominent figures in black history, Martin Luther King Sr., and a young student,
Samuel L. Jackson.
To be in what we really thought was a revolution.
I mean, people would die.
In 1968, the murder of Dr. King, which...
traumatized everyone.
The FBI had a role in the murder of a Black Panther leader in Chicago.
This story is about protest.
It echoes in today's world far more than it should, and it will blow your mind.
Listen to The A Building on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.
All right, Chuck, up to bat, is number 23, availability heuristic.
Can we put like a stadium echo effect on that?
Next to bat, Minnie Minoso.
Very nice.
So what is that?
Availability heuristic.
And I'm sure this has happened to you before, right?
Yeah.
None of these have ever happened to you,
which is the funny thing about us doing this episode.
But the availability heuristic is what you have available to call up in your brain at any given moment.
So you're going to rely more on what you can immediately think of in the moment.
And chances are what you're immediately able to think of in the moment is something that probably aligns with your worldview or something like that, which is sort of a, well, we won't talk about the C bias because that's coming up.
Well, yeah, or something that, like, really kind of gooses you emotionally.
Like that's very available because it's like, you know, loud and scary in your mind kind of.
Like if you saw something about a plane crash in the last, like, day or so,
and somebody asks you how frequent plane crashes are,
you're probably going to give a much higher estimate than you would have before that,
you know, maybe based on the number of times you've flown and nothing bad happened.
Yeah, that's a good one.
There's also inattentional blindness.
And before anybody, before we talk about this, because we're going to spoil it.
Yeah, yeah.
I want to send everybody, if you have the means to do this,
go on to YouTube and search for selective attention test,
and this is Daniel Simon's YouTube channel,
and then watch the test where the people are passing the basketball back and forth.
We'll wait a second.
All right, that's enough.
All right, great.
So hopefully you press pause and you didn't just try to watch it while Chuck was doing the Jeopardy theme.
It is short, but it's not that short.
It's like a minute and a half or something, right?
Yeah.
So tell them about this video, Chuck, because it's pretty great.
That's right.
In the video, they have a group of, what was it, like six people probably?
Six on the nose.
Six college students.
I guess three are wearing white shirts, three are not wearing white shirts.
Yeah.
And they're in a very tight, small circle.
Yeah.
It looks very awkward.
They have a basketball, I think, or was it two basketballs?
Mm-hmm.
There's two groups.
Yeah, two groups, two basketballs.
And what you're told is the task at hand is to count the number of times that people in white, the white team are passing the basketball.
So you're counting.
I was right.
One, two, three, four, four, five, six, seven.
And that's all you're supposed to do.
And at the end, you're supposed to say, you know, how many times they pass a basketball.
Right.
And now, now hit them, hit them with the good stuff.
So apparently half of the people who do this, which is astounding to me, half of the people
who watch this video and take this test,
don't notice that in the middle of it,
a person in a gorilla suit walks into frame
and turns to the camera and I think beats on their chest
and then walks out of frame.
Like in the middle of these people throwing these basketballs
around the gorilla,
half of the people are paying such close attention
to counting how many times the people wearing white t-shirts
are passing the basketball
that they do not notice the gorilla
until the end of the video when it's pointed out.
Yeah, and we're assuming it's a person in a gorilla costume.
I'm hoping.
First of all, that might be a bias at play, that it's not a real gorilla.
Well, I guess it depends on the amount of funding they had.
Yeah, it looks actually like the gorilla from trading places.
It totally does, which is like, were they even trying?
No.
Okay, kid. I just wanted to make sure.
Did you watch this video before you knew about this?
I had heard it from some friends who do magic, and they were basically talking about this on a little
podcast that they made.
Whoa, whoa, whoa, whoa.
You have friends who do magic?
So, you know, our friend Toby?
Oh, yeah.
He has very good friends that do magic.
Wow.
Like professionally?
Yes.
I became kind of friends with him.
So, yeah, I guess I do.
I have friends that do magic.
Well, buddy, next time we are in Los Angeles at the same time, our good friend and
friend of the show, Adam Pranica.
Yeah.
Is a member of the Magic Castle.
And it's one of his and his wife Elaine's
favorite things to do, to take friends to the Magic Castle. So have you ever been? No.
It is great fun. Adam Pranica just keeps getting better and better, doesn't he? Yeah, I haven't
been with him, but I've been a couple of times, once many, many years ago, then another probably
10 years ago. But it's a lot of fun. I'm a big fan of magic. Yeah. And it's pretty magical when
people don't see that gorilla in a very tight frame. It's not like it's on a big basketball court
and the gorilla sneaks in there. Like there's six people and then there's a seventh, very clearly.
It is very obvious. So, I mean, what this is showing is that our attention is limited, right?
When we're really focused on a task, even the half of the people who didn't notice the gorilla still saw it. They were just so focused on the task that their brains were getting rid of information that was unrelated to the task, because it's not pertinent. It can become pertinent, though, when that gorilla decides to attack you. And so this is a cognitive bias we have where we're
ignoring seemingly unimportant information to take in the stuff that's related to the task at hand.
Yeah. You know where they could really get away with this? Because where you have great
concentration, is it a professional sports game on the Jumbotron when they have the baseball under
the helmet or whatever? Uh-huh. And then they're moving them around and you got to find,
you know, it's like three-card monte. Yep. Because you're concentrating so hard on that, they could
put whatever they wanted on that screen while that's going on. And I bet you most people would not notice. Or maybe
I guess it's half if that's what they found.
Yeah, I'll bet you're right.
I'll bet you're right, man.
I was just trying to think of something where you're super trying to follow.
Because I was happy I came up with the correct amount of passes at the end.
You did.
You said, what does it mean if I noticed the gorilla and got the correct number of passes?
And I said it means you're a perfect human.
That's right, which we all know is not true.
None of us are, Chuck.
None of us are.
Yeah, I know.
So there's another one that you may have heard of before,
even if you've not heard of any of these other ones,
called the Dunning Kruger effect, it became kind of viral because if you take it through the
pop culture meat grinder, it becomes much more simplified and kind of loses some of its
accuracy, but people still like it because it's a good way to put other people down.
Yeah, it is. This is the idea, the correct idea is that people with little understanding in an
area tend to overestimate their ability and their knowledge about something. Right. Because they don't,
They know so little they don't even know what they don't know kind of.
Right, exactly.
But what you were talking about, it's kind of been transformed into, like, morons are the most braggadocious, which can be true.
It can be.
You know, I think that's one of the things.
Like you said, you can be right with cognitive biases.
You're not wrong with them all the time.
So yeah, that kind of supports that.
But that's not what the Dunning Kruger effect actually says.
You said it.
And then there's the opposite way, too, where the more
experience you have, the more expert you are in a field, the more you assume that it should be
easier for you than it is. Yeah, that's a very valuable thing to understand, I think. And you get
much further in life if people are like, well, you're the expert. And the expert's usually the one going,
yeah, but I don't know, maybe we should hold off because, you know, X, Y, and Z. Right. Yeah. So that's
the actual Dunning-Kruger effect. And I saw that it's, it's being assailed
right now. People are questioning even the basic version of it, like the actual academic
version of it. Yeah, so we'll see what happens with that. Oh, interesting. We've got the gambler's
fallacy next, and that is, oh boy, if you have ever gambled anywhere, but if you like go to
casinos and stuff like that, you're going to see this all over the place. You're going to hear it
spoken out loud. And this is the idea that you find patterns where there are no patterns. So if you're
at the blackjack table and you hear the person next to you like, well, oh man, see, I've lost four in a row,
so I'm gonna bet, like, I'm gonna go all in on this, because I'm bound to win, because I've lost four in a row.
There's no way I'm gonna lose five in a row. Right. The problem is, each of those hands of
blackjack are unrelated to one another. They don't form a pattern, but you are predicting a pattern
that just doesn't exist. Yeah, that means you're a fallacious gambler. It can get you in real trouble.
I mean, you can do the same thing on the playground with coin tosses.
In fact, coin tosses, I think, is a lot of times the way they sort of try and prove this.
Yeah, because each coin toss, considering, like, you're playing with a perfect, unflawed coin that has no bias whatsoever,
each coin toss is totally unrelated to the last.
So you could get 100 heads in a row, and that doesn't mean anything.
It doesn't mean a tail is coming because each of those hundred heads in each of those coin tosses,
had nothing to do with the last one or the next one.
I know.
That's hard to break out of, though, because it seems very human to think, like, they flipped four heads in a row.
There's no way there's going to be a fifth.
Well, that's another reason why this is so hard.
We're hardwired to find patterns and stuff.
It's a way to navigate the world.
It's the way we navigate the world is by finding patterns so that we can recognize things in the future
and thus spend less energy getting back to homeostasis.
That's right.
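Their point about independent flips is easy to sanity-check numerically. This is just an illustrative simulation, not something from the episode, and it assumes a fair simulated coin: even right after a streak of four heads, the next flip still comes up heads about half the time.

```python
import random

# Simulate a long run of fair coin flips and check what happens
# right after every streak of four heads in a row.
random.seed(42)
flips = [random.choice("HT") for _ in range(1_000_000)]

# Collect the flip that immediately follows each four-heads streak.
after_streak = [flips[i + 4]
                for i in range(len(flips) - 4)
                if flips[i:i + 4] == list("HHHH")]

p_heads_after_streak = after_streak.count("H") / len(after_streak)
print(round(p_heads_after_streak, 2))  # close to 0.5, not "due" for tails
```

A tails is never "due": the proportion of heads after a streak stays right around 50%, which is the independence the gambler's fallacy ignores.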
This is all so interesting to me.
I love this stuff.
I knew that you loved it. This is Josh Clark Central.
I love observing it because I can't grasp what it feels like to suffer any of these.
So just to discuss it in this way is really fascinating to me.
All right. Let's talk about the base rate fallacy. That means you put more weight on just like one very specific piece of information instead of looking at all the pieces of information that have come your way.
Yeah, and usually it's individuated information, meaning, like, say, some quality or
characteristic of one person, and then you're ignoring the base rate, which is like pure statistical
information about what you're trying to figure out. And a really good example of this is like,
let's say that you're looking at somebody who is super fit, a woman who's very fit and athletic,
and you're asked, do you think that woman is a personal trainer or a teacher? Because the basically
the only evidence you have there is that this woman is very athletic and fit, you might say personal
trainer. But if you took all the base rate information into account, you would know that the
portion of teachers who are very fit and athletic may be small compared to the
total number of teachers, but it's still much larger than the total number of personal trainers in the
world. So statistically speaking, it's much likelier that that very fit, athletic woman is a teacher
and not a personal trainer. You don't do that, because you think: personal
trainer, athletic, fit, must be a personal trainer. You've just fallen prey to the base rate fallacy,
my friend. Yeah, but she has on yoga pants and hokas. Exactly. That doesn't narrow down anything
these days, you know. How about the mere exposure effect, Chuck? And like mirror is part of it.
I'm not making a judgment about it.
That's right. That means just merely being exposed to something has a vast impact.
So the more we experience something, the more we like it,
which is why you see that commercial for the thing over and over and over,
like that Burger King ad, over and over and over.
Although I wouldn't say that you liked that one more the more you heard it.
That's the outlier for me too.
But that's the idea, though.
There's just mere exposure.
We'll get you there.
And then there's a related thing
called the illusory truth effect,
which is basically that repeated exposure to a lie
causes you to eventually believe in it
if you hear it enough times,
even if you initially knew that it wasn't true.
So that makes me wonder if like it just wears you down over time.
Like your brain is tired of defending itself
against being assailed with a lie.
And it's just like, fine, that's true.
I don't care.
Yeah, I mean, sure.
Politics certainly comes to mind.
Repeat the lie, repeat the lie, repeat the lie.
Yeah, and I mean, like, it's a viable way to exploit people's cognitive biases in that respect.
Should we end up, should we close out with the big daddy of them all, the big C, the big confirmation bias?
Yeah, let's do it, baby.
All right.
Why don't you start this one?
Okay.
So there's a guy named Peter Wason back in the 60s.
He coined the term confirmation bias.
And he basically had an experiment that's really clever.
It's hard to understand at first, but it's very clever.
He basically said, hey, here is a sequence of numbers, two, four, and six.
Figure out what the pattern is.
Just to be clear, this is really hard to explain.
If you find somebody who can explain this well, you'll get it.
But I don't think I'm a candidate for that.
I think we all know that I'm not going to explain this very well.
Oh, I don't think that's true.
Do you want to take a crack?
Okay, the original numbers were 2, 4, 6, and people might tend to
go with, like, all right, 8, 10, 12.
And they're thinking, all right, it's an even number sequence, ascending even numbers.
Right.
And they would say, no, that's not correct.
And you'd say, well, maybe it's 4, 8, 12, and it's like doubled or something.
And they would say, well, that's also incorrect.
And then you're at wits' end, because what you haven't done is just try any ascending order.
You didn't go 1, 79, 300.
All right, let me take a crack at it.
you ready?
Sure.
So the original numbers were 2, 4, 6, and the participants would try to come up with an explanation of, like, what pattern are those numbers following, right?
So you might say, like, does 8, 10, 12 work?
And they would say, yes.
And you'd say, okay, well, then you're just looking at even numbers.
And they would say, no.
You still got this right.
This still fits the pattern, but your hypothesis for it is wrong.
Right.
That's the key.
Right.
Here's where the confirmation bias came in.
People would then go back and continue trying to find versions that fit their hypothesis to explain this, even though it was wrong.
Yeah.
Rather than take that result and say, okay, this fits the pattern, but my hypothesis is still not correct,
And start trying to break their original hypothesis by coming up with like just completely
random stuff that doesn't fit their original hypothesis.
In which case, they might have said something like, does 1, 6, 27 work?
And they would say, yes, that fits.
And then that might lead the person to see that actually the only thing that has to be correct to be part of the model is that the numbers have to ascend in order.
That was it.
But people, people, man, just go.
You might even try it.
You might even try to break it by saying 3, 5, 7, but you're still using a version of that original hypothesis.
Exactly.
That there's, yes, say like that you think it goes up by two or something like that.
Yes.
Like, very few people go back and try to break their own hypothesis.
And that's the point of confirmation bias.
Let's move on from that experiment.
The point of confirmation bias that this shows, if you actually can understand it from other people than us,
is that we tend to take our initial ideas, our beliefs in a lot of cases,
and look for information that supports those and discard information that doesn't support it.
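Wason's setup is easy to mock up. Here's a minimal sketch (the test triples are hypothetical, but the hidden rule is the one from the actual study: the numbers just have to ascend):

```python
def fits_rule(triple):
    """The experimenter's hidden rule: the numbers simply ascend."""
    a, b, c = triple
    return a < b < c

# Confirmatory guesses, all generated from the narrower "goes up by 2" hypothesis.
confirming = [(8, 10, 12), (20, 22, 24), (100, 102, 104)]
print(all(fits_rule(t) for t in confirming))  # True: every one "works"...

# A disconfirming probe that deliberately breaks the "up by 2" hypothesis.
print(fits_rule((1, 6, 27)))  # True: so does this, exposing the broader rule

print(fits_rule((6, 4, 2)))   # False: only ascending order actually matters
```

Endlessly testing triples from the `confirming` list can never distinguish "up by 2" from "any ascending numbers"; only probes designed to break the hypothesis can.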
Right. And that's, of course, you know, I mentioned politics a minute ago. That's where you most
firmly see that these days is you are in a media bubble probably. I don't know a ton of people
that get their news sources from completely disparate points of view.
And a lot of the news you're getting these days is so slanted to begin with, it's probably
not even the best example to use anymore. But that's a long way of saying you're probably
going to be seeking out news that confirms your beliefs, because you don't want your beliefs challenged.
Yes. I mean, I, like everybody else, I have trouble with that as well. But I have to boast,
Yumi is actually really good about getting news from different sources. Yeah. And one reason that I find
it difficult to do is because I have like physical reactions sometimes. Yeah. And that is a thing.
That's one of the reasons why they think we use confirmation bias: because it sucks to have your beliefs challenged, right?
It's really difficult to overcome that.
And there's this thing called belief perseverance, which is even when your beliefs are challenged with say like an indisputable fact, you can still use confirmation bias to preserve that belief because we usually attach our identity or build our identity around our beliefs.
That's who we are.
So it's like we're being personally attacked.
And then even more than that, there's the backfire effect, right?
Did you see that?
I did not.
So the backfire effect says that in the face of being presented with information that basically counters your own beliefs,
it can make you actually solidify your original incorrect belief in the first place, right?
You'll believe it even more strongly, even though you've just been given facts that contradict it.
So we really, really hang on to our beliefs as much as possible.
And that is a huge, huge thing that humans trip over.
Confirmation bias is probably the granddaddy of all biases, I think.
Yeah, that's why I saved it for last.
Yeah.
And, you know, there are a lot of reasons people do this. You might be protecting yourself, like your self-esteem,
because otherwise you're admitting that you may have been wrong about something,
and it takes a big person to do that.
You want to believe that you're right about stuff.
And it also might just be difficult to process more than one hypothesis at once.
It might just be a little too brain-breaking.
Yeah, because once you lock into an explanation, your brain just, it's like, no, we've got it.
We don't have to figure this other thing out.
Homeostasis, homeostasis, you know?
It is very hard to entertain something that is counter to what we already think is true.
That's right. All right, everyone. As you can tell by the clock, we are taking our second break. This is a long one, and we're going to come back and talk about behavioral economics right after this.
Well, wait, before we do that, let's try to explain this confirmation bias study again.
Yeah, we should. All right, we'll be right back.
Welcome to the A building. I'm Hans Charles.
I'm Manilic Lamouba. It's 1969. Malcolm X and Martin Luther King Jr. have both been assassinated, and Black America was at a breaking point.
Rioting and protests broke out on an unprecedented scale.
In Atlanta, Georgia, at Martin's alma mater, Morehouse College,
the students had their own protest.
It featured two prominent figures in black history,
Martin Luther King Sr. and a young student, Samuel L. Jackson.
To be in what we really thought was a revolution.
I mean, people would die.
In 1968, the murder of Dr. King, which traumatized everyone.
The FBI had a role in the murder of a Black Panther leader in Chicago.
This story is about protest.
It echoes in today's world far more than it should,
and it will blow your mind.
Listen to the A Building on the iHeartRadio app, Apple Podcasts,
or wherever you get your podcasts.
Okay, I promised to talk about behavioral economics.
A lot of the work that Tversky and Kahneman did
was super applicable and kind of revolutionary in a lot of ways
for the world of economics and how people's buying behavior is affected.
They didn't invent it.
Like Adam Smith wrote about stuff like this.
And starting around World War II is when economists started really kind of homing in on stuff like this,
like using mathematical models.
And it all kind of started with the assumption that people and companies and organizations
are really just trying to pursue their self-interest at the end of the day.
Yeah, and they're going to make the most rational decision.
Yeah.
And that's just, and they were like, yes, we know people make irrational decisions, but these are outliers.
Like if you take all of the information or data in aggregate, you'll see that humans generally try to make the most rational decision.
That's just not true.
People don't do that.
We make all sorts of irrational decisions that very frequently run counter to our own best interests.
And again, we'll even reject information that would help us make decisions
in our own best interest if it counters our beliefs.
So there's a guy named Richard Thaler who ended up becoming a colleague of Tversky and Kahneman,
and he took some of their papers and he realized that these mistakes, these cognitive biases,
they can be predictable, right?
You can actually basically map how somebody's going to make a bad decision,
and this became the basis of behavioral economics.
Yeah, well, let's talk about this prospect theory, because this was from Tversky and Kahneman.
It was an article from 1979, "Prospect Theory: An Analysis of Decision Under Risk."
And Livia says it's probably the most cited economics paper of all time.
Like, this was a revolutionary, landmark paper.
And they didn't write a ton of papers together.
They did like eight total, which just shows what an outsized impact they had.
But they talk about in this paper a lot of attitudes about risk.
But they talk about in this paper a lot of attitudes about risk.
One is loss aversion, which is the idea that you're going to experience more emotional suffering when you lose money than you will gain happiness if you gain something.
So you may pass up an offer that gives you equal odds of winning $25 or losing $20.
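Tversky and Kahneman modeled this with a value function that weighs losses more heavily than gains. A rough sketch, using the loss-aversion parameters they estimated in later work (roughly lambda = 2.25, exponents of 0.88; these specific numbers aren't from the episode):

```python
ALPHA = BETA = 0.88  # diminishing sensitivity to larger amounts
LAMBDA = 2.25        # losses loom roughly 2.25x larger than gains

def subjective_value(x):
    # Prospect theory value function: concave for gains,
    # steeper and convex for losses.
    return x ** ALPHA if x >= 0 else -LAMBDA * ((-x) ** BETA)

# 50/50 gamble: win $25 or lose $20. In plain dollars it's worth +$2.50...
expected_dollars = 0.5 * 25 + 0.5 * -20
# ...but subjectively, the possible loss outweighs the possible gain.
felt_value = 0.5 * subjective_value(25) + 0.5 * subjective_value(-20)
print(expected_dollars > 0, felt_value < 0)  # True True: we pass on the bet
```

So a bet that is mathematically in your favor can still feel like a loser, which is exactly why people pass it up.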
There was another example I think kind of gets it across better. There was an experiment in 1996 where
they gave participants a lottery ticket.
And before you scratched it off or whatever, let's say it was a scratch-off, they said,
all right, well, hold on.
Before you do that, I'll trade you another lottery ticket plus $10 in cash.
And for no logical reason at all, people tended to think that that first ticket was the one,
even though it was just a lottery ticket.
There was no difference at all.
They would turn down that extra $10.
I think less than 50% of them took that deal.
Yeah, because in giving away,
trading that first ticket, they
risked a loss, even
though the gain was right there.
Just trading the ticket, you got an extra 10 bucks, right?
Yeah.
That's fairly irrational.
We also have a lot of trouble
with rare events.
Yeah.
We tend to overestimate them.
It can be a positive event.
It can be a negative event.
But we're really bad at probabilities
and statistics.
And this is
essentially, it's like
you won't let your kid walk to school because you're afraid of your kid being kidnapped,
even though the chance of your kid being kidnapped is just ridiculously low.
It's technically irrational, even though very few people would fault you for that.
But it's still an irrational decision.
Yeah, for sure.
We talked about that in the, well, was it?
Free-range kids?
Maybe.
I can't remember.
It tied in with a satanic panic and stuff like that.
that I think. Definitely did.
Back in the day.
Relative rather than absolute terms, that is a monetary theory where, and this is a great example, you might drive an extra 10 minutes to buy a shirt from a store selling it for 20 bucks rather than the one closer to you that sells it for 30, because that saves you 10 bucks.
But you won't do the same to save $20 on a car, even though you may only have to drive five minutes down the road, because you're like,
eh, the car's $20,000, what's $20?
But really, in absolute terms, you're saving twice as much money as you did on that t-shirt purchase.
But it's, like you said, it's all relative.
Yeah.
Again, totally irrational.
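The shirt-versus-car comparison is just arithmetic. A quick sketch with the episode's shirt numbers (the car discount figure is the hypothetical part):

```python
# Shirt: drive 10 extra minutes to pay $20 instead of $30.
shirt_price, shirt_discounted = 30, 20
# Car: won't drive 5 minutes to pay $19,980 instead of $20,000 (made-up figures).
car_price, car_discounted = 20_000, 19_980

shirt_saved = shirt_price - shirt_discounted  # $10
car_saved = car_price - car_discounted        # $20: twice the absolute savings

# But relative to the sticker price, the car discount feels like nothing.
print(shirt_saved / shirt_price)  # a third off the shirt
print(car_saved / car_price)      # a tenth of a percent off the car
```

Rationally, $20 saved is $20 saved either way; the fraction of the sticker price shouldn't matter, but it dominates how the savings feel.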
But all this stuff relates to economics.
And like you said, the stuff can be replicated.
There was a 2020 study that looked at the prospect theory in particular.
And this major study, conducted in 19 countries and in 13 different languages, found that it held up.
Not bad.
No, that's not bad at all.
And so it's not just economics.
It's not just being exploited by
the wine list or
you know,
Kentucky Fried Chicken or something like that
to make you buy
their stuff.
This actually, this can have like life
and death consequences too.
Although I guess so can
wine and Kentucky Fried Chicken.
You know what you're going to get at Kentucky Fried Chicken?
What?
Pepsi.
That's right. You will get some Pepsi. You know how I know that? Because you just
had Kentucky fried chicken? I did. After our tour, I was a little tired and needed just some fried chicken,
so I got fried chicken. What do you get just the original or extra crispy? Because you're crazy if you
don't get extra crispy. I get extra crispy, but they were out, uh, or they couldn't satisfy it. I got the three-piece. They had two more pieces of extra crispy, and they asked if one piece of, uh, original recipe was okay. And I was like,
Yeah, sure. I'm not going to not eat a piece of chicken.
It is good. They do chicken right.
Yeah, they do.
Did you get the mashed potatoes and gravy?
You know it, buddy, times two, and extra biscuit. I went all in.
It was a rare eating frenzy.
Did you drink a Pepsi?
I did.
Awesome.
Well, that all fits somehow.
I know. I don't know how, but it somehow fits this episode.
So where I was saying that this can be life and death is with medicine.
because although doctors have God complexes
and like to present themselves as infallible,
they are quite fallible.
They're humans,
and they can suffer the same cognitive biases as us,
but they have your life in their hands.
We rarely have others' lives in our hands.
Yeah, do you watch the show The Pit?
I tried, and it just did not grab me.
I gave it like 10 minutes,
but I hear nothing but good things.
Yeah, I mean, I really like it.
I've never been a hospital show guy,
so this is kind of one of my first forays into it,
But I like it a lot.
I haven't started season two.
But I noticed, when reading through these medical biases, that they do a, or at least Noah Wyle does a really good job on the show with these younger residents trying to bust through these.
And a lot of this stuff comes up.
He doesn't say, hey, that's affect heuristic.
He just will talk about what that is.
And now that I know the definitions, I'm like, oh, what he's talking about is an outcome bias or an anchoring bias.
It's fairly interesting.
Yeah, rather than, say, being presented with a really high price for a bottle of wine to make the other overpriced wine seem like a bargain, this can be like your first lab work comes back and that forms the anchored impression of your condition.
And even as new lab work comes back, that doctor may fail to adjust their view of your condition because they're not taking into account this new stuff.
They're giving more weight to that original, that original number.
So, yeah, and that's just the anchoring bias and one way that it can affect things.
Like you said, there's all sorts of other ways for it to happen.
And all of it can result in poorer outcomes for patients just because their doctors are humans.
And we don't really approach cognitive biases in a really methodical or deliberate way.
Yeah, in fact, now that I'm thinking about it, they do this so much on the show,
the show could be called Medical Confirmation Bias.
Nice.
Because you see it all the time.
Outcome bias is when you're convinced a shift in the patient's health
is the result of a treatment.
Like, it's because of that thing I did.
Or the affect heuristic that I mentioned,
an emotional reaction to a patient
kind of overrunning, you know,
deliberating on the thing in a logical way.
This happens all the time on the show.
Yeah, well, another field this happens in is forensic science,
which we've gone to great lengths
to kind of point out is junk science in most cases.
And a lot of that junk is just based on cognitive biases.
Yeah, for sure.
I mean, certainly the way they do lineups is flawed.
I mean, the way they, I feel like you're right.
We've done this a lot on the show.
The way they have done a lot of this is super flawed.
And I think maybe they're looking at it some, but not a lot.
No.
So if you want to fight cognitive bias in your own mind, Chuck, what do you do? What do you do?
Well, there's a list of good tips here, and I think these are pretty good tips. The first one is just being aware that you have these, which is something that we've already kind of worked through on this episode, except for you, of course, because you don't have these.
Sure. But studies show that, like, just being aware, it's not one of those things where, like, being aware is half the battle. It's like being aware seems like 2% of the battle.
Yeah, it's like, you know,
you're aware that you have an unconscious bias.
It doesn't make you understand the bias.
You just know that they're there, right?
That's the problem with it.
It's unconscious.
What else?
There are some, like, actual things you can do, like, delay decision making.
Like, don't come to snap judgments.
Go get more information.
Go get information from a contradictory source or different source or something like that.
And then, kind of tied into that, you can have, like, personal rules. Like if there's a big decision, you will not make that decision until you've slept on it.
Yeah. For example, don't buy a TV unless your friend says, yeah, good idea.
Try and consider your past experience, for sure, because optimism bias could come into play.
Like, hey, worked out last time. Yeah. Like, why would I take more time this time?
Yeah, and that's another way that you can kind of do that, an exercise you can do, is write down your expectations for an outcome, and then go back and look at it afterward and see if you were right or not.
That can kind of help you realize, like, I do kind of tend toward the optimism bias.
Yeah, because I believe that was one of the other biases: it's hard to recognize because you're biased, in that you misremember what you thought going into it.
So writing it down is a good, that's a good one.
Right.
But if you're super, super unconsciously biased, you might be like, someone else wrote this in my handwriting.
I've never been this wrong.
What about Thomas Bayes and Bayesian reasoning?
So he was a minister from the 18th century, and he basically came up with a standardized
formula for taking into account the probability of an outcome, right?
So I saw this on LessWrong.com, founded by one of the guys
who wrote If Anyone Builds It, Everyone Dies, about AI, Eliezer Yudkowsky.
The whole point of LessWrong is to overcome your biases in a methodical
way. And they love Bayesian reasoning. And it basically says there's no such thing as something being
just true. Everything is just a probability, and you can kind of try to determine how probable something is
based on whatever evidence you can gather about it. Just basically going through life like that.
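As a sketch of Bayesian reasoning, here's the teacher-versus-trainer question from earlier worked through Bayes' rule; every number below is made up purely for illustration:

```python
# Hypothetical population sizes (the base rates).
n_teachers, n_trainers = 4_000_000, 300_000

# Hypothetical likelihoods of being "very fit" in each job.
p_fit_given_teacher = 0.10
p_fit_given_trainer = 0.95

# Expected counts of very fit people in each group.
fit_teachers = n_teachers * p_fit_given_teacher  # 400,000
fit_trainers = n_trainers * p_fit_given_trainer  # 285,000

# Bayes: of all the very fit people in these two jobs, what share are teachers?
p_teacher_given_fit = fit_teachers / (fit_teachers + fit_trainers)
print(round(p_teacher_given_fit, 2))  # 0.58: the base rate wins
```

Even though almost every trainer is fit and few teachers are, there are so many more teachers that a very fit person is still more likely a teacher, which is the base rate fallacy done right.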
You know who hates that website?
Who?
L-E-S-R-O-N-G, that dude who started his own personal comedy website, lesswrong.org.
That's right.
He's just getting smashed.
What else, Chuck?
Um, what else is there?
Cultivate a growth mindset.
That's a big one.
Hey, I make mistakes.
I screw things up.
And like, I need to recognize that and try and grow from that rather than, you know,
just confirming my own biases constantly.
Yeah, maybe like looking around at some of the ways that you're commonly exploited,
say, like, by advertisers. Like scarcity is one.
When somebody says act now, supplies are limited, they're creating a scarcity mindset in you.
Social proof, basically like these people like this, so you probably should too, and you're like,
oh, I should like that too. Yeah. And then two other things I saw. There's something called
cognitive bias modification, I think is what it is. Okay. You can use this for, like, treating
anxiety, right? Like, people, like, tend to seek out negative facial expressions. Oh, yeah. And this
treatment is like, like, here's a thousand frowny faces. Find the smiley face in there. And just
screen after screen, you're looking for the smiley face and you're training your brain to stop
putting as much weight on negative facial expressions. Just using, like, basically, exploiting your
cognitive bias to get over your cognitive bias.
Oh, wow.
And then the last thing, Chuck, is apparently AIs are starting to show signs of emergent cognitive biases, because they use heuristics too.
So they're starting to make errors in judgment in predictable ways, which are cognitive biases, just like humans.
Rob Zombie, more human than human.
That's right.
You got anything else?
I got nothing else.
This is a good one.
This is fun, Chuck.
I'm glad we did this.
Well, since Chuck and I both like this episode, that means we have no choice but for listener mail to be triggered right now.
I'm going to call this follow up on Sebastopol, because I wondered what the connection there was.
Sure.
Because if you didn't listen, there's Sebastopol, California, and we were talking about Sevastopol in the Crimean War, and I was like, there's no way that's a coincidence.
And it's not.
Hey, guys, listened to the podcast on the Light Brigade. From Sonoma County:
our Sebastopol was named after Sevastopol.
And here's a little information.
The settlement was apparently originally named Pine Grove,
and the name change to Sebastopol
was attributed to a bar fight in the 1850s,
which was allegedly compared by a bystander
to the long siege of the seaport of Sevastopol during the Crimean War.
Wow.
So the original name survives only in the name of the Pine Grove General Store downtown.
And that is it. There's also the Russian River there, Russian River Valley. So
apparently there is some Russian influence in that area, which I didn't know about. And that is from
Marsha Ford. Yeah. Also, we want to apologize to all of our Iron Maiden fans who wrote in to be like,
dude, that song, The Trooper, is about that whole battle. Yeah, I didn't know.
I like Iron Maiden, but I didn't have as much shame upon my head as you.
But reading the lyrics, it doesn't say, you know, Crimean War, Charge of the Light Brigade, does it?
I don't know. I haven't heard it in a while. I'm a big fan of the poster. I love the poster a lot.
Yeah, me too.
Well, sorry, all of you Iron Maiden fans out there. We'll try to do better next time.
Yeah, missed opportunity.
Who was that that wrote in about Sebastopol?
That was Marcia, I believe.
Thanks, Marsha. Marcia, Marcia, we really appreciate you.
And if you want to be like Marcia, you can email us as well.
Send it off to Stuff Podcast at iHeartRadio.com.
Stuff You Should Know is a production of iHeartRadio.
For more podcasts from iHeartRadio, visit the iHeartRadio app,
Apple Podcasts, or wherever you listen to your favorite shows.
Over the last couple years, didn't we learn that the folding chair was invented by black people
because of what happened in Alabama?
This Black History Month, the podcast Selective Ignorance with Mandy B.
unpacks Black history and culture with comedy, clarity, and conversations that shake the status quo.
The Crown Act in New York was signed in July of 2019, and that is a bill that was passed to prohibit discrimination based on hair styles associated with race.
To hear this and more, listen to Selective Ignorance with Mandy B from the Black Effect Podcast Network on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.
This is an iHeart podcast. Guaranteed human.
