3 Takeaways - The Knowledge Illusion: How Overconfidence Shapes Our Lives (#256)
Episode Date: July 1, 2025
We’ve landed on the moon and built global networks, yet most of us don’t understand how a toilet works. Cognitive scientist Philip Fernbach explores the paradox of human intelligence: our success depends on shared knowledge, not personal depth. But that creates an illusion: we think we know far more than we do. How does this illusion quietly shape our politics, beliefs, and risks? And is it time we all got a little more curious, and less certain?
Transcript
Our brains can actually store only a very limited amount of information.
Do people overestimate their understanding of the world?
And if our knowledge is more superficial than it seems, what are the implications?
Hi everyone, I'm Lynn Toman and this is 3 Takeaways.
On 3 Takeaways I talk with some
of the world's best thinkers, business leaders, writers, politicians, newsmakers, and scientists.
Each episode ends with three key takeaways to help us understand the world and maybe even
ourselves a little better. Today I'm excited to be joined by Phil Fernbach. He's a professor at the Leeds School
of Business at the University of Colorado at Boulder. He studies how people think and is the
co-author of The Knowledge Illusion, which was chosen as an editor's pick by the New York Times.
He is the perfect person to ask why we think we know so much more than we do, and what the profound implications are for individuals and society, as this is the topic of his book, The Knowledge Illusion. Welcome, Phil, and thanks so much for
joining Three Takeaways today.
Thanks so much. I'm excited to be here.
Phil, how do you determine how much people know compared to what they think they know?
Well, in one kind of experiment, we would use a three-part design.
So we would start by asking them to assess their own knowledge of something.
How well do you feel you understand this?
Then we would ask them to explain in detail how it works.
And then we would ask them to re-rate their knowledge.
And the difference between their rating at time one and their rating at time two after
they've engaged in the explanation is the measure of how overconfident they were before
they tried to explain.
It turns out that we know very little about the way that the world works.
And yet that initial impression we have
is that we understand things in some level of depth.
We call that the knowledge illusion.
Sometimes the more jargony term
in the cognitive science literature
is the illusion of explanatory depth,
the feeling that we can explain things
much more deeply than we can.
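As a purely illustrative sketch, the three-part design described above comes down to simple arithmetic: the drop between the pre-explanation and post-explanation self-ratings is the overconfidence measure. The rating scale and all data below are invented for illustration, not taken from the research.

```python
# Minimal sketch of scoring the three-part design described above.
# Ratings use a hypothetical 1-7 self-assessment scale; all data are invented.

def overconfidence(pre: float, post: float) -> float:
    """Drop in self-rated understanding after attempting an explanation."""
    return pre - post

# (pre-explanation rating, post-explanation rating) for a few participants
ratings = [(6, 3), (5, 2), (7, 5), (4, 4)]

scores = [overconfidence(pre, post) for pre, post in ratings]
mean_drop = sum(scores) / len(scores)

print(scores)      # per-participant drop in rated understanding
print(mean_drop)   # a positive mean is consistent with the illusion of explanatory depth
```

On this toy data the mean drop is positive, mirroring the finding that explanation deflates people's initial sense of understanding.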
And where do people experience this illusion of knowledge?
Is it in everyday objects?
Is it in political issues?
Is it in scientific topics?
Across the whole gamut of those things.
And that's why I got so interested in this,
because I started by looking at work
where they tested everyday objects, like the toilet,
or the zipper, or a stapler.
But when I started looking into this,
I started doing work in the domain of politics,
political issues, and scientific issues like global warming
or the safety of genetically modified foods or vaccination.
And I found that this illusion is very broad
and applies to all these different domains.
So shocking to me.
And that essentially enables us to believe
that our opinions are justified by
our knowledge and that our actions are grounded in justified beliefs, even though our knowledge
is an inch deep.
That's exactly right. The fact that we oversimplify the world and see it only at a simple level allows us to feel that those issues are not as complex as they actually are.
And that gives us the support for taking a really strong position on them.
And that also gives us the confidence for bold actions and progress.
Exactly.
Who would care if you just had a position on an issue? But our positions determine our behavior, and that's what we really care about.
And can you give some examples?
Sure. If somebody believes, for instance, we just had a horrible attack in Boulder,
Colorado, where innocent protesters were attacked by a person who threw Molotov
cocktails into the crowd. And that's an action that is definitely predicated
on that person's belief about the issue.
And if we ask the person to explain all
of the intricacies of the issue, I
would bet dollars to doughnuts that they probably
have an oversimplified view.
What are the benefits of this illusion of knowledge?
First, let me take a step back and talk a little bit
about where we think the illusion comes from.
People are not really built for individual level thinking.
What we're really built for as human beings, what
makes human beings special and different from any other animal,
is our ability to collaborate in these vast networks where
we share knowledge.
And different members of our community
have different pieces
of knowledge, and we have the cognitive capabilities to be able to combine knowledge, collaborate,
specialize, and share the burden that allows us to pursue really complex goals. That's
what we think gives rise to the illusion. So we participate in these communities where
we're just very used to knowledge being not in our own heads, but out there in our communities, and everything works. By virtue of participating in these communities, we don't draw very firm boundaries between the knowledge in our heads and the knowledge that exists out in our communities. And so it's
like how can people be so smart and so dumb at the same time? That's sort of the paradox at the heart of human beings.
People engage in crazy behavior.
They believe crazy things.
You look at somebody and you say, how can they possibly believe that?
What is wrong with them?
And yet you look at human society and it's like, wow, we went to the moon.
We do incredible stuff.
That's the bright side of this illusion: it allows us to participate in our communities in ways that can really be productive and successful.
What we're talking about is overconfidence.
People are habitually overconfident about their knowledge.
And overconfidence can be a very dangerous thing, but it can also be a really important
thing. And if you think about, for instance, an entrepreneurial culture where people like to undertake business
risks to start new companies, to chase after new ideas, if you actually understood all
the complexity involved and the likelihood of failure, most entrepreneurs would never
get started.
It's also a secret to humanity's success that there's a real vibrancy to our ability to engage in risk-taking that benefits the whole, even though the individual undertaking that risk is probably not going to succeed a lot of the time. That's the bright side of overconfidence. It allows us to jump in with both feet into an endeavor
that may be very unlikely to succeed.
It essentially gives us the confidence
or the overconfidence to take bold actions.
Absolutely.
How many people have gone to fix their broken toilet instead of calling the plumber, opened up the thing, and realized they have no clue what they're doing? So they were overconfident. But you know what?
They might go and look it up and figure it out. And now they've gained some valuable
piece of knowledge for the future. And think about all the entrepreneurs who went to start
companies and failed. You know, entrepreneurs talk all the time about how their failures
were much more meaningful in their life than their success because they learned something really important that helped
them to succeed the next time.
So I think that double-edged nature of overconfidence is really important to understand.
Interesting.
That would, for example, enable President Kennedy to say, we're going to land a man on the moon within 10 years.
That's right.
I don't know if Kennedy himself was overconfident
when he made that statement.
He might have thought that this was nearly impossible,
but that it would promote a lot of innovation.
But the fact that people are willing to jump in and take him
up on that moonshot, that's one of the amazing things
about human beings, that we're able to do things that we set our mind to, even if they're against the odds.
Can you talk more about the drawbacks, how the illusion of understanding or knowledge
can lead to war or nuclear accidents or partisan gridlock?
Absolutely. I think the idea of political gridlock is a good one. That's something that has been getting worse and worse in the United States for the last 20 to 30 years, and maybe even longer. We used to have a lot more people in the middle.
And why is that? I think it's because a lot of these issues that we talk about are demagogued.
They're oversimplified. And because you're a member of a different community where people are advocating for different things, you take on the beliefs of your community, and then you feel like you understand them by virtue of taking them on. I do think that all kinds of negative
outcomes in society have to do with extremism in general. And I've done a lot of work looking
at the roots of extreme beliefs. So for instance,
why do people have beliefs that are counter to the scientific consensus? And those really
extreme beliefs can often reflect the largest amount of overconfidence about how much one
understands the issue. And you know, extremism is usually not a good thing for social harmony, because extremism
often does reflect sort of an oversimplification of a complex issue.
So those are some examples.
When you think about more at the individual level, we talked before about risk taking
being a good thing, but risk taking can also be a bad thing.
You know, if you feel like you understand the crypto marketplace so well
that you know exactly what is going to happen, what do you do? Well, you'll maybe not just
buy crypto, but you might actually buy crypto with leverage, meaning you borrow money to
buy crypto. And then when prices go down, not only does your investment go down, but you might lose your entire investment because you get what's called a margin call.
So there tends to be this slippery slope with people taking on more and more financial risk
when they feel that they understand things. And I think it's a very common thing for people to
learn that lesson. You know, you touch the stove and you get burned, and maybe you learn your lesson and are more careful in the future, but you might have really harmed yourself in the process.
And an effect that to me is really interesting is the Dunning-Kruger effect.
Can you explain briefly what that is?
The Dunning-Kruger effect comes from some really influential research that is closely related to the kinds of things I've been talking about today and the work that I've done. The main finding is that
if you look at a domain, the people who have the least
expertise in the domain, often will overestimate their
capabilities in that domain the most. So if for instance, you
ask people how funny they think they are, and
then you ask them to make jokes and have someone else evaluate how funny those jokes are, then
you can get an objective measure of how funny they are. And you look at the difference between
those two things. It turns out that the people who are the least funny overestimate their
ability to generate humor the most. They call it unskilled and unaware.
So the people who are most unskilled in a domain are the most unaware and the least
able to accurately evaluate their own capabilities.
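The humor study described above can be illustrated with a toy calculation: compare each person's self-rating against an objective measure and look at the gap across skill levels. The numbers below are entirely invented, chosen only to mimic the "unskilled and unaware" pattern.

```python
# A toy illustration of the "unskilled and unaware" pattern described above.
# Objective scores and self-ratings (both 0-100) are invented for illustration.

participants = [
    # (objective_score, self_rating)
    (10, 60), (20, 65),   # least skilled: large overestimates
    (45, 55), (50, 60),   # middle of the pack
    (85, 80), (90, 85),   # most skilled: slight underestimates
]

# Miscalibration: how far self-rating exceeds objective performance.
miscalibration = [self_r - obj for obj, self_r in participants]

# Sort by objective skill and compare miscalibration at the extremes.
by_skill = sorted(zip(participants, miscalibration))
least_skilled_gap = by_skill[0][1]
most_skilled_gap = by_skill[-1][1]

print(least_skilled_gap)  # the weakest performer overestimates the most
print(most_skilled_gap)   # the strongest performer slightly underestimates
```

In this made-up data, the gap between self-rating and performance shrinks (and even flips sign) as objective skill rises, which is the shape of the Dunning-Kruger result.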
So interesting. So the unskilled just don't know what they don't know.
I think that's exactly the right way to think about the Dunning-Kruger effect and in fact
the way that Dunning and Kruger think about their own effect.
As we gain more knowledge in a domain, we appreciate how much goes into it, how complicated
it is, how much there is to know and all that kind of stuff.
And that tends to moderate our ability to be overconfident to some extent.
When I first started playing guitar 25 years ago, I played for a few months and I thought
I was really good. And now 25 years later, when I'm actually pretty good, I think I suck. And it's
because I know how much there is to know. And I've seen all the people who are better than I am and
all of the learning that I have to do. And so I think that's a very natural human thing
is that when we start off in some domain we tend to be more overconfident.
And is it fair to say that according to Dunning our overrating of our skills
matters because all of us are unskilled in most domains of our lives?
Yeah, absolutely. You know, some people are kind of habitually less overconfident, tend
to question more whether they actually understand something, how good they are at things and
stuff like that. But for most of us, we tend to be overconfident most of the time and in
most domains. And that is because in general, you can't be expert in that many things. The
world's just too complicated.
And so the fact that the world is so complicated, with so many different domains of skill and knowledge, means that most of us are going to tend to be overconfident about most things most of the time. And that's a pretty profound observation, I think.
How does our ignorance shape our lives?
It can shape it in one of two ways.
One is it's going to shape our lives in ways of making us overconfident about our understanding
of things and bring along all of the stuff that we've been talking about, both good and
bad.
Another way that ignorance could shape our lives is that we could go the other direction.
Say your listeners now start thinking, every time they go to the toilet, oh my gosh, I don't know how this works. And they become diffident. That could shape your life in a
very different way, not necessarily in a good way. Ignorance can also shape our lives in leading us
to feel like the world is overwhelmingly and overly complex. I think the best way that ignorance
can shape our lives is if we sort of embrace it and we become curious.
That's the way I like to think about it, which is that the world is incredibly complex.
And there's so much to learn and know. That provides an opportunity to be curious and to try to learn why your neighbor thinks differently about something, or why my friend loves doing X when it seems so dumb to me. You know, maybe I should go check it out.
That's sort of the productive way to think about ignorance, that it's a natural thing
and not something that we should be scared of or something that we should be embarrassed
about but it's something that we should embrace.
It's so interesting to me that ignorance shapes our lives in ways that we do not realize, and that people tend to do what they know and fail to do the things they have no conception of, and in that way ignorance profoundly shapes our lives. Do people fail to reach their potential because they're not aware of the possible?
I think that's right. It's a big and exciting world. And I think most people end their lives
regretting not the things that they do, but the things that they didn't do. And so I think
opening our minds to the world being more complex and that there's more to learn than we naturally
appreciate is a really
great way to grow as human beings. I love that idea of opening our minds to the
world. Phil, what are the three takeaways you'd like to leave the audience with
today? The first one is that it's okay not to know everything. The world is
endlessly complex and that's okay.
Human beings are successful by virtue of the fact that as individuals, we don't know that
much but as groups, we can know a lot.
The second takeaway is that we should try to explain more than we advocate.
Usually when we engage in dialogue, we advocate.
That is, we say, here's
what I believe and this is why, and this is why you're wrong. But it can be much more
mind-opening to engage in dialogue where instead we try to explain. We ask questions like, how does this work? What would happen if we did X, Y, or Z? That type of dialogue can be much more productive.
The third takeaway is that it can be good to practice intellectual humility. Intellectual humility is a midpoint between being overconfident about how well we understand things and being diffident, feeling like we can't
possibly understand anything and we can't take a position on
anything. We need to get in the habit of finding a middle ground
where the strength of our belief in an issue or a position
is grounded in how well we understand the issue.
That can lead us to both not being overconfident
when it can be really dangerous,
but also being open-minded
and pursuing new opportunities when they arise.
Phil, this has been great. I very much enjoyed your book, The Knowledge Illusion. Thank you.
Thanks for your time.
If you're enjoying the podcast, and I really hope you are, please review us on Apple Podcasts or Spotify or wherever you get your podcasts.
It really helps get the word out. If you're interested,
you can also sign up for the 3 Takeaways newsletter at 3takeaways.com
where you can also listen to previous episodes. You can also follow us on
LinkedIn, X, Instagram and Facebook. I'm Lynn Toman and this is Three
Takeaways. Thanks for listening.