Making Sense with Sam Harris - #261 — Belief & Identity
Episode Date: September 30, 2021
Sam Harris speaks with Jonas Kaplan about the neuroscience of belief change. They discuss the illusory truth effect, the backfire effect, failures of replication, “The Fireplace Delusion,” the connection between reason and emotion, wishful thinking, persuasion and the sense of self, conspiracy theories, the power of incentives, in-group loyalty, religion, mindfulness, cognitive flexibility, and other topics. If the Making Sense podcast logo in your player is BLACK, you can SUBSCRIBE to gain access to all full-length episodes at samharris.org/subscribe. Learning how to train your mind is the single greatest investment you can make in life. That’s why Sam Harris created the Waking Up app. From rational mindfulness practice to lessons on some of life’s most important topics, join Sam as he demystifies the practice of meditation and explores the theory behind it.
Transcript
To access full episodes of the Making Sense podcast, you'll need to subscribe at samharris.org. There you'll find our private RSS feed
to add to your favorite podcatcher,
along with other subscriber-only content.
We don't run ads on the podcast,
and therefore it's made possible entirely
through the support of our subscribers.
So if you enjoy what we're doing here,
please consider becoming one.
Okay, well, today I'm presenting a conversation on belief and identity, and in particular it's focused on the problem of belief change and resistance to belief change, the significance of which, both personally and collectively,
is really hard to exaggerate. We really are in the belief formation, maintenance, and
occasionally belief change business. And when you look at what it takes to get millions of us
and billions of us to cooperate with one another, it really is just
a matter of persuading one another to change our representations of the world and converge on
common projects. And failing that, we resort to forcing one another to converge, and that eventually becomes a bloody mess. So belief change and its
impediments is incredibly important to understand, and this is just a first volley in that effort.
And to do this, I've enlisted my friend and collaborator Jonas Kaplan, who is a cognitive
neuroscientist working at USC, where he is an associate professor at the Brain and Creativity Institute.
This institute was founded by Antonio Damasio, who has also been on the podcast.
And Jonas has focused on issues related to consciousness, identity, empathy, and social relationships.
He uses functional neuroimaging, mostly fMRI, combined with
machine learning to examine the neural mechanisms that underlie our sense of self.
He's also done research on how the brain processes stories and beliefs and values.
And in this conversation, we focus on some work that we did jointly with fMRI on the nature of belief and belief change resistance.
And that work was published in 2016, along with our co-author Sarah Gimbel,
as Neural Correlates of Maintaining One's Political Beliefs in the Face of Counterevidence.
And that was in Nature's Scientific Reports.
Fun fact, I just looked at this paper for the first time in years.
Again, it was published back in 2016.
And it's on nature.com.
I just wanted to see the metrics around its engagement, and it's been accessed 250,000 times, cited 71 times, cross-referenced 75 times,
looks like. But in measures of online attention, it's ranked 27th
out of the 400,000 tracked articles of similar age in all journals. And it is ranked first among the 5,000-plus tracked articles of similar age in
the Scientific Reports section of Nature. So amazingly, there appear to only be 26 articles
in all of science of similar age
that have received more online engagement than this article.
And I say that not to boast,
but to point out how bizarre and ineffectual and balkanized so much of our science is.
I mean, from what I can tell, the biggest engagement in the media that this article got was from Dr. Oz,
who I consider a near-total charlatan, in the New York Observer. It was in various blogs.
I can't remember what press coverage it got beyond any of that. But anyway, I don't believe
I've ever met anyone who's read this paper. And yet, according to Nature's website, there are only 26 scientific papers on the planet that have received more engagement than this article.
Make of that what you will. I can just say that in my world, the experience of publishing this article was of simply dropping something into the void.
But that notwithstanding, we will talk about some of the implications of
this research. And this conversation is part of a larger series I've done with Jonas for the
Waking Up app. There's a section there titled Mind and Brain, which is essentially its own
podcast series. We've covered the science of mindfulness, social emotion, disgust, empathy, islands of awareness,
touching on some of the work that Anil Seth, another podcast guest, has done on isolated consciousness in the brain.
And we have forthcoming conversations on gratitude, the predictive brain, sleep and dreams, the default mode network, willpower,
and there's much more to come. That's a track where we're trying to make insights from neuroscience
as personally relevant to one's day-to-day experience as we can. Anyway, as I said,
this episode focuses on belief and belief change and the way in which identity poses an obstacle to the latter.
We cover things like the illusory truth effect, the backfire effect, failures of replication in this area.
We talk about an essay I once wrote titled The Fireplace Delusion. We discuss the connection between reason and emotion, wishful thinking, persuasion and the sense of self, conspiracy theories, the power
of incentives, in-group loyalty, religion, mindfulness, cognitive flexibility, and other
topics. And again, I think both of us consider this just a first installment on what really
should be a series on belief change and its enemies. And now I bring you Jonas Kaplan.
I'm here with my friend Jonas Kaplan. Jonas, thanks for joining me.
Thanks for having me, Sam.
So, you and I now go way back, as Father Time is meting out blows year by year. I've known you
for at least a decade, a decade plus, but perhaps summarize your background as a neuroscientist and
the kinds of issues you focus on now. Sure. I am a cognitive neuroscientist, and I use mainly neuroimaging techniques to study
how the brain works. Maybe I can list off my litany of academic titles. It's like a medieval
court. It'll give you an idea. So I'm a research professor at the Brain and Creativity Institute
at USC. I'm the co-director of the Dornsife Neuroimaging Center.
And I'm also the associate director for mindfulness and neuroimaging at USC's Center for Mindfulness
Science.
Nice.
So that gives you some idea of what my research interests are.
But I've studied a lot of different things ranging from belief and values and empathy
and how we resonate with other people and a whole bunch of other
things that interest me. So now we're talking about something that's really in our wheelhouse
because we have done some neuroimaging studies together on this very topic. We're talking about
belief and belief change and resistance to belief change. Why is this an important topic?
It's such an important topic. You know,
belief flexibility is just essential to everything we do as a society in so many different ways.
We just need to be able to influence each other when we have conversations. The whole point of having a conversation is to get some information across. Especially in a democracy, it's really
important that we're able to influence each other on the basis of conversation, because if we can't, the only
option available to us is some kind of violence, right? So to be able to change as new evidence
comes in, to have fruitful conversations, to advance science and education, all of these things require
some amount of flexibility in our beliefs. It's particularly prominent for me as a scientist. I mean,
the very basis of science is this assumption that as we gather new evidence,
we can update our models of the world and our beliefs. And so if we have difficulty doing that,
or if there's things in our psychology that make it hard for us to do that, we need to know about
them. Yeah. So we're recording at a moment where these concerns are especially salient,
because we're in the middle of the COVID pandemic and just awash in misinformation about more or
less everything. I mean, there's political partisanship of a sort that I don't think
we've ever seen in our lifetime. There are conspiracy theories on almost every topic of social importance. People have
balkanized into these echo chambers online. The public health messaging during this pandemic has
been almost impossible to get across because every shred of reasonable skepticism on any point gets amplified into just a complete breakdown of epistemology where we
think we know nothing for certain about anything of consequence. And so people can't even agree.
As you get anywhere toward the edge of mainstream opinion, you find otherwise intelligent people who can't even agree that the, in this case, the pandemic
is real in some basic sense. Almost every aspect of this can fall under doubt, and then it becomes
almost impossible to have a conversation about what's real. Just trying to converge on a set of facts that all parties can acknowledge
becomes an impossible task when people start out sufficiently far apart and they're being
emotionally hijacked once any of these conversations get started, and they're mistaking
their emotional reaction for further evidence of the truth of their beliefs.
And you can become sensitive to this in yourself. You know, you have certain things you believe are
true, and then you bump into counter-arguments or counter-evidence or just, you know, people who are
espousing an alternate view of reality. And it sort of depends what we're talking about here,
but in the generic case, you're either attached to these beliefs because you think they're true,
or you're perversely attached to them because you want them to be true. But in any case,
you meet in yourself an unwillingness to reconsider the matter, and an almost visceral
feeling of revulsion or contempt for people
who would push too hard on a door you're trying to keep barred. And really the only place
where we are at all disciplined and good about getting out of our own way and revising
our beliefs is in science. I mean, that really is what makes science science. It's a
methodology for being increasingly sure that you're not fooling yourself. And granted, we're
imperfect here, and there's a history of scientific fraud and scientific ineptitude, but obviously the
remedy for that is always more science and better science. It's not some alternate mode of wish
fulfillment or merely imagining what's true. This conversation, I'm sure, will be evergreen.
And if you come back in five years, this will still be relevant to think about. But at the moment,
its relevance is fairly excruciating. It's hard to believe. I mean, when you and I first started
working on this issue in neuroscience,
maybe 15 years ago or so, it was certainly not an issue that was on the forefront of everyone's
minds. And now it seems like it's what everybody wants to talk about. And I think one thing you
said there is really important, this aspect of trying to recognize this process in ourselves,
because a lot of times when I talk to people, the biggest
question is, how can I get my aunt or uncle to believe me, to understand what I'm saying?
And how can I influence someone else? What's the key to persuasion? And that is one way of looking
at the problem. But I think it's actually potentially more fruitful to think about it
the other way around. So instead of asking how we can be better persuaders, we can ask how to
make ourselves more open-minded and more available to evidence as it comes in, once we recognize the
reasons why we're not in the first place, which is what we're going to talk about today.
Okay, so let's talk about what we know about changing beliefs. Obviously, we learn things about the world, albeit slowly and sometimes begrudgingly,
and that is synonymous with belief formation in the sense that we're using the term. I guess
perhaps one thing we should clarify here is that in its colloquial use, people often distinguish belief from knowledge.
In our usage here, and really throughout most of the relevant fields, certainly within philosophy,
and I think within cognitive science generally, that's not really the point of separation. I mean,
you can believe things with greater or lesser conviction.
It's really like, it's a probability distribution of knowledge we're talking about. There are things
that you are absolutely certain of, you would bet your life on. There are things that you think are
very likely to be true. You still count them as knowledge, but, you know, until you hear otherwise,
you'll think this is probably the way things are, but you wouldn't bet everything on it. And then there are gradations below that where you think the
preponderance of evidence and argument is pointed in one direction. You're certainly weighted that
way, but you don't really think you have a complete picture of that part of reality in hand. And all of this is a matter of belief
to one or another degree, in the sense that we're using the term.
That's right. We're not going to distinguish there. We're going to treat belief as basically
anything you hold true about the world. And, you know, you mentioned the process of how we
form beliefs. I think that's an interesting topic, how we gain knowledge and how we develop our kind of initial models of the world. And we're not going to get too deeply into
that. I think we're going to sort of bypass that issue and just start from the point at which we
have formed some belief, we've accepted some piece of information as true. And then what happens?
Let's say we encounter a new piece of information that contradicts the old one. How do we revise
our beliefs? Because this really is the biggest challenge that we face. And there are a couple
of effects from cognitive psychology that are relevant to this that we're going to talk about.
The first one is called the continued influence effect. The idea here is that even after correcting
a belief that was formed on the basis of misinformation, we still show evidence of that initial wrong belief
affecting the way we think. So there's a classic experimental paradigm, which was developed in the
1990s, where you give people a fictional story about something like a warehouse fire, and you
tell them there was this big fire in the warehouse that was started in a closet, and there were paint
cans and oil left in the closet, and that's probably why this fire got out of control. And then for half the subjects,
you correct one piece of information, and for the other half, you don't. So for half the subjects,
you might additionally tell them, well, a police report came in later, and it turned out there
wasn't any oil and gas in the closet. And then you interview these people and you ask them about the fire.
You ask them to explain why the fire happened and to give you some details.
And even when the information about the oil and gas in the closet was corrected, they've
been told there wasn't anything in that closet.
People still explain the fire in terms of things like, well, you know, oil fires are
harder to put out or the negligence of the company
leaving those dangerous things in the closet.
You can still show that the belief persists.
They weren't able to go in there with an eraser and just erase it or delete it and all of
the subsequent thoughts that they had about it.
And this is a fairly sinister bug in our software, and I'm not sure what the remedy for it is in the end. I mean, the initial false information gets ramified in people's memory.
And I noticed this in myself. Actually, I'm going to ping you as a naive subject on this point and see if you have a similar contamination of memory. Do you remember the McMartin Preschool saga? This was part of the kind of satanic ritual
abuse panic craze that happened in the 80s. Yes. And there was the McMartin Preschool,
which was the most famous instance of this alleged abuse. Do you remember that case at all?
Vaguely. I think that there were some parents who basically got in some kind of hysteria about satanic abuse of the children. Right. Now, what do you think
the net result of that case was? Was there actual abuse at a preschool or like, what's your memory?
Right. My revised narrative is that there was no abuse and that there was no actual satanic
cult involved.
Okay, good. Well, you were better than I was. I hadn't thought about this in years, and then I had a podcast on some related topic maybe two years ago. I forget who I was talking
to. And I went to look this up, expecting that there was some fire where there was all this smoke, but I just didn't remember the details.
And it turns out this is just the ultimate example of belief persistence in my case,
because this had been fully debunked. I mean, this trial went on for years. I mean, one of the teachers spent five years in prison and then finally got acquitted.
All charges were dropped. Hundreds of kids were interviewed with techniques that are now like
textbook errors in how not to interview children about alleged abuse. They created a psychological
experiment seemingly designed to produce false memories and false confessions and just sheer confabulation. And this whole thing
exploded, but it had been lodged in my memory as, God, there was probably something really
heinous that happened over there at the McMartin Preschool. I'm so glad those people were brought
to justice. But this is an awful piece of our code where we have a truth bias. And it seems like this may be based on a kind of default setting, where merely understanding anything propositional may include some tacit acceptance of it.
And actually the philosopher Spinoza conjectured about this
back in the 17th century.
And there have been several studies that have supported this.
And actually our own
studies of belief with fMRI supported this based on our behavioral measures, in that we saw that
people were faster to accept propositions as true than they were to reject them as false.
And this is true even of propositions that are equivalently simple. So, you know, if I give you a set of equations, you know, 2 plus 2 equals 4,
2 plus 3 equals 4, you know, one is true, one is false, they're equally simple, and yet you will
respond true to the true one, on average, faster than you will respond
false to the false one. And that seems to suggest that
our default setting is to accept it as true and that rejecting it as false is a further
cognitive judgment that takes time to render. That's interesting. Yeah, that's definitely one
of the features. I think it is easier to accept the statements that were given as true. You hit
upon one of the other cognitive bugs at play, which is this repetition effect in memory, where just hearing something
multiple times, you know, the more times we hear it, the more likely we are to accept it as true.
And this is particularly sinister in the case of misinformation correction, because
the correction itself often involves a restatement of the false belief. So if you say, you know,
there wasn't paint in the closet, the idea of paint in the closet has to be invoked in order
to understand that sentence. And so the correction can serve as another repetition and make it more
difficult to delete that. The other factor here is that when we accept something as true,
you know, we don't stop thinking about it. It's not like we just have this one sentence that exists on its own, separate from all other ideas in our mind that there was paint or oil in that closet. We start thinking about all of the ramifications, the consequences, the other things that follow from that belief, and we start to build our mental models upon these foundations that we have.
And so it's like pulling out one piece of code when there's all these other pieces of code that have already followed from it. Now, there's an alleged further iteration of this, which
seems even more dysfunctional. Although I think there's some question as to whether or not this
is replicated. But this meme spread widely in the culture. It'll be ironic if we have to debunk it
and find that we can't because the putative effect is invoked. But there's something called
the backfire effect that many people now think they know something about. What is this,
and what do we think we know about it? Yeah. So let's see if we can do our own
little experiment with the continued influence effect if we describe this effect first and then
try to debunk it. So there's a classic study from Brendan Nyhan and Jason Reifler back in 2010,
where they presented people with a little fictional news story
about the Iraq War. And so this is 2005 or so that the experiment was done, and the Iraq War
was fresh in people's minds. And remember that from that war, there was this whole issue about
the Bush administration used the presence of weapons of mass destruction in Saddam's
stockpile as a justification for the attack. So this little news story contained a
quote from President Bush where he made comments alluding to the dangers posed by Saddam Hussein
having these weapons. And this is the information that they attempted to correct. So some subjects
were given an additional corrective piece of information, which was actually a true piece
of information, that there was this extensive report, the Duelfer Report, which conclusively established that there were basically
no weapons of mass destruction, at least not in any quantity that could have made a difference.
And half the subjects weren't given that correction. And they were asked afterwards,
you know, how strongly they agree with a statement that there were indeed weapons of
mass destruction in Iraq. And in the conservative subjects, who came into this with a pre-existing bias,
and there's evidence that conservatives at the time believed that the weapons
of mass destruction were there, when they received this corrective information, their belief in the
weapons of mass destruction actually got stronger. So not only were they unable to correct the
misinformation, but the act of correcting made the belief stronger. And that's why it's called the
backfire effect. This is a total backfire. You're trying to make the belief weaker,
and instead you make it stronger. Right. So, and I forget when this happened,
a couple of years back, this might have been born of a New Yorker article on the topic, but this suddenly became very prominent in the
culture for people to talk about, think about, worry about the backfire effect. What efforts
have been made to replicate this? So there have been many efforts to replicate this,
and it has been difficult to replicate. There was a study a couple of years ago by Thomas Wood and
Ethan Porter, which pretty much eviscerated the
backfire effect. They performed a really large study, 10,000 subjects, 52 different political
issues that they gave them corrective information about, and they were not able to find any evidence
of backfire effect across any of these 52 different political issues. In fact, most of the
people in the study,
something like 85%, did show some significant corrective response to the factual information.
So why is the backfire effect difficult to establish? When does it occur, if ever? These are questions of ongoing research, but there's probably a lot of context that matters here. You know, it could be easier to give up on one particular fact than it is to give up on some underlying important
issue for you. So, for example, in one of the experiments that you and I did, we gave liberals
arguments against gun control. And these are people who believe that gun control laws are good,
and we gave them information, statistics about how likely people were to get into gun accidents and things like that.
And it might be easy for one of the people in this experiment to change their minds about one
of these individual facts, one of these statistics that we gave them, while still maintaining their
general position on gun control. In fact, it might be easier to retreat on an individual fact than it is on some underlying value. That may be the easiest path for
you to take if you're trying to maintain your core belief about gun control. So there's some
complexity there in terms of the context. It also probably matters what the issue is, right? I mean,
in this Wood and Porter study, they tried to replicate very specifically the weapons of mass destruction experiment, and they were not able to establish a backfire effect there. For other kinds of issues, it's very difficult.
Those issues that people tend to be most resistant on are the ones where they have some motivation to maintain their belief. And that motivation can be a social motivation. These are some of the
most common motivations we have now. Beliefs connect us to other people and beliefs that
we share with our social group, and particularly those beliefs that help
to form our social identity, our sense of who we are in a group, are very, very resistant to change.
And they may be more likely to show a backfire effect. In the end, I think the focus on the
backfire effect is a bit of a red herring. It doesn't really matter that much whether
corrections backfire or not. The real important question is, why do the corrections
not work at all, right? If they're not correcting, if they're not softening our belief, it doesn't
really matter that much that they made the belief a little bit stronger. What we really want is to
be able to correct our beliefs. Yeah, I mean, one of the things we found in our neuroimaging study
on belief change was that the signal in the amygdala and the insula, both regions that report
emotional salience above anything else, especially the amygdala, but also the insula,
predicted people's resistance to changing their beliefs under pressure. So there's the
feeling component of it, and also those cases where there's a kind of a direct line or direct justification for the feeling of emotional charge based on one's beliefs about oneself in the world and one's identity. That's really the framework that I think we would expect would produce this resistance
to belief change. Because there's the not liking how certain facts sound piece,
but then there's the really not liking it when you sort of do the emotional math,
however implicitly, and realize that if you're the sort of person who accepts this new argument or this
new set of facts and changes this specific belief, well, then you're no longer the sort of person who
can have the friends you have, be in the political party you're in, talk to your family at dinner.
Many things begin to come under pressure depending on just how fundamental
or cherished the belief is that is now on the table to be revised. I think the punchline for
any one of us to just be better people in the world is to notice when this machinery is getting invoked. You can feel it happen. You can feel when
you're disposed to take the way certain ideas make you feel as a thoroughgoing analysis of their
truth, right? If you're kind of doing epistemology by fear and anger and disgust and some primary emotions that are getting triggered by specific
ideas. The place where I've experimented personally with this is on the topic of burning wood in a
fireplace. I wrote a piece called The Fireplace Delusion a few years back. When I stumbled upon this example, quite literally
at a dinner party, you know, I'm somebody who has known for many years that there's nothing magical
about fireplace smoke. It's, I mean, the fact that we have, you know, we feel this deep nostalgia for
it and sentimentality around it. People love the smell. It conveys an idea of Christmas and other happy thoughts to most people.
All of that notwithstanding, if you can smell smoke when you're burning a fire, that is,
from a health point of view, more or less indistinguishable from a diesel engine running
in your living room, right? I mean, you should be no more sentimental about the smell you're
smelling than the fumes you would be smelling in the case of the engine. But I found that whenever
I pushed people on this, it triggered a very familiar quasi-religious pushback in them.
And this was no matter... These people could be scientists, these could be... You could just see
the triggering. I'm feeling one brewing within myself.
Yeah, that's right.
I don't know if you want me to let it out in this context or not, but I do have an argument
against this.
Okay, yeah, let's hear it.
Let's hear your argument.
Well, maybe you've heard this one.
It's going to be spectacular.
Well, the argument is that fire may have played a special role in human evolution.
Yeah.
So I don't know if you're familiar with the work of Polly Wiessner, an anthropologist, and she studied what happens when the bush people of Africa, the hunter-gatherer societies, sit around fires at night.
And she studied the nature of the types of conversations and communications that happen during the day compared to at night. And what happens at night
is that because you don't have the world in front of you, and you basically can't talk about
business and the here and now of perceptual things that are confronting you, the conversation turns
to other times and places, and people start telling stories. And there's this whole sort of
storytelling culture around the fire that comes out of this. And this may have been something that's been very important for human culture
that we therefore have a nostalgia for. And I think it extends into things like
watching movies in theaters, where we all sit around a flickering light and watch things together.
Yeah, well, that's quite a heartwarming thesis. It's actually something I do discuss in The Fireplace Delusion.
And I mean, I would just point out that, you know, whether something has played a role in evolution is rarely an indication of whether it's normative or optimal now.
Right. I mean, obviously, outgroup violence, you know, tribal violence has...
No, I just think it might explain why we have those feelings about it.
Yeah, yeah. But it offers no indication that breathing in wood smoke is healthy, you know,
or any healthier than smoking cigarettes or, you know, breathing in other forms of air pollution.
And the data on this is just, we just know this to be true. And as a matter of
public health, I believe there's nothing that kills more people globally every year than
dirty sources of fuel in the home. I mean, we're talking about literally millions of
people who die every year, largely in the developing world, because they use wood and other kinds of fuel.
I got to say that fact sounds like one of the ones we made up for our experiment.
We should remember. Yeah, that was hilarious. We were making up facts for the experiment and
perhaps did people lasting damage if we couldn't correct those facts, if we only ramified those facts in correcting
them afterwards. But no, I'll have to get the data on how many people die, but it is enormous
because much of the world is still using dirty fuel. But it's the air pollution in a city like San Francisco or Los Angeles,
based on just the recreational burning of wood in the winter, you know, it's not even being used as
a fuel source. It's just people are burning wood fires in their fireplaces just for the fun of it.
There's no question that that increases emergency room visits based on pulmonary and cardiac events. And this is all
stuff that has been studied and bemoaned by public health people. But we have this sentimental
attachment to burning wood, and people are reluctant to get over it. Yeah, the role of
feelings and emotions in this whole process of belief is really interesting.
And there's multiple aspects of it.
I mean, on the one hand, you're right that if we rely too much on our feelings, we can
be led astray.
And just because we feel something is true, for example, or good for us, like the fire,
doesn't mean that it is.
And certainly in our experiments, we saw the involvement of negative emotions. When you're challenged, you can have this feeling that it feels bad. You want to get
away from the source of the challenge. And in fact, one of the most effective self-protective
mechanisms we have against changing our beliefs is to completely avoid being challenged. And we're
very good at avoiding information that challenges our beliefs and avoiding putting ourselves in situations where we might have to encounter something that we don't
like to hear. So there are these negative emotions that can underlie our decisions about what to
believe and about what evidence to even look at. And there's some evidence that these feelings
might mediate the whole process of belief change. On the other hand, there are other feelings. Just
because something is a feeling or an emotion doesn't mean that it's necessarily part of an irrational
process. We have to recognize that emotions are there because they have conferred some advantage
throughout the course of history, so they're at least potentially helpful. And there are feelings
that are more subtle that are involved in this process, like the feeling of certainty or the feeling of uncertainty. Those are not purely cognitive experiences.
They have some kind of a feeling component that can help increase the saliency of those thought
processes for us. Well, yeah. As your own colleague, boss, mentor, Antonio Damasio has demonstrated,
this classical split between reason and emotion doesn't make any sense.
I mean, just neurologically speaking, and when people have specific injuries to the orbital frontal cortex
and thereafter can't feel the implications of knowledge they otherwise seem to have,
they can't make that knowledge behaviorally relevant and operative.
There's a kind of classic gambling task where people seem to know the right strategy
but continually bet unwisely because they can't make the right strategy guide their behavior. So it is an
interesting problem. Feeling states are part of our cognitive apparatus. The feeling of certainty
and the flip side, the feeling of doubt, they're not dispensable, and yet they can also become
uncoupled from the legitimate modes of thought that should deliver certainty and doubt.
In some ways, they're orthogonal to cognition, and in some ways, they're indispensable for it.
The red flag for me is when you realize that you want reality to be a certain way,
and you're trying to convince yourself that it is that way. I mean,
there's a reason why wishful thinking and obvious bias in the direction one is arguing for,
conflicts of interest, this goes under many framings. There's a reason why all of that
is stigmatized when it comes time to think clearly about what's going on in the world.
That's right. Wanting something is absolute poison to the process of trying to find the truth. And that's why we have all these mechanisms within the scientific method to try to eliminate
the effects of those things. And just to emphasize one of the other things you talked about earlier,
it's really hard to overestimate the effects
of wanting things to remain the same in our social relationships. I mean, the stakes can
be so high for some of these decisions that it's virtually impossible for us to change our minds.
I talked to someone who was a career political analyst and worked in the Bush administration and Reagan administration.
If you'd like to continue listening
to this conversation, you'll need to subscribe
at SamHarris.org.
Once you do, you'll get access to all full-length
episodes of the Making Sense podcast,
along with other subscriber-only content,
including bonus episodes,
AMAs, and the conversations I've been having on the Waking Up app.
The Making Sense podcast is ad-free
and relies entirely on listener support.
And you can subscribe now at SamHarris.org.