Freakonomics Radio - How to Change Your Mind (Ep. 379 Update)
Episode Date: March 17, 2022

There are a lot of barriers to changing your mind: ego, overconfidence, inertia — and cost. Politicians who flip-flop get mocked; family and friends who cross tribal borders are shunned. But shouldn't we be encouraging people to change their minds? And how can we get better at it ourselves?
Transcript
Hey there, it's Stephen Dubner.
The episode you are about to hear was originally published in 2019.
We were inspired to play it again now as a way to think about what's going on in Ukraine.
It is by now well established that when Vladimir Putin chose to invade Ukraine,
after a long buildup and warnings from the U.S. and many others,
he was expecting a quick and simple victory.
Whatever happens next, it's clear that
the invasion has turned out to be more difficult than he expected, which got us to wondering,
has Putin changed his mind about the wisdom of his plan? If he had it to do over, would he choose
a different course? And now that he's facing a potential quagmire, is there a way for him to extricate Russia without losing face?
The following episode is called How to Change Your Mind.
Even though we gave it that chipper title, the reality is that changing your mind, especially in public, can be very costly.
Especially for a politician, and even more so for an autocrat.
What's the point of being an autocrat if you ever have to admit you were wrong?
But this isn't just about autocrats.
For instance, I'd love to know what Barack Obama would think
if he listened back today to this tape cut from a 2012 presidential debate with Mitt Romney.
Governor Romney, I'm glad that you recognize that Al-Qaeda is a threat.
Because a few months ago, when you were asked,
what's the biggest geopolitical threat facing America, you said Russia.
Not Al-Qaeda.
You said Russia.
And the 1980s are now calling to ask for their foreign policy back.
Because, you know, the Cold War has been over for 20 years.
Would Barack Obama, if he were on our show today,
would he say he has since
changed his mind that perhaps Romney had a point about Russia? Four years before that debate,
Obama won the Democratic nomination in part because of another quagmire, the American invasion
of Iraq in 2003. Obama, then an Illinois state senator, had been firmly against that war. His chief
opponent in 2008, Hillary Clinton, had voted for the Iraq War and then later changed her mind.
Primary voters decided they wanted a candidate who got it right the first time.
As you'll hear in today's episode, many of the people who lined up behind that war maintained it was a good idea, even as things went badly sideways.
But you'll hear about one prominent pro-war policy figure who did change his mind.
It wasn't easy. He lost a lot of longtime friends.
I have mixed feelings about replaying this episode now.
Russia's war in Ukraine has been so brutal that maybe it shouldn't be used to make an argument about psychology.
But maybe, we were thinking, if we can get better at changing our minds, maybe someday there might be a bit less war. Here's an interesting fact.
Legislators in several Republican-controlled states are pushing to eliminate the death penalty.
Why is that interesting?
Because most Republicans have typically been in favor of the death penalty. They've said it's a deterrent against the most horrific crimes
and a fitting penalty when such crimes do occur. But a lot of Republicans have come to believe the
death penalty does not deter crime, which happens to be an argument we offered evidence for in
Freakonomics. They also say the lengthy legal appeals on death penalty cases are too costly for taxpayers.
Some Republicans also cite moral concerns with the death penalty.
And so, a lot of them have changed their minds.
We've all changed our minds at some point about something.
Maybe you were a cat person.
You became a dog person.
Maybe you decided the place you lived or the person you loved or the religion you followed,
that they weren't working for you anymore.
But changing your mind is rarely easy.
It's not something you set out to do.
Although if you're like most people, you would very much like other people to change their minds,
to think more like you. Because as you see it,
it's impossible for the world to progress, to improve, unless some people are willing to
change their minds. But like I said, it won't be easy, because changing your mind means admitting
on some level that you used to be wrong. It can be seen as an act of weakness, even heresy.
Today on Freakonomics Radio, how to change minds, or at least try to.
Well, there's no silver bullet.
But can't you just present some compelling facts?
This model where people just take facts and draw conclusions is completely wrong.
The incentives are also tricky.
If they were to change their tune,
everybody would see them as a loser.
This is Freakonomics Radio, the podcast that explores the hidden side of everything
with your host, Stephen Dubner.
Tell me something that you believed to be true for a long time until you found out you were wrong.
The list is endless.
I used to be a very serious pianist, and I was one of those snot-nosed classical ones who was appalled by nightmares of Ethel Merman and trombones blasting in the background and who knows what else.
And then the wonderful person
I married turned out to be a musical theater fanatic. And in fact, my wife is a musical
theater director. So it wasn't just a case of you accommodating out of love and familial
attachment. Your actual preferences changed? Oh, I thrill at the excitement of, like, seeing a bunch of barely remembering their lines
high school students stumbling their way through Music Man.
Or, actually, that's not true. I still loathe Music Man.
But I actually have come to, like, like musicals a whole lot.
She and I have done 19 of them now together.
She's directed.
I've been sort of the rehearsal pianist.
Oh, boy.
You really went, you crossed the border then fully.
Yes, yes.
Who is this guy?
And why should we care that he's changed his mind?
I'm Robert Sapolsky.
I'm a professor of neuroscience at Stanford University, and I'm kind of half neurobiologist, half primatologist. For about 30 years, I've divided my time between your basic rat lab neurons growing in petri dishes and then studying populations of wild baboons in the Serengeti in East Africa.
So, considering that I'm not a neuroscientist, in fact, pretty much as far from it as could be,
I do have a sense that the brain and the mind may be two separate things, but I'd love you to
comment on the relationship between the two. I am completely of the school that mind is entirely the manifestation
of brain. So when there's a change in mind, there's got to be a neurobiological underpinning.
Sapolsky, as he noted earlier, has changed his own mind quite a lot. He started early.
I was raised as an Orthodox Jew in a major neighborhood specializing
in that in Brooklyn. And somewhere when I was about 14, something changed. And that change
probably like involved updating every molecule in my body. And that I sort of realized, this is nonsense. There's no God. There's no
free will. There's no purpose. And I have not been capable of a shred of religiosity or spirituality
ever since. And was there a familial schism then? Oh, I was one of those terribly nerdy, scholarly, passive-aggressive kids where
I never said a word about it to my highly religious and demanding father, and he went
to his grave having no idea.
No kidding. How old were you when he died?
In my 30s.
So had you come home and gone to Yom Kippur with him and faked it, or how did that work?
Yeah. Yeah. And not just for the high holy days.
I'm home for three days visiting, and you know what? He's not going to change.
He doesn't need this sort of headache or heartache at this point.
So whatever.
It just would have been very hurtful to someone of enormous importance to me.
One thing Sapolsky noticed about mind changing is that it's easier when you're younger.
Just noticing the general phenomenon
that we get less open to novelty as we get older.
So he worked up a survey to look at people's preferences
in food, music, and so on.
What you wind up seeing is, basically,
if you're not listening to a certain style of music
by the time you're 28 or so,
95% chance you're never going to.
By age 35, if you're not eating sushi, 95% chance you never will. In other words, these windows
of openness to novelty close. But then as a biologist, the thing that floored me is you take
a lab rat and you look at when in its life it's willing to
try a novel type of food and it's the exact same curve. The equivalent of 10-year-old lab rats
hate broccoli as much as 10-year-old humans do. And late adolescence, early adulthood, there's
this sudden craving for novelty. And that's when primates pick up and leave their home troops and transfer to new ones.
And then by the time you're like a middle-aged adult rat, you're never going to try anything
new for the rest of your life. It's the exact same curve, which fascinated me.
Did it make you say, my goodness, I'm biologically programmed to never want to try any new music,
food experience again, and therefore I'm going to push through
that? Or did you accept your fate? It had no impact on me whatsoever. I'm one of those
like scientist professors types who's capable of like lecturing on a subject and paying no
attention to what I'm saying. Like I've spent my whole life studying about the adverse effects of
stress on your health and your psyche. And I'm, like, the most frazzled, stressed person around. I've gleaned absolutely nothing useful from any of my
life work. There are a lot of reasons why it may be easier to change your mind when you're younger.
It could be that your brain is simply more plastic, something scientists assumed for a long time but are now starting to question. Or it could be that your positions are less entrenched,
so it's less costly to change them. Or it could be that the stakes are lower. The fate of the world does not hinge on whether you are pro-broccoli or anti-broccoli. But as life goes on,
now there's nothing wrong with a little indecision. As the stakes rise. As long as your
job doesn't involve any responsibility. Changing your mind can get more costly. John Kerry has
changed his mind on all these important issues. When Massachusetts Senator John Kerry ran for
president against the incumbent George W. Bush in 2004, Kerry's campaign began to crater after it
was shown that he'd changed his position,
or at least his votes in the Senate, on a number of issues.
If you thought you could trust him, you might want to change your mind too.
So I think, you know, that's the way that politics itself works.
That's Francis Fukuyama. He's a political scientist at Stanford.
My work really centers on research and practice about political institutions.
In 1992, Fukuyama wrote a book that became a sensation.
It was called The End of History and the Last Man.
In the late 1980s, as I was following events in the Soviet Union,
I said, well, to the extent that there's an end of history,
it's going to look like liberal democracy tied to a market economy.
In other words, democracy had essentially won, not just the Cold War, but the future.
And yet, a lot of the recent political momentum is going in the other direction, toward populism and authoritarianism, with a backlash against globalism. So, to what degree do you think
your argument was wrong, or at least premature? How confident are you that what we're seeing now
is just a backlash and not actually a reversal or an entirely new strain?
I am still reasonably confident. You know, the way I've formulated my hypothesis right from the beginning was that you needed to show not just that there was unhappiness with liberal democracy, but you needed to posit some other form of social organization that was superior or that was somehow going to displace liberal democracy in the way that communism, you know, asserted that
it would displace liberal democracy ultimately. And if you look around the world right now,
there are competing systems that are not liberal or democratic. So the Chinese have one,
Saudi Arabia and Iran have their versions of it. But I actually don't think that any of those alternative models
are likely to become, you know, universal in the way that liberal democracy has become in a,
you know, fairly impressive way, the default form of government for very many countries around the world.
So Fukuyama has not changed his mind about his most famous assertion, although he is open to it.
If in 30 years China is bigger than the United States, richer, continues to be stable, continues to be growing faster, then I'd have to say, well, you know, maybe that is the alternative model.
But he did change his mind on something else.
It goes back to that Bush-Kerry era and the Iraq War.
In which direction would John Kerry lead?
Kerry voted for the Iraq War, opposed it, supported it, and now opposes it again.
At the time, Fukuyama was well-established as a prominent political thinker.
In addition to writing a landmark book, he'd done two stints in the State Department.
So his views on the Iraq War were taken seriously.
I signed on to a letter, you know, a couple of years before the war,
saying that the United States ought to take military action.
He wasn't opposed to the U.S. desire to intervene and topple a dictator, in this case Saddam Hussein.
I think that that's happened in the past and it's had good effects.
But as the invasion drew near, Fukuyama did have a concern.
My main concern was whether the United States was ready to actually stay in Iraq and convert it into, you know, a kind of stable, decent country.
And the United States has not had a really great record in doing this in Central America and
Vietnam and so forth. And in the months prior to the war, I began to get increasingly worried that
we weren't prepared to actually stick it out. But even I was astonished at how bad the planning had been and how faulty the assumptions were that we were going to be greeted as liberators and that there would be a rapid transition just like in Eastern Europe to something that looked like democracy. In retrospect, I wish I had taken a much clearer stand against it before the war actually happened.
The U.S. invaded Iraq in March of 2003.
I was at a dinner at the American Enterprise Institute in February of 2004.
The AEI is a conservative think tank in D.C.
Dick Cheney was the featured speaker, and everybody in the room was cheering,
like this was the biggest success for American foreign policy that, you know, they could imagine. And I just looked around at the people at my table and I said, you know, why are these people clapping? Because clearly this thing is turning into a huge fiasco. And that's the moment that I decided, you know, these people are really nuts. I mean, they're so invested in seeing this as a success that they can't see this reality that's just growing right in front of their eyes.
And to this day, I mean, it does seem strange to me that a lot of the people that were strong supporters of the war, even today, are not willing to admit that that was a mistake.
The investment that you're describing, how would you characterize it?
Was it more personal, do you think, or more political? Was the thinking more emotional than logical, using logic to find facts that supported the underlying argument?
Like this model where people just take facts and draw conclusions from them,
and then base their opinions on that is completely wrong. I mean, that's just not the way people
think. They start out with an emotional commitment to a certain idea, and then they use their
formidable cognitive powers to organize facts to support what they want to believe anyhow. So the
partisan affiliation
comes first, and then the reasoning process by which you justify it comes second. And unfortunately,
I think it affects all of us. We tend to see the world and cherry-pick facts that support our
version of the world, and it takes a really big external shock that just clearly proves you wrong.
So I understand that even though you were seen as having defected from or abandoned
the neoconservative movement, primarily over the Iraq war, that you were not met so warmly
by the left where you moved to.
You said in 2006, I've gotten many emails that said, in effect, well,
you're trying to apologize, but you've got blood on your hands. We don't accept your apology.
Yeah, it's interesting. You know, you're seeing a similar process with a lot of other neocons
right now. The neocons as a group have been the core of the Never Trump conservative movement,
all of whom had been big supporters of the Iraq War and of George W. Bush, have really turned against Trump
in a big way.
And there are a lot of people that are not willing to accept them.
They say, you know, it's too late.
Exactly those words, you've got blood on your hands.
And you know, I think that that is an unduly rigid position because in that case, no one should ever change their mind.
They should never be hit on the head with reality
and then realize that, you know,
they've got a different position that they should take.
When we talk about changing your mind,
we need to acknowledge that every situation is, of course, different.
Let's say someone in your family holds a position that you find odious.
Why do you find it odious?
Maybe you think they're ignoring the facts.
But can't people hold different positions based on the same facts?
Maybe you feel their position lacks moral reasoning.
But who said morality is one size fits all? Or maybe,
just maybe, they hold the opposite position simply because it is the opposite.
Suppose a person has some idea about something which doesn't correspond to reality.
It may be that they derive pleasure from having this idea in itself.
That's Julia Shvets. She's an economist at Christ's College, Cambridge.
I study people's decisions empirically in order to understand better what drives people.
And in the case of someone deriving pleasure from an idea that you disagree with...
In that case, you have to ask yourself whether it's actually to their benefit for them to be changing their mind.
This idea that we can be so invested in our beliefs even if we suspect they are wrong,
Shvets has found evidence of this in her research.
The incorrect vision of the world may actually deliver some benefit to them.
And she's found this effect not just in models or lab studies,
but out in the real world where people are constantly making decisions about their work, their families, their lives.
It seems to be a very important question whether the beliefs we hold about the outside world
are somehow connected to these beliefs about ourselves. When there is a
link between these beliefs, it's not so clear that we should be changing our minds and what
are the costs and benefits of this. Consider, for instance, an expert who's dedicated their career
to a certain policy or line of thinking. What happens in the face of new information?
Do you seriously reconsider your long-held position
and go against the tide you've been swimming in?
A lot of times you just, you know, you just feel uncomfortable
if you say things that disrupt a consensus and you just don't want to do it.
Francis Fukuyama is recalling his change of mind on the Iraq war.
A lot of my friends were very, very heavily on the other side, and I lost a lot of them.
I haven't spoken to several of these friends since then.
There are two separate questions: whether the person should change their mind and what the effects are for him, and then what the effects are for other people.
There's another factor that Julia Shvets sees as contributing to our reluctance to change our mind.
Confidence, or more accurately, overconfidence.
Our own belief that we are right even in the absence of evidence.
Just how much unearned confidence is floating around out there?
Consider a recent study by Shvets and some colleagues
that surveyed over 200 managers at a British restaurant chain.
They averaged more than two years on the job,
and their compensation was strongly tied to a performance bonus.
I mention the bonus because it's related to the survey that Shvets administered.
The managers were asked to recall their past performance and to predict their future performance. Presumably, they should have had a pretty good grasp of their standing.
What we found is that only about 35% of managers were accurate about the quintile of the performance distribution they were falling into.
In other words, barely a third of them were able to correctly say whether they fell in the top 20% of all managers, or the bottom 20%, or another 20% block somewhere in the middle.
And 47% of managers were overconfident about it. And these were people who had detailed feedback about their performance every quarter, which is a lot more than most employees get.
So the next question we asked is, how is it possible that people remain so overconfident when they have so much information?
This is where memory comes into play.
Or maybe you'd call it optimism or delusion.
People who did worse in the previous competition
tended to remember slightly better outcomes.
People seem to be exaggerating their own past performance in their head
when this performance is bad.
So what we conclude from this is that people use memory selectively.
They remember good outcomes and they tend to forget bad ones.
So maybe it's not so much that people refuse to change their minds or refuse to update their priors, as economists like to say,
maybe they just have self-enhancing selective memories.
The data we observe are consistent with them making a choice to suppress some past information.
But there's also the possibility that people who've been at something for a while,
who may consider themselves expert, that they simply don't believe
that non-experts have information that's worth paying attention to.
So I was in the State Department in the policy planning staff in 1989.
Francis Fukuyama again.
And in May of 1989, after there had been this turmoil in Hungary and Poland,
I drafted a memo to my boss, Dennis Ross, who is the director of the
office that sent it on to Jim Baker, who is the secretary of state, saying we ought to start
thinking about German unification because it didn't make sense to me that you could have all
this turmoil right around East Germany and East Germany not being affected. The German experts in
the State Department went ballistic at this.
You know, they said, this is never going to happen.
And this was said at the end of October.
The Berlin Wall fell on November 9th.
And so I think that the people that were the closest to this situation,
you know, so I was not a German expert at all,
but it just seemed to me logical, you know.
But I think it's true that if you are an
expert, you really do have a big investment in seeing the world in a certain way. Whereas if
you're an amateur like me, you can kind of say whatever you think.
As you can see, there are a lot of reasons why a given person might be reluctant to change their mind about a given thing. Ego, selective memory, overconfidence, the cost of losing family or friends.
But let's say that you remain committed to changing minds, your own or someone else's.
How do you get that done?
The secret may lie not in a grand theoretical framework, but in small mundane objects.
Toilets and zippers and ballpoint pens.
We'll get into that right after this.
Think of something you have a really strong opinion about.
Maybe the best ways to address climate change,
the perils of income inequality,
how to balance privacy and security.
Now think about why you have such a strong opinion.
How well do you think you could explain your position?
If you're forced to give an explanation, you have to really understand.
And you have to confront the fact that you might not understand.
Whereas when you give reasons, then you do what people do around the Thanksgiving dinner table.
They talk about their feelings about it, what they like, what they don't like.
That's Stephen Sloman.
I'm a professor of cognitive, linguistic, and psychological sciences at Brown University.
And that means, in a nutshell, that you try to understand what?
I try to understand how people think.
Easy question first.
How do you get someone to change their mind?
Well, first of all, there's no silver bullet.
It's really hard. But if you're going to try, the first thing you should do is try to get them to
change their own minds. And you do that by simply asking them to assume your perspective and explain why you might be right.
If you can get people to step outside themselves
and think about the issue,
not even necessarily from your perspective,
but from an objective perspective,
from one that is detached from their own interests,
people learn a lot.
So given how hard it is for people
to assume other people's perspectives,
you can see why I started my answer by saying it's very hard.
One experiment Sloman has done is asking people to explain,
not reason, as he pointed out,
but to actually explain at the nuts and bolts level how something works.
People don't really like to engage in the kind of mechanistic analysis required
for a causal explanation. That's true not only for big thorny issues like climate change or
income inequality, but even for things like toilets and zippers and ballpoint pens.
Unless you are a plumber or you make zippers or ballpoint pens,
you probably can't explain these very well,
even though before you were asked the question,
you would have thought you could.
This gap between what you know
and what you think you know
is called, naturally,
the illusion of explanatory depth.
So the illusion of explanatory depth
was first demonstrated by a
couple of psychologists named Rozenblit and Keil, and they asked people how well they understood how
these things worked. And people gave a number between one and seven. And then they said,
okay, how does it work? Explain in as much detail as you can how it works. And people struggled and struggled and
realized they couldn't. And so when they were again asked how well they understood,
their judgments tended to be lower. In other words, people themselves admitted that they had
been living in this illusion that they understood how these things worked, when in fact they don't.
Where does this illusion come from?
We think the source of the illusion is that
people fail to distinguish what they know from what others know. We're constantly depending on
other people. And the actual processing that goes on is distributed among people in our community.
In other words, someone knows how a toilet works, the plumber.
And you know the plumber.
Or even if you don't know the plumber, you know how to find a plumber.
It's as if the sense of understanding is contagious, right?
When other people understand, you feel like you understand.
You can see how the illusion of explanatory depth could be helpful in some scenarios.
You don't need to know everything for yourself
as long as you know someone
who knows someone who knows something.
But you could also imagine scenarios
in which the illusion could be problematic.
So we've shown that that's also true
in the political domain.
Sloman and his collaborator, Philip Fernbach,
basically repeated the Rozenblit and Keil
experiment.
But instead of toilets and zippers, they asked people about climate change and gun control.
We gave people political policies.
We said, how well do you understand them?
And please explain them.
Unsurprisingly, most people were not able to explain climate change policies in much
detail.
But here's what's interesting.
The level of confidence in their understanding of issues,
which participants were asked to report at the start of the experiment,
was drastically reduced after they tried and failed to demonstrate their understanding.
In other words, asking people to explain depolarized the group.
Now, was this a case of simply slowing down and thinking the issue through?
Could it be that we're often inflexible in our thinking simply because we come to conclusions too quickly?
Apparently not.
If instead of saying, explain how the policy works, if what we said to them was,
give us all the reasons you have for your view on
this policy, then we didn't get that effect at all, right? That didn't reduce people's sense
of understanding. It didn't reduce their hubris. The ability to change your mind,
would you say that's really important as a human?
I see the mind as something that's shared with other people.
I think the mind is actually something that exists within a community and not within a skull.
And so when you're changing your mind, you know, you're doing one of two things.
You're either dissociating yourself from your community, and that's really hard and not necessarily good for you, or you have to change the mind of the
entire community. And is that important? Well, the closer we are to truth, the more likely we are to
succeed as individuals, as a species, but it's hard.
Do you think that most of us hold the beliefs that we do because the people around us hold those beliefs,
or do you think we're more likely to assemble people around us
based on the beliefs that they and we hold?
I think the former is more often true.
That is, we believe what we do because the people around us believe what they do.
This is the way humanity evolved. We depend on other people. And it's not simply a matter of
getting us to think more independently. I actually think that this is one of the major
problems with the kinds of solutions people are talking about today for our current
political problems. I don't think the solution is give people the information they need.
More information can be good if it's very well filtered and curated, but that's not easy to do in an unbiased way.
It's Matthew Jackson, an economist at Stanford.
Yes, I realize this episode is leaning heavily on Stanford professors.
Anyway, Matthew Jackson studies social and economic networks.
So in particular, how the structure of social interactions affects people's behaviors.
Anything from how our opinions form to whether we decide to vote for a certain candidate.
Here's something Jackson has changed his mind about.
One thing I used to think was that people, if you gave them the same kinds of information, they would make decisions the same way. They might have different experiences in their past, different influences, but somehow the fundamental ways in
which they'd think about things and process things is the same. That, however, is not what the data
say. The more you look at data, and in particular, the more you look at experiments where people are faced with facts or information.
You realize that some people are very single-minded.
In one experiment, Jackson also asked people about climate change.
He had everyone read the same batch of abstracts from scientific articles.
We asked people their opinions before they went into the study,
and you could see that people looking at exactly the same article
would interpret it very differently depending on what their initial position was.
So again, information isn't necessarily the solution.
In fact, information can be weaponized.
There was a group of about a quarter to a third of the subjects who actually became more polarized,
who interpreted the information heavily in the direction of their priors
and actually ended up with more extreme positions after the experiment than before.
We've talked about this phenomenon before on the show,
that well-educated people who consume a lot of information tend to hold disproportionately extreme views.
Apparently because they're really good at seeking out information that confirms their priors and ignoring information that might run counter to them.
One reason people can see exactly the same information and come away with different conclusions is how we interpret and store information in our brains.
It's very easy to sort of snippet things
into small little pieces that we can remember.
Oh, this was for or against.
We don't like breaking things down in detail.
We just kind of, most of us,
like to have a superficial understanding.
Stephen Sloman again.
Why do you think Obamacare is good or bad, whatever you think about it? Now, the fact is,
most people have very little to say about that, right? Most people just have a couple of slogans,
right? They have the Republican slogan, they have the Democratic slogan, but they don't actually
know about Obamacare because, after all, it's a 20,000-page document.
I like to say even Obama doesn't understand Obamacare.
But even if Obama does understand Obamacare, there's a question of whether his understanding
is unduly circumscribed by the people around him.
People tend to associate with other people who are very similar to themselves.
So we end up talking to people most of the time who have very similar past experiences and similar views of the world.
And we tend to underestimate that.
People don't realize how isolated their world is.
You know, people wake up after an election and are quite surprised that anybody could have elected
a candidate that has a different view than them.
So one antidote to inflexible thinking is simply balance.
In worlds where our network is well-balanced
and we're actually eventually incorporating everybody's viewpoint,
this system works extremely well.
Unfortunately, a great many of us are quite bad
at creating diverse, well-balanced networks.
And there's a reason for this,
a reason we struggle to listen to opposing voices
and, therefore, have a hard time changing our minds.
We are basically hardwired to divide the world into us and thems,
and to not like the thems a whole lot.
That, again, is the half-neurobiologist, half-primatologist Robert Sapolsky,
who's changed his own mind many times.
The domain that I'm most interested in these days is that change thing of turning
thems into usses and how do we do that. And what the studies tend to show is take somebody else's
perspective, try to go through what somebody else's rationalizations are, individuate somebody, break them out of being an automatic them, and think about,
do they like the same pets that you do? Do they love their kids? Look at a picture of them singing
lullabies to their children. Look at a picture of them enjoying the same like food that you do.
Contact. And this has been floating around for decades as a theory: give usses and thems enough
contact with each other and they turn into usses. And it turns out contact works under very
specialized circumstances. You got to spend a bunch of time with thems, and usses and thems need
to be in equal numbers and in a neutral setting. And you got to have a shared sort of goal. I mean, all of these work to at least some degree.
The peoples we hated in the past are allies now.
There are outgroups that spent centuries being persecuted
where we don't even know what the word refers to anymore.
And in all those cases,
there's something resembling biological pathways
that help them stop being so objectionable.
So before this conversation, if you had asked me, what are the primary barriers that keep someone
in a given situation from changing their mind, I would have certainly opted for the social
and economic explanations. But it sounds as though you're
saying a larger share would go to the physiological and biological reasons. Is that right?
Well, the really irritating thing I would say is that the two are one and the same.
We are nothing more or less than the sum of our biology. Every time you learn something,
from something profound to something idiotic,
something changes in your brain. Every time you have a sensory experience,
your brain is constantly rewiring in major ways.
This idea that the brain continues to change physiologically throughout our lives,
this is yet another idea that Sapolsky himself had to change his mind about.
Yep, this is an aspect of my field where I have missed the boat every step of the way.
When I started off, this dogma had been in place for like a thousand years' worth of intro to neuroscience classes,
which is that the adult brain doesn't make new neurons.
This is the basic premise of all the miserable, untreatable neurological diseases out there.
And starting in the 60s, there was one lone prophet named Joe Altman,
whose career was basically ruined because he was about 30 years ahead of the curve.
And then late 80s, early 90s, some technique got a lot more sensitive and was able to show
adult neurogenesis in the brain like crazy. And it became the hottest subject in the field.
And I kept saying, nah, that's not a real phenomenon. So I was like, really blew it
on that one. It turns out that there's a little pocket, a little population of stem cells sitting
in the hippocampus making new neurons. And what was even better was it made them at all the logical
times in response to learning, stimulation, exercise, and a ton of work showed
that these new neurons actually are useful and they're critical for new types of learning.
So that ushered in this whole new world and then this beautiful new edifice of revisionism came
potentially crashing down about a year ago. An extremely, I think, important
and well-done paper that wound up in the journal Nature showed that despite the clear presence of
tons of neurogenesis in rodent brains throughout the lifetime, and some in monkey brains, there was a lot
of reason to think that not a lot of the same occurred in the human brain.
And that a lot of the prior evidence for it was pretty circumstantial.
And as you might expect, the specialists in the field have been stabbing each other over this one ever since.
And it's not clear what the resolution is.
It doesn't get much more meta than that. A bunch of scientists
changing their minds and trying to change others' minds about whether the brain changes when we
change our minds. Robert Sapolsky's own research about us's and them's led to one more change of
mind for Sapolsky. I would say the biggest thing that came out of that is, I am in every fiber of my soul a profound pessimist. And after sitting and obsessing for three, four years on what we know about the biological roots of humans being rotten to each other and humans being kind to each other,
there's actually a fair amount of room for optimism.
So your belief was that humans are disproportionately cruel to each other?
That was the old belief, and the new belief is that that is not necessarily the case?
It's, well, we're pretty lousy to each other, but the basic paradox of humans is simultaneously we are the most miserably violent species on this planet, and we are the most
cooperative. We do stuff which from the standards of evolution of cooperation, game theory, all of
that would make, you know, stickleback fish just flabbergasted at how cooperative, how altruistic we are, how often we could do that for strangers.
Each one of us, depending on the context, can be awful, can be wonderful, or ambiguously somewhere in between.
That, again, was an update of Episode 379, How to Change Your Mind.
If you want to hear more from the fascinating Robert Sapolsky, he was a guest on Steve Levitt's podcast, People I Mostly Admire.
That episode was called I Don't Think We Have Any Free Will Whatsoever.
By the way, Barack Obama, if you are listening and if you want to come on this show to talk about that Cold War line in the 2012 debate with Romney, we will make some room for you in our busy schedule.
Meanwhile, coming up next time on Freakonomics Radio, who is afraid of the big bad wolf?
Not economists.
My view of the wolf is that they have an important ecological role, and they're also incredibly impactful on the economy.
Impactful on the economy how?
I think it's a great, great question to think about.
And don't even get us started on Bambi.
That is a stupid inaccuracy.
The economists who are crying wolf.
It's next time on the show.
Until then, take care of yourself,
and if you can, someone else too. Freakonomics Radio is produced by Stitcher and Renbud Radio.
We can be reached at radio at Freakonomics.com. This episode is produced by Matt Hickey. Our
staff also includes Alison Craiglow, Greg Rippin, Gabriel Roth, Zack Lapinski, Ryan
Kelley, Mary Diduch, Rebecca Lee Douglas, Morgan Levey, Julie Kanfer, Emma Tyrrell, Jasmin Klinger,
Eleanor Osborne, Lyric Bowditch, Jacob Clemente, and Alina Kulman. Our theme song is Mr. Fortune
by the Hitchhikers. All the other music was composed by Luis Guerra. You can get the entire
archive of Freakonomics Radio on any podcast app.
If you would like to read a transcript or the show notes,
go to Freakonomics.com. As always, thank you for listening.
By age 21, if you haven't gotten the tongue stud,
95% chance you're never going to do anything that bizarre.