The Peter Attia Drive - #130 - Carol Tavris, Ph.D. & Elliot Aronson, Ph.D.: Recognizing and overcoming cognitive dissonance
Episode Date: September 28, 2020. Renowned social psychologists Carol Tavris and Elliot Aronson are the co-authors of Mistakes Were Made (But Not By Me), a book which explores the science of cognitive biases and discusses how the human brain is wired for self-justification. In this episode, Carol and Elliot discuss how our desire to reconcile mental conflicts adversely affects many aspects of society. The two give real-world examples to demonstrate the pitfalls in attempts to reduce mental conflict, or dissonance. The examples reveal that no one is immune to dissonance reduction behavior, how intellectual honesty can be trained, and lastly, how to think critically in order to avoid engaging in harmful dissonant behaviors. We discuss: Carol and Elliot’s respective background, collaboration history, and their decision to write Mistakes Were Made (But Not By Me) [4:00]; The theory of cognitive dissonance, and real examples of dissonance reduction in action [11:15]; How Elliot advanced the theory of cognitive dissonance [23:00]; The evolutionary reason for dissonance reduction, and cultural differences in what causes cognitive dissonance [30:30]; The great danger of smart, powerful people engaging in dissonance reduction [35:15]; Two case studies of cognitive dissonance in criminal justice [39:30]; The McMartin preschool case study—The danger in making judgements before knowing all the information [43:30]; How ideology distorts science and public opinion [56:30]; How time distorts memories [58:30]; The downside of certainty [1:05:30]; Are we all doomed to cognitive dissonance?—How two people with similar beliefs can diverge [1:09:00]; Cognitive dissonance in the police force [1:21:00]; A toolkit for overcoming cognitive dissonance [1:27:30]; Importance of separating identity from beliefs, thinking critically, and the difficulty posed by political polarity [1:30:30]; How to impart the lessons from their work into future generations [1:48:00]; and More. Learn more: https://peterattiamd.com/ Show notes page for this episode: https://peterattiamd.com/caroltavris-elliotaronson/ Subscribe to receive exclusive subscriber-only content: https://peterattiamd.com/subscribe/ Sign up to receive Peter's email newsletter: https://peterattiamd.com/newsletter/ Connect with Peter on Facebook | Twitter | Instagram.
Transcript
Hey everyone, welcome to the Drive Podcast.
I'm your host, Peter Atia.
This podcast, my website, and my weekly newsletter all focus on the goal of translating
the science of longevity into something accessible for everyone.
Our goal is to provide the best content in health and wellness, full stop, and we've assembled a great team of analysts to make this happen.
If you enjoy this podcast, we've created a membership program that brings you far more
in-depth content if you want to take your knowledge of this space to the next level.
At the end of this episode, I'll explain what those benefits are, or if you want to learn
more now, head over to peterattiamd.com/subscribe.
Now without further delay, here's today's episode.
My guests this week are Carol Tavris and Elliot Aronson.
Carol's name may sound familiar to some of you because she was actually a guest on the podcast
back in early 2017, along with Avrum Bluming, when she and Avrum were on to talk about hormone replacement therapy.
In this podcast, with Elliott, we talk about something very different, which is a book that they co-authored in 2007.
Mistakes Were Made (But Not by Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts.
Now, if you've listened to this podcast much, you've probably heard me talk about this book at least once.
It's certainly one of my favorite books and one of the books I recommend most to other people.
It's important to understand the background of Carol and Elliot to understand how they came together to do this.
So, Carol received her PhD in Social Psychology from the University of Michigan.
She's a fellow of the Association for Psychological Sciences.
She's received numerous awards for her efforts to promote gender equality, science, and skepticism.
And that's something that Carol and I bonded over early. Elliot received his PhD from Stanford
in the late 1950s, and it was during his time at Stanford in the 1950s that he trained
with the great Leon Festinger, the father of the
theory of cognitive dissonance. He went on to teach at Harvard University, the University of
Minnesota, University of Texas, and UC Santa Cruz, and eventually, back at Stanford.
Elliott is the only psychologist to have won the American Psychological Association's
highest awards in all three major academic
categories for Distinguished Service and Writing in 1973, Distinguished Teaching in 1980,
and Distinguished Research in 1999.
And in 2002, he was listed among the 100 most eminent psychologists of the 20th century.
In this episode, we talk about how Carol and Elliott began to work together on this project,
what prompted them to ultimately write the book.
And we talk a little bit about how cognitive dissonance shows up in many aspects of our lives,
not just in science, but also in politics, in criminal justice, how it has shown up historically,
and perhaps most importantly, how we can train ourselves to not be victims of some
of the worst aspects of cognitive dissonance and dissonant behavior and instead how we can use our
understanding of why our minds naturally try to reduce the pain of cognitive dissonance to actually
hack it a little bit and try to instead be as intellectually honest as possible.
By the end of this episode, you'll understand that there's really no one who is immune
from this, whether you're a doctor, a lawyer, a DA, a mental health worker, a scientist.
We all suffer from the pain of cognitive dissonance.
And really, the question is, what do we do with that discomfort?
Are we able to sit in it or do we succumb to it? I found this to be a fascinating discussion
and I hope you do as well. So without further delay, please enjoy my conversation with Carol
Tavris and Elliot Aronson.
So Carol and Elliot, thank you so much for making the time to do this and the back story to
this is pretty funny so if you'll both bear with me along with the listeners I want to
explain it.
I read Mistakes Were Made (But Not by Me) for the first of many times, circa 2012, maybe 2013.
It was love at first sight and the first thing I did was Google you guys,
and somehow I came across Carol's phone number, or maybe it was an email, but I think it was
actually a phone number.
And I called, and I actually got a hold of Carol, and I don't know if you thought I was a psycho, Carol, you probably did, but you were too kind. But I somehow cajoled you into coming down to San Diego for dinner.
Which you did and we had this amazing evening of talking about cognitive dissonance and it was the beginning of a beautiful friendship
And then two years ago, when I started this podcast, you were one of the first people I reached out to, and I said, Carol, can I please interview you?
And you said, Peter, I'm sorry.
I just can't talk about it anymore.
I've said all I can say about this subject matter.
Can I?
You did.
You just said, I just don't have it in me to talk about this anymore.
Oh, no, yeah. And you said, but please don't hold it against me. I said,
I would never hold it against you. Carol, if you ever change your mind, let me know. And so here we are.
And then the second point that's noteworthy is Elliot. This is the first time you and I have
got to meet. So Carol has always said to me, you're the brains behind the science, sort of the foundation upon which this field that we're going to dig into very deeply sits, and it's informed the work
you two have collaborated on. And so I think I want to kind of start a little bit
with with your relationship your collaborative relationship which has been how
many years have you guys been collaborating? Oh boy, do we go back to Psychology Today, when I was a baby?
I was a baby.
I was a baby and you were a toddler.
What?
It's been close to 50 years, right?
Is that right?
I guess.
'72 or something like that.
I think we met at an American Psychological Association
Convention.
Carol was working as an editor of a magazine called Psychology
Today, which used to be a good magazine when she was on it.
And I had just won some prize for writing a book, and the senior editor had asked Carol to interview me in the hopes that I could do an article for Psychology Today based on the book I wrote, which was called The Social Animal.
One of the great social psychology textbooks ever written and deservedly famous and you
had gotten the APA's, I think, the Distinguished Writing Award. So I was dispatched
to try to get you to write an article for the magazine. And we both shared a love of, not
just a love of social psychology, but a love of communicating social psychology to the public
in what can only be called English as opposed to jargon.
And we even made a movie together, a documentary film
and we've been friends ever since.
And this particular project came about partly
because I was losing my eyesight.
And Carol became my eyes, my ears, and my brain.
And we collaborated on this very, it was really great fun, great fun to write
this book. It was an interesting harmony I think for the two of us as well because Elliot, as I'm
sure we will discuss, took Leon Festinger's theory of cognitive dissonance and made it into
a focus on self-justification. And we were sitting around talking about the Iraq war and how it came to be, and why, even though there
were no weapons of mass destruction, George Bush was holding on to his determination to
believe it was absolutely the right thing to do. And Elliot said, you know, I think that George Bush
was not lying to the American people.
I think he was doing what all of us do,
which is make a decision and then justify it
by cherry picking the evidence to show
that we were right in making that decision.
And from that conversation, we thought,
you know, this is really an important message
for people to hear how in so many domains of our lives, the way we think can really get
us stuck and hard to get out of the mistakes we've made.
One little correction, Carol, you said cherry picking, which is true, but that implies consciously cherry picking.
The point that I was trying to make is that the cognitive dissonance reduction is an unconscious
process.
People don't say, hey, I think I'm going to reduce a little dissonance right now, they just do it.
And it flies just below the level of awareness. So that when George Bush was convinced there were
weapons of mass destruction, he convinced himself of that, even though the evidence was ambiguous.
There was some evidence that indicated that Saddam Hussein did have weapons of mass destruction
and some evidence indicated that he didn't, and he simply was hell bent on invading
Iraq so that he downplayed the importance of the evidence that would have cautioned him not to invade.
And I think that is something we all do if we're not careful.
And there are certainly some fields in which I think the implication of, I have done something incorrectly, and new information emerges that suggests I've done it incorrectly, becomes very hard to swallow.
Probably nowhere near as difficult to swallow
as if you're the commander in chief,
and you've been single-handedly responsible for what followed,
what I guess would have been March 19th, 2003, that invasion.
But I think what gripped me and what got to
me so much, even from the first reading of your book was as a doctor, you think about how many times
in medicine we do things. And then new evidence emerges that maybe that wasn't the right thing to do.
And sometimes it's not immediately obvious, right? But it's subtle. It's like, well,
we used to nutritionally tell people
that this thing was the right thing to eat.
But more and more evidence seems to suggest it's not,
but you've been telling everybody that thing.
So what is the implication?
But before we get into these examples,
maybe let's give people some real explanations of dissonance.
So I think you guys, either it was in your book or I've seen this elsewhere, but there's
a great example of the dissonance, it's like a person who knows that smoking is bad for
them, but still smokes.
How does that person get through the day?
What is that?
How do you describe that tension that must exist in that person?
That's our most famous example, Elliot. Take it away.
I have to say that this was Leon Festinger's example.
I was a student, I was very lucky, I was a student of Leon Festinger just as he was inventing
the theory of cognitive dissonance. I was his graduate student and his major-domo, and became his protégé and friend, so that I kind of inherited cognitive dissonance
theory. And the example he uses is a person who smokes two packs of cigarettes a day and then the evidence starts
becoming clearer and clearer that smoking can cause cancer and other diseases.
And what does he do with that? Well, those two cognitions, I am a smart, sensible
person and I'm smoking cigarettes even though I know it causes cancer.
Well the simplest sounding thing to do is to give up smoking.
But it's a lot harder to do than people might think because it is addictive.
If a person tries to give up smoking and can't,
then how does he reduce that dissonance?
And dissonance is a negative drive state.
It feels terribly unpleasant,
like being extremely hungry or extremely thirsty,
but it takes place in the mind.
So it makes you very uncomfortable.
And if you can't reduce the dissonance by giving up smoking, then you work on the other cognition
that smoking causes cancer.
And you might try to convince yourself that it's really mostly correlational evidence,
and therefore not really definitive.
No one has done a controlled experiment with hundreds of thousands of people, forcing some to smoke and forcing others not to smoke, which would be, of course, an unethical experiment, but in the mind of the person, that experiment would need to be done before I'd be convinced. Or you could convince yourself that obesity is a health
risk and by smoking two packs a day, I'm keeping myself from eating all of those rich desserts,
which would have made me obese and I probably would die of a heart attack. Or it's debonair to fly in the face of danger and smoke a cigarette like Humphrey Bogart in the movies; I'm a really exciting person.
I would rather live a shorter but more interesting life than one where I was forever being cautious, all of these things, each one of them and all
of them together can be used together as a way of allowing me to smoke and still feel
good about myself.
It lets us sleep at night to use your point, Peter, as well.
The ability to reduce dissonance is what allows us to say, I'm doing something stupid, but
look here are all the reasons that I justify it.
There was a study not long ago of pregnant women smoking
and pregnant women have a double knowledge of smoking
being bad not only for their own health,
but for the baby's health.
And what did these women say in explaining why they were going to keep smoking?
Well, I've cut down.
Now the amount of cigarettes I smoke every day
isn't really as hazardous as it would have been
before I cut down.
So this is indeed how we sleep at night.
And as Elliot has often said, that's
the benefit of our ability to reduce dissonance.
But you know what?
Sometimes some sleepless nights are called for, especially
if you're the president of the United States making life and death decisions for millions of
people.
Well, that's what's interesting, right? You've discussed this as an incredibly negatively valenced emotion, right? I mean, I love the way you compare it to the incredible
discomfort of starvation or thirst, but it's psychological discomfort of that variety.
Because I know so little about psychology, it's amazing to me to understand the timeline of
these things. Now, you show up at Stanford in 1955 as a grad student. 55 is about when Leon got there. It's only two years later that this theory is put forth.
In the grand scheme of things,
that seems relatively recent in my mind.
I'm not saying that to discredit that,
but or the field.
But what you're describing sounds so fundamental
to the way we as humans live,
that it strikes me as such a major breakthrough. Like, why
didn't this happen a hundred years sooner? Was there some other critical piece of insight
that was necessary that preceded this amazing observation that took place barely 70 years
ago?
I think psychologists for a long time had the notion of rationalization that people
often rationalize their own behavior, which is a kind of a pale version of cognitive dissonance
theory.
And we all knew about that.
And that was it.
Okay, people rationalized.
But the genius of Leon Festinger, first of all, the way he really invented the theory was because he was studying rumor transmission.
And in India, there was a major earthquake, and a lot of people got killed.
And what he learned was rumors spread at the epicenter of the earthquake.
And Leon was studying rumors, so he saw that these rumors were very reasonable rumors.
They spread: don't worry, help is on the way.
People are coming, they're going to rescue us, they're going to bring food, we're starving,
but things are going to be okay. Those were the kinds of rumors that made sense.
They were comforting rumors.
Meantime, there was a city about 15 or 20 miles away
where there wasn't a great deal of damage,
but there was enough shaking and enough damage
and enough people got mildly injured.
That they were really anxious and really scared.
And the rumors that spread in that area was that there was a typhoon coming, that there
was going to be a hurricane, that there was going to be a huge flooding.
And people were really worried about all of that.
And Festinger scratched his head and said, why in the world would people spread rumors that would increase their anxiety? And what he arrived at as a strong possibility
was that the earthquake made them feel extremely anxious. But they had very little to be anxious about because hardly anyone got hurt and hardly any
destruction occurred, so they invented future things that were going to happen and spread rumors
about them in order to justify their anxiety. And that was the beginning of cognitive dissonance theory.
That's one part of it.
The second part of it was that Leon Festinger was a genius
as an experimental psychologist,
so that he immediately thought up three or four
really interesting experiments that went way beyond
what anyone ever conceived of in terms of mere rationalization showing that
cognitive dissonance reduction works in ways that are often counterintuitive.
It isn't the obvious thing that your grandmother thought about and would tell you about.
It happens in ways that are exciting and interesting when you understand the theory and seem
completely off the wall if you don't know the theory.
I'd love to hear an example of one of those experiments or such, yeah. I'll give you the best example that I can think of,
which is one of the early experiments by Leon Festinger
and an undergraduate at Stanford,
who was, he eventually became my graduate student
when I became a professor, a guy named Merrill Carlsmith.
And what they showed was, if you pay someone $20 for telling a lie to another person, he
knows it's a lie.
The lie was about a tedious task, the kind of task that you would be doing if you worked on an assembly line: packing spools, turning a screw half a turn to the right for a couple of hours, which was really tedious.
But he had just come out of the experiment doing that, and his job was to tell the participant
who was waiting to come in next that it was really an interesting experiment.
And Carlsmith gave him $20 to do that.
In another condition, he gave him $1 for doing that.
And what happened was that the students who were given $1 for doing it actually came to
believe that the task was more interesting than the students who were paid $20 for doing
it.
Completely upside down from what would be predicted by the dominant theory of the time, the behaviorist theory of reinforcement, that the more you were paid for something, the more you liked it. What cognitive dissonance theory predicts is that the less you're paid for
it, the more you have to add justifications of your own.
So, if you're paid $20 for telling a simple lie, you can say to yourself,
well, I sold my soul, but $20 is a pretty good price for my soul.
But if you're paid only $1, in effect, you're asking yourself, no, why did I do that for a lousy dollar?
Well, you know, it wasn't such a big lie
because, you know, the task on the surface,
it looks like a boring task,
but it's really a lot more interesting and more intricate
than it really looks on the surface.
And they convinced themselves not that the task was exciting, but it wasn't so bad.
So here's the relevant thing about this, Peter, about what you were saying regarding the origins
of cognitive dissonance, psychological science
when I was a graduate student was almost an oxymoron.
Everybody joked about it.
Psychology, what are you talking about?
It's not a science.
It's an oxymoron. It's not research-based, it's not empirically based. When Elliot was first doing his
work, the dominant paradigms in psychology were psychoanalytic or behaviorist. Those were the two
big schools that were devoted to explaining how human beings operate and how they think and
what motivates them, and both of them were past their prime by the middle of the last century.
What Elliot was doing in terms of cognitive dissonance was, first of all, looking into the black
box of the mind, which behaviorists were ignoring completely, we just have to observe behavior.
It's all a matter of rewards and punishments.
And saying, no, there's something happening in there that affects our behavior most
profoundly.
And it was also a time of questioning the qualitative observation, the non-scientific
observations of the psychoanalytic approach to understanding
behavior, which were lively and popular and wrong.
So comes cognitive dissonance and the cognitive revolution and the world of psychology changed,
the world of psychological science changed. Oh, that's so good. I have to come in because I got an example of the psychoanalytic thing.
I came to Stanford in 1956 and I got my PhD in 59 and then I went to teach at Harvard.
When I arrived at Harvard, there was a guy named Michael Kahn doing an
experiment for his PhD dissertation. He was a really good Freudian
psychologist and he wanted to do a test on one of Freud's notions called
catharsis. Catharsis says that when you feel angry at someone, you need to get it out of your system
by punching a punching bag or even punching the person who made you angry in the nose,
assuming it's someone who's a little smaller than you are, I guess.
So he did an experiment like that, trying to demonstrate Freud's theory of catharsis that acting out on your
aggressive feeling is going to make you feel better and less aggressive. And what
he found was just the reverse. When a person expressed his anger by getting his tormentor into trouble, so that his tormentor actually lost his job
as a result of it.
This was all in an experiment.
That was the scenario that the participant actually believed.
He was costing the person his job.
It actually increased his negative feelings about that guy.
And Michael Kahn was really confused and said, how can this be? Not only did my experiment not prove that catharsis works, just the opposite happened.
How could that possibly be?
And somebody around said, oh, we've got this new guy who just came as an assistant professor, Aronson, I think he might have an answer. And the answer is exactly cognitive dissonance.
If you make me angry and I retaliate in a way that causes you something extreme, more than the simple act that made me angry warranted, I have to justify it somehow.
So the fact that I cost you your job makes me feel dissonant. My God, I really hurt that guy.
Well, he must have really deserved it. He's a terrible person anyway. Look at the awful thing
he did to me and that needed to be punished and
he'll probably find another job anyway, but he's a jerk and he would have done the same thing to me that I did to him, etc. And that really explains the phenomenon of
blaming the victim. That if a person gets hurt and we can't account for it, we try to figure out
maybe he did something that brought that on.
What it allows us to do is say, I am a good kind compassionate smart person.
And if you're telling me I did something that wasn't good kind, compassionate or smart,
I could accept your evidence or I could, I could say that to
hell with your evidence in a way that allows me to continue
thinking of myself as a good, kind, smart person.
That's what Elliot brought to this as the fundamental heart of
the reason that we so often reduce dissonance in a way to
preserve our self-concept. How we see ourselves.
Obviously the seeds of this are sown like I said 60 years ago, but I've seen lectures where people
talk about the impact on the actual brain itself structurally and functionally. So if you look at
fMRI in a person who's placed in a dissonant situation, how you actually see a change functionally.
Do either of you care to comment on how that looks?
I have to preface this by saying one of my favorite studies in all the world shows that if you give a lecture and you wave around some fMRIs, and you give the exact same lecture without the fMRIs, everybody thinks the first version was really scientific.
We just love those brain studies, we do.
Well, yes, you are describing studies by Drew Westen, which basically brought people
to the laboratory and showed what was going on in their brains when they were confronted
with dissonant information about someone from their own political party.
If someone from your opposing political party
behaves like a corrupt idiot or jerk,
that's perfectly consonant for you.
People from that party always behave that way.
If someone from your own party
behaves exactly the same way, well, you know,
it's no big deal, all politicians do this and so forth.
So he brings people into the laboratory
wires up their brains, and basically finds that in a state of dissonance, these brains were not happy, they were just not happy; but give them a chance to restore consonance, and it settles down. Now I want to say about this that I think this kind of
research certainly does, it's certainly very helpful in understanding
what efforts our brains go through so that we can live in a state of consonance between what we believe and how we behave every day.
It's to our evolutionary benefit to hold beliefs that make us feel part of our tribe, our community, our religion,
our group, and so forth.
And so there really is a benefit, an evolutionary benefit, to being able to get rid of dissonance
when it occurs.
But that said, there are enormous psychological and cultural differences in what causes people to feel dissonance.
So, it's not like everybody always will feel dissonance.
If, for example, you're a scientist and you get information that your study didn't turn
out the way you might have liked, well, you might feel a pang, but the scientific approach
would be to say, well, I've learned something.
What can we do next?
Likewise, there have been very interesting studies
comparing what causes dissonance in Japan or the United
States.
In the United States, we feel a state of dissonance.
If we ourselves personally are ashamed or embarrassed or
humiliated by something that we've done, in Japan, people are
more likely to feel the need to reduce dissonance
if they've behaved in a way to harm, hurt, or embarrass other people in their group because they are
typically a more other oriented culture. So while we can appreciate what is universal about
dissonance, let's not make the corresponding error that we all automatically and forever behave exactly the same way.
Let's unpack that a bit. So from a natural selection standpoint, when you go back, even just several hundred years,
let's call it a thousand years, or even go back further to the point where we were mostly functioning in tribes of relatively small numbers, say even going back to hunter-gatherers.
Assuming for a moment that we could find sufficient food
and shelter and take care of the most fundamental
of our needs, what would have been instances
in which we would have experienced dissonance,
and therefore why would this have been a conserved feature
of our evolution to not stay up at night worrying about things
that were tormenting us, that were creating this sort of dissonant or dialectical difference,
and instead allowing us to basically placate our little brains, which weren't little by
that point, of course, they were basically the same sizes they are today, and move forward.
I find this fascinating because it strikes me as a sort
of a very modern problem, like something that only in the
last few thousands of years could this have even been relevant.
But of course, I'm just not thinking of the right examples
perhaps.
It's our position that cognitive dissonance is hardwired.
And the way that happens in terms of evolution is that it has
more survival value in the sense that if you are a hunter back 10,000 years ago or 20, 30, 40,
thousand years ago, and you experienced dissonance because you've done something
that hurt one of your tribesmen, in a way that makes you concerned that he might take retaliation.
And you would lay awake all night worrying about it.
And then the next morning, you get up bright and early
to go hunting, and you get pounced on by a tiger because you didn't sleep at night, because you were so busy worrying about this thing. You're not as vigilant as you normally would have
been. Then chances are your genes are less likely to get into the gene pool. So those who are good at reducing dissonance,
those who are good at saying, ah, wasn't such a bad thing I did, he'll forgive me, I'm
sure. And then you sleep soundly, you get up and you do your hunting and everything is fine.
I think that cognitive dissonance reduction, the ability to reduce dissonance, is hard-wired precisely because it has survival value in that way.
Exactly.
And as human beings evolved and created new technologies, as agriculture emerged, as
economies grew and flourished, people had more beliefs to defend as being the right ones.
If you're living in a little band
where you have your creation myth
of how your people came to be
and you never meet anybody from another band
with a different point of view.
The day that you do, the day that the next tribe arrives
and says, no, no, your God isn't the right God.
Our God is the right God.
Well,
now, what are you going to do with that information? What are you going to do with information
that your way of planting crops is the wrong way and our way is a better way? So, to the
extent, this to me is one of the most interesting things about dissonance, it's to the extent that a belief is really deeply important to us.
That's when we become most tenacious in holding on to it.
It's why, for example, it's not just dumb people who feel the need to reduce dissonance.
The greatest danger comes from smart people who refuse to accept the evidence that they
have done something foolish or stupid or that they were holding onto a belief or a medical
practice, long past its shelf life.
And now you're saying, I, a smart, competent, professional person who knows more about this
subject than anyone in the world and you're telling me, I'm wrong.
The hell with you, see?
Well, that's the scary part, right? That's, I mean, that to me is the part that is,
look, just on a personal level, that's what gripped me with reading about this was, wait a minute,
how many times am I doing this? Because by definition, the person doing it is generally blind to it.
I mean, this is effectively a form of confirmation bias. That's the cherry picking
that you were alluding to that, Elliot, you pointed out is it's a subconscious type of
cherry picking and confirmation bias. We can talk about it all day long and any science
student worth their salt can define it up and down, but it's how often are we doing it?
And the more and more entrenched we get in our fields, the more and more, quote, knowledgeable we become, the more difficult it becomes to walk back from something that you once held dear.
And Peter my favorite example of this is the
prosecuting attorney who worked hard on a case.
Let's say it's a murder case,
and he sends the person to prison.
He gets a conviction, the person goes to prison
and is in prison for 25 years.
And then some DNA evidence shows up that proves, beyond a shadow of a doubt in anyone else's mind, that the convicted person was actually innocent of that crime.
What happens to the typical prosecuting attorney in that situation is he would feel so bad, so incompetent, so awful, learning that he sent a person away for
25 years when he's really innocent, and then he says to himself, that can't be true.
The DNA evidence has to be wrong, and so he keeps him in prison for another 25 years.
And he does that not because he's an evil guy,
but because he thinks of himself as a good guy
and a competent guy as someone who would never
make a terrible mistake like that.
Yeah, it's funny. You mentioned that, Elliot. One of the things I was thinking about when
I read the book, and I made a note to bring it up because I went back and re-skimmed it
for the 57th time, was the Amanda Knox case. Do you remember this American girl? I think
she was from the northwest and she was studying abroad in Italy, had a roommate that it was
this tragic thing,
where the roommate, and I think the roommate's boyfriend, were murdered, and it was pretty
clear on first pass that Amanda had nothing to do with this, and yet this prosecutor in
Italy had a real bee in his bonnet that she was hands down the perpetrator.
And then of course, the DNA evidence emerges, because the killer actually went to the bathroom while he was in the house, was my recollection. They basically get his DNA out of the toilet. Nope, it's this other guy, who they find. The prosecutor keeps moving the goalposts.
Well, maybe she didn't actually kill him
But she was in cahoots with this guy. Though there's no evidence she's ever met this guy before at any moment.
I mean the whole thing got more and more and more ridiculous.
Now, in her case, she's lucky.
Despite the multiple times she was actually convicted in an Italian court,
ultimately it was overturned.
But you watch that story unfold.
Another example, which I've heard you lecture on, Carol, is the Duke lacrosse one, from, God, it's probably 15 years ago now, right?
Well, in that case, so many issues played into this.
Walk us through that story and how that's another great example of a case study in cognitive
dissonance.
The examples in all of these cases are what happens when a district attorney, for political reasons, personal advancement reasons, decides that he or she knows who the guilty person is and then just narrows the focus on getting that person convicted, ignoring any disconfirming, dissonant evidence that would throw that assumption into question. In this case, it was the guys from the lacrosse team who had a stripper at a party at their, I guess, their
fraternity house, right? And she later claimed that she had been raped. Well, this story fit
every, every story that just touches so many buttons of race and women and fraternities
and how terrible fraternity guys are and they're all racist, rapists anyway and so forth.
I don't know something like 80 faculty members at Duke
took out an ad in the school paper
about the toxic masculinity of the lacrosse team
in particular and men and fraternities in general
and so forth. And so they had all backed themselves into a corner
until it turned out that the district
attorney was not giving the exculpatory evidence to the defense.
The district attorney was eventually disbarred.
It was a scandalous case, but he saw in this story, with race and gender and rape and fraternity brothers, a way to really make a name and fame for himself.
That's a particularly egregious example, but it's not uncommon. No, these cases are not rare at all. And my
favorite example, of course, is the central park jogger case where these black
kids, they actually confessed to the crime and were sent to prison. A fellow named Donald Trump took out full page ads
in all of the New York newspapers saying they should be executed,
even though some of them were underage.
And then it turned out that the DNA from the kids never matched the semen that was found on the woman. And then some other person, who was in prison for a similar crime, confessed to it, and sure enough, his DNA evidence did match. And the prosecuting attorney, Linda Fairstein, insists to this day that she was right.
And she refused to overturn the convictions, which had to be vacated by the district attorney who oversaw the case, and the city of New York paid a huge amount of money in compensation to those kids, who spent several years in prison.
Fairstein said just the typical thing that prosecutors say: we always knew there was a sixth man.
Exactly, we always knew there was a sixth man. That's what they say.
Okay, so at the time of the trial we're only prosecuting this one guy because we know that
he was the rapist and the murderer.
Oh, well, it wasn't his semen.
Oh, well, then there was another guy there and our guy was just holding the woman down while
the other guy actually raped her.
The Innocence Project guys called this the unindicted co-ejaculator theory.
You know, it's just, you know, after the fact we can come up with any explanation of why we are
still right even though we're wrong.
Now, there's a delicate balance here, and I think the great story that I know you've talked
about, because it's hard for you, Carol, you talk about it being one of the times it
really challenged you, was the McMartin preschool scandal back in the early 80s.
And there's a great quote I think you
said about Elliott, something to the effect of we sacrificed our skepticism at the altar of outrage.
I love that. And I want to come back to that. But again, let's tell that story because that's
another great example. But what I want to do is tell this story through the lens of how we think about this in the current era, where accusations are coming in greater and greater numbers.
And some of these accusations are going to be true, and some of them are not going to
be true.
And how there's two ends of it, you can be at polar extremes of either of these, which
are probably incorrect, but there has to be a rational way to do it.
So, maybe thinking about that. That's easy for me to say, but if you were a parent whose kid was going there, you could easily see yourself getting sort of spiraled out of control, too.
So there's a part of me that actually has quite an amount of empathy for everybody involved,
and it just overall seems like a really tragic story.
Tell us about that story. Specifically, your own struggle within it.
The first one, this is a very, very important question because here's what happens.
The minute there is a sensational story in the news, somebody is accused of something,
for example.
What does the public generally do?
We jump to a conclusion.
Just as I thought, that son of a bitch really is a bastard, and he did it, and how horrible, and so on. Okay.
Or we say, no, the accuser is lying, the accuser has a checkered history, I don't want to believe what the accuser says and so forth.
We jump to a conclusion.
Now what this and its theory would predict is, the minute we make a decision, believe this person or believe the other person,
we will now make our belief conform to the evidence we're prepared to hear as things go forward.
So this is the danger of jumping early to a decision, because then, as time goes on, that choice, that belief that
we have will harden rather than become more open to disconfirming evidence. We will start
looking for all the reasons we were right to believe the accuser or to not believe the
accuser, and we will ignore and minimize or trivialize any information suggesting that we were wrong. That is why that
first decision is such a crucial one. Because again, the more we put into supporting that initial
decision, the harder it will be to change our mind. People say this is the slippery slope, but the
thing that dissonance theory teaches us is that there's nothing slippery about it. It's our active self-justifications
for the beliefs we have that take us down sometimes what turns out to be a wrong path.
We mentioned in the update to our book, there's a wonderful YouTube video that Sarah Silverman did
that shows the pain of dissonance right there on the YouTube screen, where she talks about
her feelings about Louis CK when he first admitted to having been behaving in these inappropriate ways with women sexually.
What she says in this video is a perfect demonstration of dissonance.
She says, he's my dear, good, wonderful friend. I love him. I think he's
a wonderful father. I adore this man. And what he did is reprehensible. And what he did
was a terrible thing to women. And I want to side with the women whom he offended so deeply. She doesn't resolve it, but she lives with this dissonance, this uncertainty. My case
with McMartin was something else. I was in Los Angeles at the time when the mother and her two
children who were working at this daycare center were suddenly accused
of what turned out to be utterly preposterous assaults on the children in their care.
In spite of the fact that for many, many years they had run this daycare center, nobody
noticed anything amiss.
Parents were walking in and out of the place all the time.
But sanity and common sense generally go out the door when you have a sentence with the words children and sex in it.
Just as you said, somebody comes and tells you a police officer turns up at your door
as the police did in this case and says, Peter, there have been some allegations of child
molestation going on at your child's daycare.
You have any information about this?
What are you going to say?
Oh, this is likely to be a sex panic.
I don't think you will.
And I remember, it was such big news here in Los Angeles.
It was hysterical news.
I knew the prosecutor at the time.
She was convinced that this was really happening.
And there were no researchers at the time doing psychological research on how to
interview children or on how children respond to repeated questions.
Nobody knew very much about anything.
And at the time, there was a developmental psychologist who argued that if you
don't ask children leading questions, did the doctor touch you here on your
private parts? Did this person do this to you? If you don't ask
leading questions, the child won't tell you, for example, that
she's been in a medical exam with a doctor who touched her.
Well, that seemed to justify the tactics the social workers and police were using with these little kids.
And I wrote an op-ed for the Los Angeles Times called,
Do children lie? Not about this.
Your title?
No, but I've had to live with that.
That was the message of the op-ed; the op-ed was about how you really have to
ask children leading questions.
So, of course, I can say that op-ed reflected the psychological science as we knew it at that point. But am I embarrassed by it?
Oh, you bet I am, especially finding myself at a conference
a couple of years later when Steven Ceci,
who became one of the great heroes of the research on this question, reminded everybody: do children lie? Are you kidding? The idea that children never lie could only be said by someone who never was a child or knew a child, for crying out loud. Anyway, Steve used a screenshot of my op-ed as an example of how frankly foolish and stupid even our good social psychologists are. Oh my goodness. Oh, I wouldn't even use the word good social psychologist. It was very embarrassing. Okay, so Peter, to your question, how did I feel? Embarrassed. That's how I felt.
And I was much smarter than you, Carol, on this. Yes, you were, as usual.
I'm only kidding because I felt the same way that you did.
I was far away at the time, so I wasn't getting bombarded with the news.
But I remember picking up a copy of Newsweek magazine and seeing a picture of Mrs. McMartin
sticking her tongue out.
And I thought, oh my God, this molester is mocking the whole seriousness of this situation. But since I didn't know much about it, I kept out
of it. But of course she was sticking her tongue out. Some photographer was
harassing her and she stuck her tongue out at the photographer. And that got
published in Newsweek. And that's an example of what happens.
A person does a normal thing, gets angry at a photographer for harassing her when she's being
falsely accused in the press and in court of having done a heinous crime which she is innocent of.
But if we believe that she's guilty,
everything she does begins to look like bizarre,
dreadful behavior,
because once we have the rubric of,
she is guilty of child molestation,
then we can't see her as an innocent person being angry
at a harassing photographer.
Exactly.
I want to just highlight that with a thought experiment, Elliot, which is you pick a person,
you know, whether it be someone who's accused of a crime or a political figure or something
who you just hold in absolute contempt and imagine them
in 10 different positions, you know, smiling this way, frowning that way, sticking their
tongue out, giving you the finger. There is not one of those 10 in which you can't come
up with a negative narrative. You look at someone that you deem grotesque, no matter how they present themselves. Even if they're standing there with the slightest smile, you would view it as disgusting. How can they not
show more remorse? The slightest smile becomes a smirk.
Yes. A frown becomes an admission of guilt. We are so able to color this lens. It's amazing,
isn't it? I'd like to add what I think is the important take home
on the McMartin case. McMartin was the first of a wave of hysterical cases across the country in which daycare workers from here to Boston, hundreds of them, were accused of this kind of ritual sexual molestation of children in their care.
Hundreds of daycare workers went to prison.
Some of them are still in prison.
And the lesson of McMartin is not simply that I was wrong or that others were wrong. It's this, and this is what's harder: at the time of the first reaction that anybody had to McMartin, we didn't know anything.
The public had no way of knowing that the first allegation was made by a woman who was so
psychotic, so crazy, known to the police, that they stopped even listening to her after
a while. Nobody knew that the police had gone to every parent's home, leading them into a panicky reaction: has your child been sexually molested?
And nobody understood how the children were actually being interviewed by social workers
who were bringing in those anatomically detailed dolls with prominent genitals and testifying
that they knew if a child had been molested based on how the kid was playing with the doll.
Well, as you would have said before, those genitals are pretty interesting.
No little kid is going to ignore the genitals.
But if they did ignore the genitals, it's because they've been sexually abused and traumatized.
And if they play with the genitals, it's because they've been sexually abused and traumatized. It took psychological scientists to do a controlled experiment and ask children known to have been
sexually molested, and children known not to have been sexually molested.
Give them these dolls, see how the kids play with them.
As you can imagine, there was no difference between these two groups.
So the kinds of therapists who were marching into court with no psychological scientific training at all, just their own observations, their hunches, and their biases, could testify with assurance that they knew that this child had been molested.
So this is the kind of research that was done in the aftermath of these daycare cases
that came to transform how we understand children's testimony so that we can help the children who have been molested,
but not destroy the lives of innocent adults either. So, the task for us: embarrassed as I was, with the dissonance that I felt for my own participation in what I wrote about McMartin, I really began atoning for this by writing about what was happening in the other daycare cases across the country, such as the Amirault case in Boston, which was almost a mirror image of McMartin. There's a very subtle difference between always believe the victim versus always be skeptical
and take the accusation seriously, right?
Those are not the same thing, but they're similar.
They can look similar at the outset, but this is a layer of nuance that seems to be missing
from a lot of the discussion today,
isn't it?
Absolutely.
And in science, whenever ideology interferes, it can distort the science, and it certainly
can distort public opinion.
So the ideology now, and it's understandable, because women, for example, have been ignored for a very
long time. So when they talk about sexual harassment, now, in this climate, the idea is always
believe them. Always believe them, because why would a woman come forward if she wasn't telling the truth? But that's an ideological conclusion. What I would substitute for that is: always pay attention, always listen, but keep an open mind and realize
that there are probably at least two sides to every story. So be respectful, listen, but keep some skepticism in mind. Now,
some of the radical feminists are saying, so it doesn't matter. So what if some guy is falsely
accused and falsely loses his job? It's almost as if it serves the people of that gender right because
of all the abuses our gender took in the past. And that's never a reasonable position.
It's a very, very old one, of course, in the era of recovered memory therapy, Peter,
which you've written about at length. This is
one you a lot of friends, Carol. A lot of friends. A lot of friends. One of my favorite
descriptions you ever gave was after you wrote your first lengthy tome on that. Someone
you said you got a lot of how did you describe it? A lot of lovely requests to go and passionately make love to yourself or something like that.
May you go forth into the world and
multiply by yourself, yes.
Oh yeah.
Well, you see, any exhortation to believe,
just to believe, I mean, I remember a cover of
Ms. Magazine many years ago about the alleged
satanic ritual abuse cults, which were supposedly afoot all over the country, and people were
believing that there were satanic ritual abuse cults that were trapping children and women and so
forth. And the cover of Ms. Magazine had one of these satanic images with the cover line, Believe It. Believe it? Really? I'm supposed to believe it?
So the idea that as women were going into psychotherapy
and coming out believing that they had multiple personalities,
not just three or five, but 10, 100, 500,
there was an escalation of the belief
in multiple personality disorder,
which was another hysterical epidemic in our country
fomented by many psychoanalysts and psychiatrists and social workers, until malpractice suits
brought that bubble right down. But to say, I understand that a person in therapy may be having
trouble and may be suffering, but I get to question the explanation that the therapist is giving them. It's not a matter of disrespect, it's a matter of understanding
and bringing our best science to bear on understanding. So I have always been skeptical of the
believe X group, whatever X group is, uncritically. What's the evidence? What's the best explanation for it? So one of the more interesting, to me, ways of understanding many of these very difficult he said, she
said, disagreements and debates, which are so painful and so difficult, and we do understand
how many, many thousands of women have been sexually abused in so many ways, but, but here's the but.
If you are bringing up an allegation that occurred six months ago, six years ago, 35 years ago, we're entitled to talk about what we understand about memory, what we understand about the way in which information
since the original event changes our memory,
or changes our interpretation of the event.
Events that happened to us 25 years ago that at the time we thought were benign,
we can come to see as being malevolent and traumatic and oppressive.
That's all part of how the psychological process works.
And what social psychologists like Deborah Davis
have shown is that a person doesn't have to be lying
to be wrong in making an allegation.
And a man doesn't have to be lying.
He can be self-justifying in responding to an allegation.
This is a different level of understanding that I think is important, especially when charges involve
the possibility of ruining someone's life.
It doesn't fall into the category of there's always a simple answer and there's always a
right or wrong way to see a particular allegation.
This point also about memory: probably about six years ago, or five years ago, it was after I had read your book for the first time, I read the book by Kathryn Schulz, Being Wrong. And I also was very, I mean, moved is probably the wrong word,
but I was struck by how feeble my memory was because I've always, I think up until reading that book,
I had never questioned that the way I remembered something was correct because I have a very good
memory for random stupid things, you know, like I can tell you that on Monday, June 27th, 1988, I did this and it happened this way and that way and this way
and that way and I can be right on some of those things
but I can be surprisingly wrong on other things.
And I think reading that book helped me appreciate
how much I could be wrong about.
And that became really scary to me
because it really made me realize
that I'm quite fallible to my own BS.
I thought I was somehow, I levitated above that.
Like I just thought whatever was in Peter's memory vault
had happened.
And now to know that that's not true is a little scary,
but to your point,
it's probably not true for anyone.
The common idea is that we have a little tape recorder inside our brain, and all we have to do is press the button, and it'll all come out. It's wrong because it's not all in there. A lot of it gets contaminated and things get mixed together that don't belong together. So it doesn't have to be a matter of self-justification or any kind of dissonance reduction. It can be just wrong, just randomly wrong and misremembered in some sort of harmless way, as in Carol's famous memory of that book that she was sure her father read to her.
Oh, I'm handing you the ball.
Oh, that was a lob.
Oh, how did I miss the ball landing right here in my lap?
Oh, well, yes, when we were writing a book, I had a vivid, vivid memory
of my father reading me James Thurber's The Wonderful O.
It's a, by the way, a wonderful book about pirates
who remove the letter O from the alphabet,
from speech and from every object.
You may keep geese, but not a goose and so forth.
And it was a wonderful, wonderful book.
And I remember him reading it to me and our laughing about what names would be like without their o's — Ophelia, Oliver, and so forth.
And then, because I like to read that book every so often just to cheer myself up, I go
and I see its publication date, which was one year after my father died.
It just hit me: oh my God, how could that be? And of course, then that starts you on another trail: well, who did read it to me? And wait a minute, I was a teenager — who's reading me a book when I'm a teenager? You know, all the things that I had made up around that all along — which is why I like to think of dissonance theory as an incredibly helpful mechanism of understanding that is a form of arrogance control and certainty control. And if we can learn to have passionate
beliefs that give our lives color and meaning that we live by,
but to hold them lightly enough so that if the evidence comes along that our favorite diet leader is wrong, we can finally say, you know what, I was wrong on that one.
Yeah, this idea of doubt versus arrogance is amazing,
but society doesn't really reward doubt as much as it
rewards arrogance, right? I mean, isn't that part of the challenge, where we're, I think, culturally a little bit more likely to find somebody believable — doesn't a patient want a doctor, maybe arrogant is too strong a word, but doesn't a patient want a doctor that is much more confident?
Confident. We want certainty, yeah, and certainty can easily morph into arrogance.
I remember once when I was serving as an expert witness in a murder case, I was presenting some psychological data that favored the accused person. I said, all other things being equal, and the prosecuting attorney leaped on that, saying, oh sure, all other things — but in the real world, professor, all other things are hardly ever equal. And of course, as you know, court trials are somewhat theatrical because they're playing to the jury. And I finally had to turn to the judge and say, that requires an explanation. And the judge was very sympathetic to me. And I finally had to say that all other things being equal is simply the credo of experimental science. It means that you control extraneous variables that could distort the data. Not that it proves your case — it's that random error can distort the data, so holding all other things equal is a good thing about science, not a bad thing.
That's the kind of thing that we're talking about. I'm not sure why I started that story.
Well no, because, see, this is another interesting thing. It's why
many people prefer the pseudo sciences where somebody gives you a certain answer.
I know that if you eat this thing, this following health benefit will occur,
whereas scientists have the irritating habit of talking in probabilities. It is likely that.
It is probable that.
In fact, every scientist knows that the people who speak
in certainties are not speaking scientifically.
That's just not how scientists think,
and that's not how they would present their findings.
But certainty is what most people want to hear.
But of course, if someone really is certain about something, they have almost certainly
frozen their ability to change their minds when they need to.
I want to talk about something that you guys have written about as you have your pyramid
diagram.
I talk about it as sort of a path dependency; that's the way I kind of describe it to people.
And I find this to be, I think, one of the most remarkable explanations that you provide
for how two people can start out with a very similar set of beliefs,
and yet come to these very pivotal moments where a decision is made that sets them on a path.
And that path becomes reinforcing and reinforcing and reinforcing. And after a long
enough period of time, those people are so divergent in their beliefs and in their behaviors
and including their view of each other that they seem unrecognizable as though they came
from the same species, let alone that they once stood next to each other. Let's go with
two examples. Let's talk about the two
college freshmen who have never before cheated on a test that are now both in the final exam room.
And they are both coming to the same question that they don't know the answer to,
confronted with an opportunity to cheat. So Carol, walk us through how one could cheat,
one could not, and the implication of this — how these two people go through the world 10 years later as a result of that decision.
Right. This example was actually based on an experiment that had been done
many years ago with children, and it's, it's really quite simple. The two students
have pretty much the same attitude about cheating. It's not a desirable thing to do.
We know it's not a good thing to do,
but it's not the worst sin in the world.
And now here they are in their exam
and they draw a blank on this crucial question
and the crucial question will determine their grade
on the exam and in the manner of students,
not only on the exam, but in this course and in life.
And everything will go south if I don't get an A on this exam
and so forth in the way that students often panic.
And suddenly they are given the opportunity to cheat
as the student next to them, the one with the beautiful handwriting, makes her answers available to them.
Now they have a second to decide, cheat or not cheat.
One cheats, one doesn't cheat. One gives up integrity for the grade; the other says, no, integrity is too important, I'm not going to look. The minute they make that decision — as I was using this example earlier — the minute they make that decision, they now have to put their behavior and their attitude toward cheating into consonance. So their attitude about cheating will now change to be consonant with the behavior, with what they did. So the one who cheated will now modify that view to say cheating isn't such a bad thing: please, everybody cheats, it's just a victimless crime, who cares.
But the one who resisted cheating will now feel even more strongly that cheating is wrong
and unfair.
And it's not a victimless crime.
What about all the people who don't cheat and work hard and learn the material? Over time, as they move along, their attitudes about cheating and their self-justifications for their own behavior will keep reinforcing. And we use the
metaphor of the pyramid because they start out at the top side by side, but by the time
they have finished justifying one step at a time, their own behavior, they stand at
the bottom of the pyramid very far apart from each other.
And for me, what I find most illuminating in this metaphor is that it demonstrates how
hard it is to go back up.
Because now here you are at the bottom and you've spent all this time and energy justifying
your decision to cheat or not cheat. How are you going to now go back and say, you know, that first
step I took off the pyramid was really the wrong one. So that is the metaphor we use in explaining
why people get themselves locked into a belief or a practice. And what that matter for really illustrates beautifully is that every time a person has
to make an important decision, and it's a difficult decision, like the cheating example,
he or she is doomed to experience cognitive dissonance.
For the students who cheat, the dissonance is: I see myself as a basically honest person, and yet I'm committing a dishonest act. Therefore, to reduce dissonance, I decide that act isn't so very important, because everybody would do it; I'd be a fool not to do it. For the person who resists the temptation to cheat,
the cognitive dissonance is,
I could have gotten a really good grade in this course,
and that would have allowed me to go to medical school,
and I chose not to do it.
Therefore, it would have been a horrible crime
if I had cheated, so that you cannot escape cognitive
dissonance no matter which way you choose, and the cognitive dissonance is followed by self-justification
which changes the attitude enormously.
Elliott, to borrow your phrase from a moment ago, all things equal if I took 200 students
and looked at a hundred of them who elected one path or the other. So just pick whichever path you want to discuss.
How much variability is there in the dissonant response amongst that group? So if you take,
for example, say the group that decided to cheat and now has to spend the rest of their life
justifying, this was not victimless.
This has allowed me to get to graduate school.
I'm gonna have a greater impact on the world,
blah, blah, blah, blah, blah.
How variable are those students in their responses?
And what is the extent to which that is subconscious
versus conscious and how it plays out?
There's no real answer to that question
without doing the experiment,
but in the experiment that was done by one of my fellow graduate students at Stanford, a guy named Judson Mills,
there was almost no overlap between the final feelings between the kids who cheated and the kids who didn't cheat. Each one deviated from their feelings about cheating
a day before they were put in that situation.
So, there was a little overlap,
but it had a major impact on their attitudes about cheating.
If I may, it happens that we have an example here in our very own book, from the high-achieving, high-pressure Stuyvesant High School in New York City.
71 students were caught exchanging exam answers, and they gave the New York Times reporter
a litany of perfect self-justifications, which allowed them to keep seeing themselves
as smart students of integrity.
One said, it's like, I keep my integrity
and fail this test?
No, no one wants to fail a test.
You can study for two hours and get an 80
or you could take a risk and get a 90.
He redefined cheating as taking a risk.
For others, cheating was a necessary evil.
For many, it was helping classmates in need.
When one girl finally realized her classmates had been relying on her to write their papers
for them, she said, I respect them and think they have integrity, but sometimes the only
way you could have gotten there is to kind of botch your ethics.
Kind of botch your ethics — there you go.
How do they define integrity in this?
Exactly.
So there's another example that's even more distressing though and I think this was in your book.
But if not, Carol, I know we've discussed it, which is.
Well, I think it is actually in there, which is, you know, the point where the cop is just
planting evidence on people who are clearly innocent.
Do you remember — was this in the book?
Oh, yeah, testilying.
There's a term for it.
I think the way you walk down this is, look, maybe the first time it happens is they break
into the crack house.
You know this house is full of crack.
There's no dispute.
As soon as the battering ram goes through the door, you can see the plume of smoke. You see this one guy running into the bathroom, slamming the door, locking it, and you hear him literally shoveling drugs into the toilet as he flushes it, and just as you get the door open, you see the last swirl of water taking that last gram of cocaine down the toilet, and you are out of luck.
In that situation, when you know, as sure as God made little green apples, that these guys
are filthy, you plant that cocaine right on that guy and you make your arrest and you
finally have your bad guy.
There's another cop there who would refuse to do it
even in that situation.
Now again, those guys are at the top of the pyramid.
What can those guys look like 10 years later?
That guy who planted that drug in that situation
which seems quite justifiable, right?
What is that guy doing 10 years later?
Is that Mark Fuhrman — you remember this bozo in the O.J. Simpson trial? I mean, I guess to me one of the points of these stories is
You probably aren't starting out with the most egregious examples of behaviors that people think about, right? You probably got there stepwise.
Well, that's just it. We look at people very often who are at the bottom of the pyramid, and we don't realize that they started at the top. They made decisions, very, very small, tiny decisions — Jeb Stuart Magruder in our book, and how he got enmeshed in the Watergate scandal. It was one step at a time, starting with the smallest, smallest compromises, until he could not get unenmeshed, right? And so it's true. Very often, we look at the behavior
of people without realizing all the time, effort, and
money that they put into justifying their behavior as they got further and further along.
So behavior that seems really puzzling to us at a distance makes far more sense when
we see it in terms of this one step at a time, which is, you know, one of Elliot's great
experiments, Elliot, the initiation
experiment.
I remember that.
Remember that one?
Terrific demonstration of this.
You want to tell it?
No, I want you to tell it.
No, I'm not.
You don't even know if Peter wants you to tell it.
I would love for one of you to tell it.
No, this is the critical point, right? The initiation, this is what I describe
as sort of the path dependency, right? It's the, it's the switch on the train. And as you
said, if you're faced with a hard decision, I think most of us don't appreciate what
the consequences of that can be if we take it with the wrong mindset. And this is something I wanted to come to later, which is identity versus behavior.
So it's sort of like you can say in this moment with these facts, I'm going to make this decision.
But I think when you can do that without pinging your identity to that decision and instead just pinging your behavior to it, I think it becomes a lot easier to move off that path
when you're encountering new information.
But sometimes it's too easy.
Let's talk through some of these experiments.
Let's take the example you just gave of the cop who knows they're guilty. He can see the smoke. He hears the toilet flush and sees all of the evidence going down the toilet. Now, you said you can see how it could be justified that he plants the cocaine, because he knows they're guilty. And I would suggest that it's never justified. It's understandable that he might plant the cocaine. But because of what I know about how the human mind works, I also know that once he takes that step of planting cocaine that he didn't actually find there, it makes the next step and the step after that a lot easier. The next thing you know, there's a totally innocent person who has been framed for a crime he didn't commit. And notice, by the way, the dissonance a police officer will feel if another cop in the room says, no, no, no, this is wrong, don't do this.
I mean, we see this as one of the problems in sort of our discussions of police departments.
The presence of somebody who is behaving ethically is dissonant to someone who is not behaving ethically.
I don't like you.
You're reminding me.
You're reminding me that I'm doing something that I shouldn't be doing if I were an ethical
person.
And these guys get ostracized by the culture of most police departments, which is you have
to go along, you have to play along.
How much do you think this figures into police racism, police brutality? Do you feel that this is a similar phenomenon?
It's a complex issue, but it begins with, I think, a false and illusory correlation. Since most street crimes happen in poor neighborhoods, and since a lot of poor neighborhoods consist of ethnic minorities, black and brown people, I think cops have learned that in those dangerous neighborhoods with a lot of street crime, the crimes are often caused by black people. Then they become suspicious of all black people and become quite brutal about it in their behavior toward them.
And again, it's understandable, but not justified, that they behave the way they did. And in the George Floyd case, it's again one step at a time, with two of the other cops acting almost as bystanders who are allowing it to happen, because the culture is such that they don't want to interfere with the most brutal cop, because he seems to know what he's doing, and they're afraid of being ostracized by their fellow cops if they interfere with it.
And that's one step at a time, which is exactly how we define how cognitive
dissonance can get us into trouble.
Because each step you take down that pyramid, you justify it until, by the time you reach
the bottom, you don't recognize yourself anymore.
That's why Jeb Stuart Magruder is a perfect example, because he was a guy with good, high morals, and he saw himself in retrospect being corrupted in the Nixon administration until he ended up doing things that a year earlier he never would have dreamed of doing. But one step at a time, justifying each one along the way — when he finally woke up, as he was being sentenced to prison, it was like waking up from a bad dream.
This is actually one of the most powerful experiences for me in working with Elliot on this book:
was finding the stories of people who were able to
break out of the cocoon of self-justification
that they had spun to protect themselves
and were suddenly able to see themselves
and the consequences of the behavior in a clearer light.
You see, sometimes that light is obscured for us.
I mean, for example, when we talk about systemic racism, which is very hard
for most people to understand and experience, because by definition, we experience things
as individuals.
But there was an important study in Seattle some years ago of the way that the police
and the city government were defining
what a drug crime was, what kind of drugs do we wanna go after,
who do we wanna arrest for what kind of drugs?
So, you know, white kids using cocaine — well, that's okay, that's not a problem, we're not going to have drug busts on white kids using cocaine. But black kids using crack? Oh, let's go for that.
So the very definition of what the problem is and whom we want to arrest set up a pattern
of the over arrests of African Americans compared to whites.
That's an example of a decision that occurs at the top and that can have very powerful
racist consequences, whereas each
individual officer is saying, by arresting this guy for crack, I'm not a racist.
I'm not personally a racist, yeah.
And I would say that one other element in our police departments, as we are now learning, is that the culture of brutality is larger than the specifics. Something like 20 to 25 percent of the police officers in the United States have come from the military.
That's their training.
Many of them are excused from having to go to a police academy
if they've had military training.
So look how that would shape your point of view
about how to control crowds and who the enemy is and so forth. So we have every individual
is embedded in a social network, taking their cues from that network and justifying their behavior
in order to remain a part of that network.
Perhaps the most optimistic part of, I think, your work is the description of people who are able to halt the slide down the pyramid, right? So I want to talk both about some of those stories, but also about the traits, or the sorts of cognitive tools, that we can use to kind of guard against it, because I hope nobody comes away from this thinking that this is a condemnation of an individual who's subject to cognitive biases and confirmation biases as they struggle to ameliorate the suffering of cognitive dissonance. I think the point is just the opposite. We're all doing it all the time. And the more we talk about it and the more we think about it, the better shot we have at getting to better answers, as opposed to just reiterating bad behaviors because they fit with preconceived notions. So what are some of your, both of your, favorite success stories of people who were heading down the path, down the pyramid, and then managed to sort of realize that they were on the wrong side of the pyramid?
Me!
McMartin — look, I got out of that mess. Okay, Elliot, you're on.
Well, my favorite person in our book, I think, is Wayne Hale, the guy who made the go decision on the Columbia disaster at NASA. The Columbia ship exploded, and Wayne Hale was the operations officer at the time.
He knew these guys, he knew their families, and they all got killed because of his decision.
And he wrestled with that one for a while.
His first response was, look, no launch is perfect.
There are always little problems and you have to learn which ones to ignore
and he literally stopped himself in mid-sentence and said,
but you know, when I look at it, I have to say, at this point, I wish I had been more cautious.
The weight of the evidence was to abort the launch just before it started, but I made the wrong decision, and I'm dreadfully
dreadfully sorry.
And when I weigh all of the evidence and all of the reasons why he could have doubled
down, I say, my God, what a courageous guy that is,
to take the blame which he deserved
for the death of all those wonderful people.
That was a tough one.
Yeah, it is.
It's immensely powerful.
In finding these stories, you see, truly, the courage it takes, the honesty, to face up to it and fess up. And generally speaking, the reaction of those around you is not going
to be critical. It's going to be grateful. Wayne Hale did not suffer for his ability to
send this email to everybody at NASA saying,
look no further for the person responsible for this disaster.
I'm the one.
And I should say, thank you, Peter, for this question, because I too feel that the more
people understand about how cognitive dissonance operates and how we are also susceptible to
it, the more helpful it is in understanding our
own behavior.
So to know that the minute we make a decision, whether it's about a car or a partner or
a house or how to live with COVID, any time we make a decision, we're going to be motivated
to throw support for that decision and ignore evidence that we're wrong, the moment
we understand that that gives us a whole new toolbox of ways of dealing with our own
beliefs and attitudes.
And one of the primary tools in that toolbox is the ability to say, I did a stupid thing. I made a stupid mistake. I did something that caused harm. But just because I did something stupid or immoral does not necessarily make me a stupid person or an immoral person. And to be able to say that thoughtfully and meaningfully is a very important thing to do. We have to be able to say that making mistakes is difficult to deal with, but one can do it if we don't ipso facto embrace that as meaning, if I did something stupid, I'm a stupid person. We have to be able to say, I made a stupid mistake, but I'm not a stupid person. What can I learn from having made that mistake? How can I make sure that I don't make a similar mistake like that again? And if that mistake caused harm, how can I make amends? And that's how a person lives a meaningful life.
Exactly.
We love the story of Shimon Peres, the former Israeli prime minister, who was thrown into cognitive dissonance when his good friend Ronald Reagan accepted an invitation to lay a wreath at the Bitburg cemetery in Germany at some national event. It turned out that 47 Nazi officers had been buried at this cemetery. And of course, the world was furious at Reagan for accepting this invitation to lay a wreath there; Holocaust survivors and so many others were just outraged, as was Peres. So a reporter said to Peres, so what do you think about your friend Ronald Reagan accepting this invitation to speak at the cemetery? And Peres said, when a friend makes a mistake, the friend remains a friend,
and the mistake remains a mistake. In this way, he separated the two dissonant cognitions
just as Sarah Silverman was trying to do
about her friend, Louis CK.
My friend made a mistake, he did something wrong,
he remains my friend and what he did remains wrong.
When I do something wrong, what I did remains wrong and I still remain a good kind person.
You separate the dissonant cognitions and treat them separately because the usual impulse
would be to jump to a decision.
I'm done with that friendship or I have to minimize the thing that my friend did.
And by being able to separate the two dissonant cognitions and evaluate them more closely,
sometimes it requires us to live with the discomfort that we love this person and this
person did this awful thing.
And it requires self-reflection.
What kind of a person am I who has done this particular thing? And serious self-reflection is a lot more difficult
than self-justification. The easy route is to leap directly into self-justification,
but as we've seen, one step at a time that can lead us down the primrose path. But self-reflection, I did a stupid thing. Why did I do that?
How could I learn something from that? That's our recommendation for the way to go.
In Elliot's absolutely wonderful memoir, Not by Chance Alone, he uses this observation about what drew him to the field of social psychology, and I agree with it too. He said, clinical psychology, therapy, is about repair. Social psychology is about change. And I think that is really a guiding principle of why we wrote this book, because it's an examination and indictment of so many institutions here — the therapy world, the criminal justice world, family relationships — so many domains in which, by understanding, we do have the power to change.
So what year did the first edition come out? Was it about '06?
Seven.
Seven.
So the world's changed a lot in 13 years. Obviously you've shared that a big part of your motivation for this was sort of the frustration you had with the Bush-Cheney administration's continued justification for the war in Iraq, long after it became quite clear that the reasons that were given to go to war didn't exist. And I also think it's safe to say that even within that administration, there were many different flavors of dissonance. But I think I've read maybe two or three biographies, including the autobiography of George Bush — I find presidential biographies very interesting — and I would agree with your take. I don't think for a moment Bush felt he was pulling the wool over anybody's eyes. I really think he believed this to the essence of his core. But here we are 13 years later, and Iraq and Afghanistan and most of the Middle East, quite frankly, are largely forgotten at this point — and not just with what's going on in terms of coronavirus, but I think more broadly speaking, what's going on in terms of populism, what's going on in terms of racism, a greater polarization within our political system. I mean, I think almost anybody would give their left arm to go back to 2007 politically, frankly, like when you had two somewhat reasonable parties that kind of behaved. It seemed a heck of a lot better when you first wrote the book.
Are you more or less optimistic today?
And how much do you think cognitive dissonance is factoring into what really looks upsetting and ugly
and just very unpleasant with respect to the way we are governed and with the way we govern.
How much time you got?
The problem, I think, is the hard polarization of the parties. In our introduction to the revision, we quote Bob Dole saying, you know, Bill Clinton is my opponent, not my enemy. I mean, how far we have come from the idea of seeing the other party as an opponent and not an enemy,
which is quite an explicit tactic.
So it's back to your question about identity in a way.
One of the curious things that have happened in our country
is that political identity has come to have
a primary importance for people in ways that
it did not at one time. That is, you know, it used to be, you'd say, well, would you want your child to marry a — fill in the blank — a person of another religion, a person from another city, a person from another ethnicity, and so forth. Now the thing that you most don't want your kid
to marry is somebody from the other political party.
It's that the hostility of that attitude
is a sign of the problem of polarization.
And what that means is if political parties
have become the central part of people's identities,
then by definition, it makes it very hard to accept any evidence that somebody from the other
party might have a good idea.
Might be doing the right thing.
Might be somebody I can listen to.
Although the current political scene is worrisome and we can all be pessimistic about it.
One of the things that has been most heartening for both of us in writing this book
has come from the hundreds upon hundreds of letters
from people telling us what they have learned from the book
and how it has affected them.
Now, you know, authors get these "that changed my life" kind of things,
but we get stories.
We get stories from people who have explained
how they have taken an understanding
of cognitive dissonance
into their own lives with often very interesting results.
So for example, once I heard from a man who told the following story, he said, I've got
five siblings and we've been at war, at war, over the legacy of our family inheritance.
We've formed two factions.
We've been fighting with each
other and the mediation hasn't helped and nothing has helped and we're estranged from each
other and so forth. He said, and then he said, I read your book and I gave it to our mediator
and I said, here I said, give this book to my brothers and they will immediately understand
what they're doing wrong. I mean, he didn't quite say it that way.
He said, to the mediator, here, have my brothers read this book.
He said, I got no reply.
He said, and then a couple of years later, he said,
I picked up your book again and I read it and I said,
oh, he said, oh, he said, incredibly,
the words on the page had transformed themselves.
And I wrote to the mediator and I said,
tell my brothers that I now understand
what I have been doing wrong in our discussions.
I have been greedy, I have been selfish,
I haven't thought enough about how you guys
have seen the situation.
I'd like to apologize, can we talk?
And they do.
I love that story.
And it's a perfect example of what a lot of people discover on reading the book.
We all have blind spots.
And perhaps the ultimate blind spot is the belief that I don't have a blind spot.
And if only people would see it my way,
then they could arrive at the reasonable solution
to any problem.
But the fact is what people often discover
is that the blind spot is in me.
I think you're absolutely right, Carol.
I think this all comes back to this idea of I did
versus I am. I don't know if either of you have ever met Marsha Linehan.
I do. I haven't seen her in a long time, but yes, her work on DBT.
Well, her work on dialectical behavior therapy, which I've become such a fan of and such a student of, because I think that what Marsha and that school of DBT have
taught is effectively the exact way you're describing this, which is the more we can distance
our identities from these actions, the easier it is to hold and sit in the discomfort of
these two things.
Again, I think the Silverman Louis CK example is a great one. I do it as a practice every day, by the way. So I do this thing because one of the things I've struggled with historically is I tend to peg my achievement to my identity. So even with something that's as seemingly stupid as shooting my bow and arrow, which I do very often — so I'll go in the back and I'll shoot my bow and arrow.
And if I have a good day shooting,
I'm gonna have a good day period.
And if I shoot poorly, I'm gonna have a bad day.
And of course, the only way that can be true
is if you're so silly as to assign worth to yourself
as a result of how you perform.
So now every single day, when I do whatever it is
that's my recreation activity,
I dictate into my phone for no more than two minutes,
three at the most, a lovely discussion to myself,
separating my performance from my worth.
It can go something like this.
Hey, Peter, great job today shooting. I mean, it was really amazing. You got up there and you shot a 296 with 32 Xs. That was really good. And you went and did this, and this, and this — meaning all of these are very good. But I just want to remind you, that doesn't actually make you any better today. You know, you're no better a father, you're no better a husband, you're no better a friend, you're no better a doctor than you were yesterday when you shot very poorly.
And this exercise, by doing it out loud
and actually sending it to my therapist every single day,
and knowing she listens to it,
has been such a powerful tool for me
to uncouple what I do from who I am.
Now that's a silly and small example, but listening to our discussion today, I want to think of bigger and better ways to do that, because I do think that inasmuch as our political identities become our personal identities, we're really hosed. I do fear that as sort of a population, it's going to be very difficult when, as you said, whatever the other person says is wrong no matter what, and no matter the countenance on the other person's face — anything from crying to smiling to laughing — it will always be colored the wrong way. I don't know how you can make progress in thought. If a society can't make progress, right, in some form or manner, whether it be knowledge or insight or thought, I don't know. Does this represent the end of progress?
Oh, so you're asking if this is the end of days, right?
Well, I don't mean a doomsday scenario, but I mean, like, at some point, right — I mean, when you think about what it was
that enabled radical transformation of society in the past, I don't know, four or five hundred years, a lot of it had to do with a remarkable progress in thinking — the standardization of formal logic, scientific methodologies. I mean, everything that you've spoken about today, Elliot, on some level, is grounded in the ability to do an experiment. Carol, you and I have spoken for hours about our hero, Richard Feynman, and his very famous Caltech — actually, it might have even been Cornell, I can't remember if it was Cornell or Caltech — when he gave this lecture that is very easily searched on YouTube. It's beautiful, where he basically explains what an experiment means and how you know if an idea —
And the beauty of rooting for the null hypothesis,
which is really hard to do,
but it's what we have to do.
The age of enlightenment was around the 16th century
when people began to do experiments
and think scientifically and every society
has to bring its population into the age of enlightenment.
And I think our educational system, our public educational system has failed us.
When you see some of the sort of people on the street being interviewed, I am appalled
by how illogical they are in their thinking, many of them, and how irrational they are,
and how one thought does not proceed from the previous thought.
It's appalling, and it seems to me that critical thinking has to be taught in junior high school and high school in this country.
People have to learn how to separate bullshit from fact.
People have to learn how to trust science rather than off-the-wall exclamations of facts that aren't really factual. And that's all part of our educational system.
Democracy — a democracy is not going to work with an uneducated population, and with a distrust of the institutions that are the bedrock of that democracy: science, government, the law.
So that's basically my point, which is, my fear is we are further in the wrong direction today than where we were, you know, at the time you wrote this book.
And so my call to you is if you could transmit to that generation that is so critical, right?
Those kids that are 10, 11, 12, 13, 15 years old,
where they're still in these formative years.
And we want to impress upon them,
for me personally, because one of my kids is in that age group, I think nonstop
about how to excite her about science
and how to look at the world and question everything.
Like, why does that thing float in the pool but that thing sinks? And why, you know, why is it that the vinegar and the oil separate? Like, every single thing you see, you should be starting to think, why is that happening? What is it from your work that I could impart to my daughter, who is, you know, 11 years old, to give her the best shot at having a tool
of being the kind of person
that is comfortable sitting in discomfort,
making these hard decisions
and being able to change her mind
when the facts call for it.
I think the truth is that what I would suggest is something that I learned the hard way as a parent and a grandparent, and that I'm now learning as a great-grandparent: the importance of modeling. In the home, you behave in the way that you want your child to behave when your child is your age. And there are no exceptions to that. You just do it. You do it the way you want your child to learn to do it. I think that modeling is a very, very powerful tool.
You want to tell your story about when you were first married to Vera and the anger?
My father would get angry.
You know, that was a big change in you.
It really was.
I grew up with a father who — my mother and father frequently quarreled, and I remember sitting at the dinner table and we could see it happening, because the only time my mother had a chance to get at my father and give him a litany of her complaints was at dinner time. So we'd be sitting at the dinner table — I remember I was maybe 10 or 11 years old — and my mother would be complaining, and my father would regard that as nagging, and we could see my father beginning to seethe inside. And my older brother looked at me and winked, and what that meant was, we'll give him about 20 seconds before he explodes. And he exploded. And he would slam his knife and fork down on the table — God damn it! Can't a guy have a peaceful meal around here and eat? — grab his coat and leave the house, and not come back until after my brother and I were asleep. And we hardly got to see my father, because it would happen often. And that was my model growing up. Soon after I was married — I got married when I was 22 years old, to a remarkable woman, and we're still married now, 65 years and counting.
But early in the marriage, we had an argument about something and I remember getting really
angry, raising my voice, and then walking out the door, slamming the door, going down the steps. I got halfway down the steps, and I suddenly said to myself, where the hell are you going? What are you doing? And I realized at that moment,
that I was modeling my father's behavior, which I detested.
And I slowly walked back up the stairs and apologized.
And we talked rationally about the issue that had brought on that explosion. And I gradually taught myself that getting angry, raising
my voice was not the best way to deal with it.
I was married to — I am married to — a very gentle, wonderful person who was not accustomed to that kind of thing and therefore would not tolerate it. And it helped me let go of it.
And to use reasonable, rational discussion,
stating of feelings, the positive ones, the negative ones,
whenever they become apparent.
And that became a model for our kids, because the way we relate to each other is now pretty standard in our family. I don't take great credit for that; I really learned it the hard way. And I taught it in the encounter-group world and so forth — the ability to identify a feeling instead of just roaring; there's no catharsis in that respect.
Those are skills to be learned.
And Peter, when you said, what can you do to make science itself interesting to your
daughter?
Remember that as human beings, we think in stories.
Storytelling is our way of understanding the world, explaining the world and making
the world interesting. What science does is tell us which stories are better than other
stories. And that's its charm and that's its magic, if you will, and that's its appeal. When
science is presented as a series of finding, finding, finding, finding — there's this thing and this thing and this thing and another thing — it loses its interest and its zip.
But when it's told as a story in which the discovery
is something that changed us, the story we
tell at the beginning of our book, of Semmelweis and his observations about why the women in his hospital were dying of childbed fever, and that maybe it was because his students were coming from the morgue to the bedside of these women in delivery, having just done autopsies on the women who had died the day before, and thinking, oh, he's carrying
something on his hands.
That was a story.
The Semmelweis story was something that my junior high school teacher told our little science
seminar group and I remember I was fascinated by the story
But I guess I was a budding social psychologist, because what was fascinating to me was not just that Semmelweis had found the reason that the women were dying — they didn't know about germs yet, but he found the reason: just wash your damned hands and these women will stop dying.
He had the solution before he knew the problem and that's very interesting.
Absolutely. But you see, in the story that Mr. Crane told us, what interested me was, why didn't some of his fellow doctors say, hey, Ignaz, great explanation. Thank you for explaining why my patients are dying. I can change my practice immediately. You know, this is terrific information. What did they say? They said, ah, piss off, you Hungarian nitwit — I mean, the equivalent of it in 1847.
They hadn't read the theory of cognitive dissonance yet.
No, they certainly hadn't.
I don't think he was ever vindicated in his life. I mean, he died basically in an insane asylum, still believing that no one had effectively come to acknowledge his theory.
Absolutely, but you see, it's an amazing story, and all the elements are in it: the psychological story of his fellow scientists who didn't want to believe him — which is the story throughout history. Oh, thanks, Galileo, we're really grateful for your new theory here, right? It presents both the excitement of scientific discovery and the challenges.
Well, guys, I want to really thank you for this discussion today. I know it's
probably longer than most discussions you guys have on this topic.
But as I said at the outset, it's a book that I love.
But more importantly, I think it's just a topic that I think everybody needs to spend some
time paying attention to.
It is a part of everyone's life, whether they are aware of it or not.
That's the importance of it, right?
In the spirit of what water is to a fish as David
Foster Wallace spoke about in his commencement speech, whether you are conscious or unconscious of
this thing, it is with you. So we are probably better off being conscious of it and having some
measure of control and thought around it than we are ignoring it. Peter, it was a pleasure to talk to you and the discussion, if anything, was not long enough.
Thank you very much.
Thanks so much, guys.
Thanks a million for inviting us.
Thank you for listening to this week's episode of The Drive.
If you're interested in diving deeper into any topics we discuss,
we've created a membership program that allows us to bring you more in-depth, exclusive content without relying on paid ads.
It's our goal to ensure members get back much more than the price of the subscription.
Now, to that end, membership benefits include a bunch of things. One, totally kick-ass comprehensive podcast show notes that detail every topic, paper, person, and thing we discuss on each episode. The word on the street is, nobody's show notes rival these. Monthly AMA episodes, or ask-me-anything episodes, hearing these episodes completely, and access to our private podcast feed that allows you to hear everything without having to listen to spiels like this. The Qualys, which are a super short podcast that we release every Tuesday through Friday, highlighting the best questions, topics, and tactics discussed on previous episodes of The Drive. This is a great way to catch up on previous episodes without having to go back and necessarily listen to every one. Steep discounts on products that I believe in, but for which I'm not getting paid to endorse. And a whole bunch of other benefits that we continue to trickle in as time goes on.
If you want to learn more and access these member-only benefits, you can head over to peteratiamd.com forward slash subscribe.
You can find me on Twitter, Instagram, and Facebook, all with the ID, peteratiamd. You can also leave
us a review on Apple Podcasts or whatever podcast player you listen on. This podcast is for general
informational purposes only. It does not constitute
the practice of medicine, nursing, or other professional health care services, including
the giving of medical advice. No doctor-patient relationship is formed. The use of this information
and the materials linked to this podcast is at the user's own risk. The content on this
podcast is not intended to be a substitute for professional medical
advice, diagnosis, or treatment.
Users should not disregard or delay in obtaining medical advice for any medical condition they
have, and they should seek the assistance of their healthcare professionals for any such
conditions.
Finally, I take conflicts of interest very seriously.
For all of my disclosures in the companies I invest in or advise, please visit peteratiamd.com
forward slash about, where I keep an up-to-date and active list of such companies.