Modern Wisdom - #493 - David McRaney - Where Do Our Beliefs Come From?
Episode Date: June 30, 2022. David McRaney is a psychologist, journalist and author. Where do our beliefs come from? How do we form opinions? Why are we persuaded by some points of view but not by others? These are important to understand if we are to avoid developing a biased, unrepresentative worldview; they've also been the focus of the last few years of David's research. Expect to learn how everybody in a group can believe something that nobody believes individually, why arguing online so rarely works, whether people really are becoming more ingrained into their beliefs, how the Westboro Baptist Church convinces its members, the best ways to persuade someone who disagrees with you and much more... Sponsors: Join the Modern Wisdom Community to connect with me & other listeners - https://modernwisdom.locals.com/ Get a Free Sample Pack of all LMNT Flavours at https://www.drinklmnt.com/modernwisdom (discount automatically applied) Get 15% discount on the amazing 6 Minute Diary at https://bit.ly/diarywisdom (use code MW15) (USA - search Amazon and use 15MINUTES) Get 10% discount on your first month from BetterHelp at https://betterhelp.com/modernwisdom (discount automatically applied) Extra Stuff: Buy How Minds Change - https://amzn.to/3OxpZeJ Follow David on Twitter - https://twitter.com/davidmcraney Get my free Reading List of 100 books to read before you die → https://chriswillx.com/books/ To support me on Patreon (thank you): https://www.patreon.com/modernwisdom - Get in touch. Instagram: https://www.instagram.com/chriswillx Twitter: https://www.twitter.com/chriswillx YouTube: https://www.youtube.com/modernwisdompodcast Email: https://chriswillx.com/contact/ Learn more about your ad choices. Visit megaphone.fm/adchoices
Transcript
My guest today is David McRaney. He's a psychologist, journalist, and author.
Where do our beliefs come from? How do we form opinions? Why are we persuaded by some points of view
but not by others? These are important to understand, if we're to avoid developing a biased,
unrepresentative worldview, and they've also been the focus of the last few years of David's research.
So today, expect to learn how everybody in a group can believe something that nobody believes individually. Why arguing online so rarely
works, whether people really are becoming more ingrained in their beliefs, how the Westboro
Baptist Church convinces its members, the best ways to persuade someone who disagrees
with you, and much more. But now, ladies and gentlemen, please welcome David McRaney.
David McRaney, welcome to the show. Thank you so much.
I am very honored to be here in company that shares a lot of my obsessions.
I feel there's a Venn diagram with a lot of overlap here right now, and that's always cool.
When you say that someone holds a belief, like the belief that our interests heavily
overlap in a Venn diagram, what does that mean? What is a belief?
That's the best opening question I've received so far. I'll say the same thing back to you
that was said to me when I first asked a researcher this, somebody who had 45 years of research into
just beliefs, and they said, whoo, that is a tough one. That was Jim Alcock who I asked.
I thought he'd be the greatest person to go to at first with these questions. It's one of the
reasons the book takes such a long arc getting to an authoritative voice is that I thought it would be a book where I just went and asked experts those questions.
They'd tell me the answers. I'd find the relevant literature. I'd translate it in a fun, approachable voice, and boom, you have a book.
That is not how it turned out because it turned out I was asking questions that have either incredibly complex answers or no answers yet.
When it comes to that, in particular,
this thing I told you where I feel,
we have a strong overlap.
I could say that there's a belief there.
There certainly are some fact-based information
encoded things in my brain that have emotional qualities
of certainty attached to them, involved in that proposition.
But also, I'm also expressing an attitude.
I'm also expressing how I feel strongly in one direction
or another that I have these positive and negative
evaluations of what I'm presenting to you.
I'm also expressing a little bit of things there about values
and expressing some notions of what is
and isn't so in every direction.
So when it comes to just the raw fact-based thing, it's a huge assumption I'm making
based off of information I've received so far in this conversation and also from knowing a
little bit about you and sharing some of our shared interests beforehand. And also we've developed
a little bit of trust and a little bit of rapport, so that I
have all these cues that are translating into an
emotional state of certainty about all these bullet points that
pile up into something we would call, in common parlance, the belief that we have an overlapping
Venn diagram of obsessions. That's the complicated answer.
What's the breakdown that you went through there?
It seems like there's component parts
that are making this up.
Yeah, I think the reason I like to talk about it like this
is, though this book is about how minds change,
and I was very adamant that that would be the cover
and the title that would be how minds change,
not how to change people's minds. I did not want to write a book that was the modern version
of how to win friends and influence people. I didn't want to do a retread of the Cialdini
influence type stuff. I was actually interested in how do minds change, whether or not that's
through persuasion, whether or not that's through propaganda, whether or not that's through activism.
And despite that, that tends to be what people are,
or that would seem to be the entry point
that a lot of people are coming to the book.
And I get that, we're in a very contentious,
polarized time, and that's something people want to think about
and learn more about.
But when you use the phrase, how minds change,
that becomes a, just like the previous answer,
becomes a very nuanced and complicated concept because some cultures don't really frame it that way.
And when I think about the idea that I've either changed my mind, I want to change someone's mind,
or I have changed someone's mind, or anything in using that phrase, I found that
when people were getting most frustrated with other people, when they were attempting to change their mind in some way,
that frustration often came from assuming they were trying to change their belief, when
really they were trying to change an attitude, or there was an assumption in that conversation
that we were taking our fact-based conclusions and having them do battle with one
another the way maybe a scientist would, or maybe a prosecutor and a defense lawyer would,
when really what's often taking place is a battle of justifications. We're having a
battle of reasoned justifications for why we think, feel and believe something. And if you keep going backwards
through the processing chain,
you'll find oftentimes that what's motivating that
is some sort of attitude or value set
or some sort of experience,
some sort of thing that you can't help but feel.
And that has led you to cherry pick evidence over time
until you have some things that seem like justifications, and when you approach another person, that's
where the contentious battle takes place. I'm gonna throw these links at you. You're gonna throw those links at me.
I'm gonna say, watch this YouTube video. You're gonna say, watch this one.
But the reason those things resonate for us is because of something within us.
And, you know, it's in the book, we use a model called SURFPAD that gets at this.
Something within us led us to those things.
If that's not part of the conversation, then whatever conversation does take place, nothing
useful comes of it.
The debate will end up being, both parties feel more right than they did before
they started, and neither person moves in any direction at all.
They just become more entrenched in whatever they came into the conversation with.
What's an example of how somebody could be, let's say that we're having a discussion about any
topic you want, what's an example of how I could be speaking from an attitude and you could be
speaking from a belief or from whatever else? Sure, like, I mean, something that we all experienced here recently was vaccines, right? And COVID. And I spent time with people who were anti-vaxxers before it was COVID anti-vaxxer
stuff, when it was MMR stuff.
And I spent time with a researcher who studied the impact of different messaging and all
that.
Oftentimes, what will happen is, and this is just sort of a broad generalization
here, but you might approach someone who is an anti-vaxxer and you say, well, why are you an
anti-vaxxer? And they'll say, because vaccines cause autism, or something to that effect. And you'll
say, well, why is this? And then they'll start laying out where they heard it, or all the evidence
they feel that supports that proposition.
And they present it to you as if that's why they feel the way they feel.
But what usually has happened is, somewhere along the line,
they've, and there are a million different psychological processes that lead up to this consistency bias and the introspection illusion and all sorts of memory things that fall into place.
There's belief change blindness, which is an actual term, where it feels like, yeah, you were like
Gandalf or something and you went to the bottom of your vast library and you read a bunch of
scrolls and you wrote out this thesis and we're like, ah, this is what I believe about vaccines.
But what probably is going on is that you, you had a strong feeling, a strong negative affect,
a strong negative attitude that then, which could be influenced by experiences you've had,
but it could also just be influenced by elites in your cultural norms or social costs and rewards or just raw fears that you may have about authority and
some sort of raw fear you have about medical practices. The idea of losing agency, the idea of
all these things being combined into something you don't quite understand that's going to be in a
needle that's going to puncture the flesh of your child against your will.
These things led you to look for justifications for that anxiety, justifications, reasonable
justifications that you could argue in front of your trusted peers.
And then you find this thing, you say, ah, that really justifies what I think about this.
And then when someone asks you, why do you feel this way?
That's what you present.
This is the thing that led me to the feeling.
When really it was the feeling that led you to the thing.
And we do that too when we think we're being reasonable and when two people disagree on
something, that's often where the sparks fly: we have the process backwards,
and then we argue backwards in a way that doesn't really solve much.
Do you think that, because the modern world has more information than we've ever had before,
and there are more things that can justify our feelings, that's one of the reasons why
we have such a multiplicity of viewpoints, and also people feeling really ingrained in their
beliefs and feeling super sort of fractured and fragmented and unable to work out what the truth is?
A little, I mean, yes, yes, and no, I mean, it's just different, right, than it used to be.
And how's it different?
I'm weirdly optimistic in that I think that this is a phase that we'll get through
because we've had things that have disrupted the information ecosystem before.
The way it's different, one of the big things that makes it different is we can form groups
around anxieties much more quickly than we ever could before.
And then tribal psychology, group psychology, in-group, out-group psychology will then take
precedence.
It'll become the more motivating factor than the thing that led us into the group in the
first place. That's something that is really hyper-present in the current information ecosystem and the
social media ecosystem that we're in.
The fact that misinformation has always been an issue or that truth has been difficult
to figure out has always been part of the human experience.
But it was Tom Stafford who told me, and it's a great quote, I'll, I'm not doing it verbatim
because I'm trying to remember it, but he said something to the effect of, it's like
germs before cities in his mind. Like germs are always a problem. Sanitation was always
a problem, but when we had cities and we started living in large groups, and we were interacting in a new way, that's when that became a real public health issue that had
to be addressed. And not only did the cities have to address it at the level of cities, but so did
the citizens have to address it at the level of washing your hands and being good citizens who were
clean. Well, he said he feels that it's very similar to what's happening with
misinformation, that misinformation has always been a problem in human groups, in one-to-one
interactions.
And thanks to this new way of interacting with each other that is at scale, where we have
more contact with people who see the world differently than us, or we have more information
outlets than ever before, lots of the information gatekeepers have collapsed. Then we will have to learn the informational equivalent of washing our hands.
And also the context, the places in which we meet will have to do a better job of sanitation.
So I think in that regard, that's a big deal. The other side of that, I know this is a long-winded
answer, but there's another element of all this, which is when we feel a strong anxiety.
I liken it to, like, if you're in a tent and you hear a strange sound in
the woods, you get this visceral negative feeling and you're like, what is that?
You feel anxious, you feel negative emotions.
And because we're a social primate, one of the things you wanna do is you want to go searching
for justification for that feeling,
that your reaction to it is reasonable,
that your heightened state of alarm is reasonable.
And reasonable means literally coming up with reasons
for why you are responding in that way.
So you might go out with a flashlight, or if it's a different
era, you know, a torch, or if it's a different era, nothing at all. And you will
try to find
some sort of evidence that backs up your
reasoning. You do that in the modern information
environment with
anything that causes great anxiety. For some people, people who are very prejudiced, they have anxieties about things that
seem heinous and strange; some people are just simply fearful or hesitant about certain things in a way where it seems like, of course you would be.
You go online looking, first of all, for reasons, for the things that would justify that feeling, and you'll find it, of course you will. But more so than that, you will find other
people doing the same search and you will join their search parties. And then now you have a
community, and the communities are something that's pretty new. Like, there have always been conspiracy
theories back and forth all over the place, but you know, you had to go to the library or find
somebody in your hometown or something.
The idea that you can very quickly jump into a group of a million people having
conversations about it, where you can deeply radicalize each other, is something pretty
new.
What do you mean when you talk about a post-truth world?
I actually am one of those people who I don't like post-truth as a term.
I think, I don't think we're in a post-truth world any more
than we have ever been.
I think we're in a post-trust world,
which in some ways ends up being the same thing, I guess.
The, there's a great researcher named Kate Starbird,
who talked to me about how a lot of what we see
in online behavior is similar to what happens
after a crisis situation, a natural disaster or a
ship sinking or a building burning.
Like we enter into a, we become aware that there are information voids as she puts it,
and we sort of recalibrate our epistemological approach to figuring out what's going on
to being okay with rumors.
Because the information is hard to come, the actual accurate information is kind of
hard to come by and you need to respond quickly and you're really motivated to lower your
anxiety.
So you start modulating what you're going to do based off of trust
more than based on truth.
So like you, someone tells you, hey, I heard you can get water here. I heard
that you should go down these steps over this way. Or I heard that what happened was this.
And you, if this person seems like, if there's somebody you've known for a long time,
you'll know what kind of expertise they have. And you'll modulate your behavior based off what
trust you have in that information. If it's somebody who's telling you they heard it from somewhere else,
you'll very quickly say, like, where did you hear that, because you want to know if you can trust the source.
Whereas if it's, like, a firefighter or a policeman or a soldier or something, in a situation like that, you'd be much more likely to just go with it.
That's something that's similar now, in that we have a lot of anxieties, we're on edge in a lot of ways, and the information gatekeepers, like I said before, have sort of collapsed.
It's a very fragmented information environment. So we're modulating a lot of what we plan
to do, and also what we believe, on trust, and the trading of trust back and forth has become
very valuable in that space. So it's not so much post-truth as post-trust; we're all trying to figure out who we can trust.
And we're all very eager and sensitive to when people violate the trust that we've put
in them. And we're very quick to say, okay, well, I'm not trusting you anymore.
I'm not taking any more information from that source. So that's my current,
that's my current thinking on things. That's, that's one of the things I proselytize and talk
about in the book.
I like that framing.
I think that what I see online is a lot of people whose grift radar or
shill alarm is hyper-attuned.
As soon as they see that somebody might have perverse incentives or that they think that
they're doing something for the wrong reasons, they're very, very quick to get rid of them.
Once you sell your integrity on the internet, it's very difficult
to buy it back, at least with people that have good memories. Now, that's not to say that
you can't capture an audience and then continue to push a narrative to them, which is untrue.
Does more information help then? Because we have access to more information than ever
before. Surely people's minds should be more accurate than they've ever been?
Yeah, you would think. That was the dream for many generations. I talk about that in the book:
the 19th century rationalist philosophers, the founding fathers of the United States,
the cyberpunk champions of the 1990s and beyond, they all had this sort of similar dream. Like the founders, they were like, if we have public libraries, then
everybody will be able to have the same amount of information that you'd have needed to be
a fancy-pants elite person who went to a school to get.
But now everyone has it.
Or you had to be rich and have a library at home. So their dream was like,
everybody has a library in their hometown. Now everybody will have access to
all the same information and democracy will finally flourish and blossom.
The 19th century philosophers, they were very into the idea of public
education. If we just make it so that everybody gets some basic schooling, then everybody will have access to all the facts,
everybody will agree on what those facts mean, and enlightenment for all. Then the
cyberpunks, they were like, okay, when we get the internet, we will no longer have these
three news stations that tell you what's going on. People will have all the information available.
The Library of Alexandria of a new age will be upon us, and we will
all have all the same facts.
We'll all agree on what those facts mean.
And then we will enter into a cyber enlightenment and we will go to the moons of Jupiter together.
The idea, this is something that is called
the information deficit hypothesis.
It's the idea that the reason people are wrong
is because they don't have the facts yet.
You give them all the facts
and then they'll change their minds and be right.
But the problem here is that we're motivated reasoners.
And we don't just accept evidence because it's evidence.
We accept evidence based off of all the motivations we bring to the table when we come looking to justify
something, or to work out what's rational.
We worry about the social groups we're part of.
We worry about our standing within those social groups.
We are approaching it from all the experiences we've had so far and all the ignorances
we have inside of us. So we will either assimilate or accommodate
based off all those things and the facts by themselves, well the facts just don't
speak for themselves. That's all that comes down to it. Somebody always speaks
for them and the same evidence in front of two different sets of eyeballs may
be interpreted completely differently. We know this just from, like, there are a
thousand studies where they show people a video of something. In one favorite piece of research, they show people protesting. But the protesters, you can't tell what the
protesters are protesting exactly. It just looks like some people protesting and then they will prime
the groups. They'll get people who are strongly left or strongly right on the political spectrum.
And then they'll subdivide that to say to some of them, they'll say, these people are protesting against something you are a big believer in,
or they'll say they're protesting for it.
And you can imagine what happens there.
Depending on how you tell people what the people are doing in the video,
they see the video as evidence of something completely different compared to the other person.
If you're primed to think that these are people protesting against something you
believe in, and you'll have all these negative emotions, and you'll find that what's going
on in the video is heinous and terrible.
And you'll see things the other people don't see.
It all goes back to that old research of they saw a game where they had people watch a football game
and they had to count how many times the home team or the away team did naughty things.
And depending on which team you were a supporter of, one team or the other, you saw a different game.
And that was actually the title of the paper, They Saw a Game: same video, different game,
even though it's right there.
And we saw it all throughout politics,
and especially in all the trials,
and things that took place during the Trump era,
all the hearings,
like there was all this, I remember on social media,
there was all of this excitement on the
left, like finally, this will, finally this video will come out.
We will all watch it and they'll all see the truth.
But of course, like, it comes out, we all watch the same video and everybody sees something
else in it.
Some people see it as, oh, see, I told you, Trump's great.
And other people watch it go, oh, see, I told you, Trump's terrible.
Same video, same evidence, different interpretations.
And I get it, I understand that for a long time,
it felt like all we needed was more facts,
but that's why facts alone are not the way to get into these things,
if you have no concern, no cognitive empathy
for the fact that people will interpret things
through a lens of their priors,
through a lens of their motivations and drives.
I heard someone reverse Shapiro's famous quote the other day that says,
facts don't care about your feelings.
It's that feelings don't care about the facts.
It's like, look, I've got, I've got my position and fuck you.
I don't care.
I don't care about what you want to do.
So who was the guy that said, um, people aren't flawed and irrational.
They're biased and lazy.
Oh, man, that's Hugo Mercier.
I keep promoting my own book, but at the same time,
I'm asking people, if you've never read the enigma of reason,
this is a book you should, if you're like a super nerd
about all this, like I am, like you are,
like you got to get the enigma of reason in your brain.
He's also written another book since then called,
Not Born Yesterday, about trust and elites and things like that,
about mob behavior and things like that, which he basically makes a case against.
It was one of the first things that, well, honestly, in How Minds
Change, there's a comeuppance.
I mean, the book is also a story about how I had to change my own mind about some things
that I was suggesting, that I was
putting out there in the world, that I no longer feel that way about. And there are two comeuppances.
I mean, there's a peanut butter and chocolate of comeuppances in the book. One is Tom
Stafford's work into the truth wins scenario, and the other is Hugo Mercier and Dan Sperber's work into the interactionist model.
And the simplest way to put that out there is their work puts forth evidence and they have
studies and they have a whole giant structure. It's a great big beefy model at this point, that human cognition, when it comes to group-based reasoning, when we put our minds together toward common goals, we try to figure out a world
view that we can all agree upon. It's a great, powerful tool that ultra-social primates with
the gift of language can employ in all sorts of situations, that the selective pressures put on us as we were working to
get better and better at that led to biological structures that lead to cognitive mechanisms
that have two broad functions. One is to produce arguments and the other is to evaluate them.
When we produce arguments, we produce them in a very biased and lazy way,
which is not the same as flawed and irrational. I was part of that whole pop psychology
wave 10, 12 years ago, wrote a book about that, where we're all like, hey, can you believe how flawed
we are? Can you believe how silly we are? Can you believe we lock our keys in the car and
we send emails to people we shouldn't have and then we also cause climate
change. So that was like the big push of like be careful, we're very irrational creatures.
But as they put it, it's actually quite rational, and it's working exactly as it evolved
to work: when you are faced with the possibility of disagreement, or you're trying to deliberate and come to consensus, each individual will put forth a very biased argument, which means it's
very strongly from their perspective, very strongly from their experiences and their concerns.
And it will be very lazy, in that it is going to be the easiest thing to put forth, the
thing they can justify the easiest in the moment.
That takes very little cognitive
labor and then you dump that into an argument pool. And then the other mechanisms work inside
that pool where you, everybody evaluates things very carefully. And in that situation,
you offload all the cognitive labor to the group base process. They have great studies
where, and there's mention in the book, like where you
have people, they will have them produce arguments, and then they will trick them into thinking the
arguments are someone else's arguments. And if they think it's their argument, they will almost
100% of the time say, yeah, this is great, I don't see anything wrong with this. If they're tricked
into thinking somebody else produced that argument, somewhere in the
70% range, they'll find all of the flaws and the logical problems in it and the fallacies
and everything and say, see, what were you thinking here?
Why that matters in modern discourse is that many of the platforms in
which we engage with one another in these ways, they're just built to incentivize argument production,
but not really argument evaluation.
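A minimal Python sketch of that asymmetry, for illustration only. The roughly-100% and roughly-70% figures are the ones David quotes; the flaw-spotting rate, the pool size, and the mechanics below are invented assumptions, not the design of the studies he mentions.

import random

# Toy model of the production/evaluation asymmetry described above: people
# almost never spot flaws in arguments they think are their own, but catch
# flaws in arguments they think are someone else's roughly 70% of the time.
# Pool size and the "flawed argument" framing are illustrative assumptions.

def spots_flaw(thinks_it_is_own: bool) -> bool:
    if thinks_it_is_own:
        return False                 # rubber-stamp "my" argument
    return random.random() < 0.7     # scrutinize "your" argument

def survives_group_pool(is_flawed: bool, evaluators: int = 5) -> bool:
    # An argument dumped into a group pool survives only if none of the
    # evaluators (who all treat it as someone else's) catches the flaw.
    if not is_flawed:
        return True
    return not any(spots_flaw(False) for _ in range(evaluators))

trials = 10_000
survival = sum(survives_group_pool(True) for _ in range(trials)) / trials
print(f"flawed arguments surviving a 5-person pool: {survival:.1%}")  # about 0.2%

In this toy setup, a flaw that any single evaluator misses 30% of the time survives a five-person pool only about 0.2% of the time, which is the point about offloading cognitive labor to the group.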
The same reason that I was wrong about this for so long, most of the studies that I wrote
about in blog posts and were in books just like everybody else writing at the time, the
literature we were pulling from, those studies were done on large groups of people, but each person was being studied in isolation. Like you'd give them
a problem from the cognitive reflection task, or you would present them with something like
the Wason task, or something, one of these things that are very popular in psychology, and
you would demonstrate that, alone, an individual reasoner would very often, the majority of
the time, get these things incorrect
But that's the other side of the peanut butter and chocolate of my comeuppance, which is that
Tom Stafford demonstrated, and this plays with Hugo Mercier's research, that
if you take those exact same studies and you give them to groups, then what usually will happen is,
like, if you study a large group of people and most of them get the thing wrong, or they commit some sort of cognitive bias,
there is a small percentage of them that don't do that. And
in isolation that doesn't really help anybody, but if you have a group of people, and I've done this,
I've done lectures where I use this. It's really fun. I'll take, like, the widget problem,
or some version of the widget problem: if it takes five machines five minutes to make five widgets,
how long would it take 100 machines to make 100 widgets? And if you ask people that individually,
about 70% of people will get it incorrect, because you start trying to do math,
when really,
it's just the same thing repeated.
So it's just five minutes.
But what I'll do is I'll ask a group of people
this question, and I'll say, alone come up with an answer,
and sometimes I'll put them into smaller groups,
but it works even better when you do the entire audience.
And then I'll ask, is there anyone here who feels like they have the right answer?
I mean, they really feel it. And then when that person raises their hand,
they'll bring a microphone out to them and say, tell me what the answer is, they'll say,
five minutes. And then you'll hear the whole audience go,
and then I'll say, please explain your reasoning. And then they'll explain it in detail.
And then the whole audience goes,
oh, and then I can say to them, yes,
the latest research shows that what just happened here
was most of us were wrong.
And because we were able to reason as a group
in a way where we created this good faith,
trusting, strong rapport environment.
Now we're all right.
We went from almost all wrong to almost all right.
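A minimal sketch of the widget arithmetic and the "truth wins" effect described above. The 70%-get-it-wrong figure is the one David quotes; the 50-person group and the assumption that one correct explanation flips the whole room are illustrative only.

import random

# The widget problem: each machine makes one widget every five minutes, so the
# answer depends only on widgets per machine, not on how many machines there are.
def widget_time(machines: int, widgets: int) -> float:
    rate_per_machine = 5 / (5 * 5)   # widgets per machine-minute (one widget every 5 minutes)
    return widgets / (machines * rate_per_machine)

print(widget_time(5, 5))      # 5.0 minutes
print(widget_time(100, 100))  # still 5.0 minutes, not 100

# "Truth wins" sketch: roughly 70% answer 100 minutes alone, roughly 30% answer
# 5 minutes. Assumption for illustration: if anyone in the room can explain the
# right answer, the group converges on it.
group = [5.0 if random.random() < 0.3 else 100.0 for _ in range(50)]
group_answer = 5.0 if 5.0 in group else 100.0
print(f"individual answers correct: {group.count(5.0)}/50, group answer: {group_answer}")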
And this is just something that has been difficult to reproduce in online contexts.
And there's, there are many organizations working on it.
But the current version of, like, Twitter and Facebook and all the others, it's just more,
it's very similar to the isolated thing, where it's a bunch of individual people who feel like
we're talking to each other, but we're really just kind of taking our biased and lazy arguments
and throwing them on a big pile and seeing what happens next. And the evaluation is rarely about
the logic or the reasoning process. It's more about, like, are you saying that you're in this group
or that group right now? I'm trying to figure out which one we're in. It's
not that we couldn't do it otherwise, it's just that's what's incentivized right now,
that's what creates engagement and that's what sells advertisements and that's sort of
where it's at. It can change and I think that hopefully we will change it, but that's
where I'm at with it.
Well, think about what a lot of tribal beliefs actually are.
They're shows of fealty to your own side.
They're proof to your compatriots that you value the group consensus over rationality or truth
itself, saying, look, there may be some counter evidence against this particular thing,
but I am such a loyal
disciple of this that I'm going to do it and it's a threat display to your enemies as well. It's
saying, look, this is me, my beliefs and I are one. And to your own side, any deviation from that
looks like a lack of commitment. And to the other side, it looks like weakness and a lack of
understanding of your own argument. So it doesn't surprise me that people become entrenched in things like
this. Speaking of which, you did some research about the Westboro Baptist Church, didn't
you?
Oh, in the book, I went to several, there's even a chapter that we took
out where I went into several conspiracy theory communities, and cults, and pseudo-cults. It used to be a
strong portion of the book, but I was like, I'm going to have to write three books to put all
of this into one thing. So we pared it down to, I spent time with 9/11 conspiracy communities,
and I spent time with the Westboro Baptist Church.
What is the Westboro Baptist Church, for the people that don't know? So Westboro Baptist Church is considered the most reviled hate group in America.
They are a Baptist church, like any other Baptist church.
There's thousands of them all across the United States, but this one in particular decades
ago began, at first, they were protesting, they claimed
that there were men who were hooking up in a park
in Topeka, Kansas.
And they protested it.
They sent stuff on fax machines to, like, senators and things.
And they just got some media attention from that. And that led to them seeking media attention and becoming really good at it.
And that escalated to the point where they were sort of protesting everything that they could
find. They eventually protested the funerals of soldiers coming back from war. And with
signs, they would say, thank God for dead soldiers because they were saying America
is corrupt and polluted and it all goes back to LGBTQ issues, which they're very opposed
to the very idea that someone would find someone of the same sex attractive or pursue romantic
relationship with them.
They became incredibly vocal protesters of that and they're really, really mean about it.
Most Americans discovered them when they protested the funeral of Matthew
Shepard, who was brutally beaten and killed just for simply being a young gay man.
They just protested his funeral, and they were like, yay, I'm glad he's in hell.
And clearly that made a lot of people go,
oh, wow, you are some awful human beings.
But they're so adamant in their protesting
and they started going online and doing protesting,
like, one-on-one on Twitter and Facebook and stuff.
It came about that a few of the people in the organization started to leave, and one of them in particular, Megan Phelps-Roper, she left after someone on Twitter
extended their hand to her and carefully, empathetically, compassionately guided her out of the organization.
And I wanted to talk to her and to scientists who study
what happened there because I just feel,
I felt like I can understand,
like I've been frustrated with family members
whose minds I want to change about all sorts of things.
The idea that somebody could leave an organization like that
after spending most of their life defending what they feel and believe, I wanted to know what it took. And
if that was something where there were insights for applying to all sorts of
things that we are frustrated with. And what I discovered there
was the same thing I found in every other person who had left a community of that nature, whether it was conspiratorial,
cult, religious, or whatever, I had this misconception that there must have been something that
caused them to leave.
And then they had changed their mind about the group and then they left.
But most often what happened was they left the group and then they changed their mind.
The influence of the group became less potent on them, and the goggles they were wearing got replaced by different
ones. But the way out of that, almost always, was someone who, first of all, didn't approach them with
hate and vitriol and I totally would understand of course you'd want to confront somebody that way.
They instead confronted them with
compassionate, empathetic, non-judgmental listening. They would ask them questions, they would not react poorly, they would joke around with them, they would encourage them to deeply
introspect as to why they feel the way they feel and then allow that person who they're treating
that way to discover the inconsistencies on their own.
And more than anything, they discovered
there were values they held,
that they could satisfy and affirm outside of the group
that they had grown up within
or the one they had fallen into.
And once the person got on that path,
it becomes an off ramp to get out of something
that was making them uncomfortable.
There usually was an inciting moment. With Megan,
as with many of the people that left Westboro,
the organization had made these changes where they were being oppressed in a way
they hadn't been before.
Like, they had made changes where women couldn't have certain rights and women couldn't
behave in a certain way.
Had to wear certain clothes.
They had to take on certain jobs,
they had to get a school for certain things.
Those things were sort of the crack that let in the light.
But there still had to be light;
there still had to be someone who was willing
to be empathetic and compassionate and listen to them,
that gave them an opportunity to entertain a possibility
that maybe my values could be better served
outside of this group. And once you get a taste of that and you actually do attempt to get outside
the group for a second and you get validation for it, there's a real strong off ramp to
get out of those groups.
Didn't you look at the Jim Jones massacre as well?
I did that for the podcast. Man, I really wish I could have put it in the book. I, I,
that deserves its own book.
Maybe one day I'll do something like that.
Pluralistic ignorance.
My favorite term in all of psychology.
That's what the Jim Jones thing was about.
Pluralistic ignorance, first of all, hard thing to say, hard words to say.
Even more difficult to define.
On the podcast, I let, like, 20 people give me their definition of this thing, because
every definition is different,
but it's all the same idea.
It's just a hard thing to put into words
without having to like come up with a million metaphors.
What it comes down to is,
it's when you have a group of people,
and most of the people in the group believe
that most of the people in the group believe something
that, in fact, very few of the people in the group believe.
But since most of the people in the group believe that most of everybody else in the group believes this,
everyone acts as if everybody believes it, and you end up following a norm that nobody likes.
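A minimal sketch of that definition in Python, with made-up numbers purely for illustration; the 80% figure and the conform-by-default rule are assumptions, not from the research discussed here.

import random

# Pluralistic ignorance, as defined above: most people privately reject a norm,
# but everyone only sees public behavior, so everyone overestimates support for
# it and keeps conforming. The 80% figure and the "conform by default" rule are
# illustrative assumptions.
N = 1000
privately_rejects = [random.random() < 0.8 for _ in range(N)]
publicly_conforms = [True] * N   # private doubts stay hidden

perceived_support = sum(publicly_conforms) / N       # looks like 100%
actual_support = 1 - sum(privately_rejects) / N      # really about 20%
print(f"perceived support: {perceived_support:.0%}, actual support: {actual_support:.0%}")
# Everyone follows a norm that roughly 80% of the group privately dislikes,
# because each person thinks they're the only one who dislikes it.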
And this is very common. We're all, you and I are both, doing it for something right now. Everyone
is experiencing this at all times in some way.
And it comes down to: we don't always explicitly state how we feel about things. And so we have to make an assumption based off of the cues that people present. And the cues that people
present may indicate, well, they must believe it. If everybody's doing it, they must actually want to
be doing it, right? So the foundational research in this was about drinking norms on campuses. Most freshmen do not like starting a new college career by
immediately having to get blackout drunk every weekend. Most people do not want to do that.
But most people do that. And most people do that for a couple of years. So why is everybody
doing something that nobody wants to do? Because everybody thinks that everybody else wants to do it except them. Everybody thinks that they're
the only person that doesn't want to do it. And this is rife all throughout politics. It's rife
through everything. It's a very fundamental thing that human beings do. And on the podcast,
You Are Not So Smart, I did a show about this, and I wanted to show there was fresh research
into this. There's an assumption that the Emperor's New Clothes scenario is how you bust this
up, but there's really good research from Rob Willer, and it shows it doesn't really do it,
like you can't count on the Emperor's New Clothes. Sometimes it works, but in the Jim Jones example,
there was one woman who stood up.
Usually, often what happens is it takes one person, like a stand-up comedian or something,
to just say out loud, hey, are we all really doing this? And then that creates change,
or it catalyzes the change. The Jim Jones one was the hardest episode I've ever done, because I had to
listen to that tape. I got it through legitimate sources, but you can just find it online.
If you want to get it, I have links to it on my website too, if anyone wants to hear it.
It's hard to listen to.
One of the reasons it's hard to listen to is because they copied over
their tapes all the time.
So there's this haunting, like, hymnal, spiritual music playing underneath everything
as if it's a music bed,
but it's just because they taped over it so often. And it's hard to listen to because
you hear everyone arguing about whether or not they're going to commit mass suicide,
and then slowly over time, everyone gets quieter and quieter until they're silent because they
are all dead. And it's not just adults. It was like 700 people and 300
children, which was awful, right? And they did it by taking cyanide inside of Flavor Aid,
not Kool-Aid, as we often say. Right, it wasn't Kool-Aid, it was Flavor Aid,
yeah. I guess Flavor Aid will forever be thankful that Kool-Aid is the more popular drink
because of that, but that's what they did.
Drinking the Flavor Aid, yeah.
Drinking the Flavor Aid.
The reason I did a show about it was that one woman stood up and said, hey, let's not
do this.
There's only one Jim Jones and there's a thousand of us.
Why would we kill ourselves and our children? I think this is crazy. Let's not do it.
And she was shouted down by the rest of the group. And she also died.
In the episode we go into detail about why you would be willing to die for a norm you don't believe in.
And it clearly demonstrates that our belonging goals trump all other goals in certain situations.
So we are social primates, and whether or not it's intentional,
There's a confluence of psychological phenomena that can become a perfect storm of mega influence upon us that can lead us to do things like mass suicides.
I mean, doesn't that show perfectly that social pain can be worse than physical pain?
We're willing to put ourselves in physical harm, put ourselves at risk of physical harm
if it means that our reputation will survive beyond us.
And we see that in, even when stakes are low, we see that when people are arguing anything
from gun control to climate change to fracking, people will do that.
But even when the stakes are really high, of course we see it.
We see it with the Jim Jones massacre.
We see it in all sorts of situations.
War, I mean, that's where it comes out the most, right?
I asked her about that, and my metaphor, just to see if I could make sense
of it, was I was imagining, like, if the ship is sinking, who am I going to put on
the lifeboat?
I was like, I'll put my social self in the lifeboat,
and then I'll go down with the ship.
And she was like, yeah, that's what we usually pick
if we're given that proposition.
Well, you can see how, I guess in some situations,
this could cause people tribally to do something noble.
You know, you might save the tribe
by stopping some predator that was coming toward it,
because even though you know that you're going to suffer physical pain or potentially death,
you know, you're going to do it, and apart from the drive that you might have for family,
and you've got sort of this reciprocal altruism thing going on.
But on top of that, you know that socially, it's going to do you pretty well.
People are going to be happy with you when you're at your funeral.
Yeah, I'm taking a note, because that was really well put.
Everything I have ever written about,
these things that we point out is like,
I can't believe people do this.
We're pointing out when these things go wrong
when they lead to bad outcomes,
but over a long timescales,
over the entire evolutionary history of our species, these must have either been neutral
or led to positive outcomes
because they are adaptive in some way.
Precisely.
You had to not be in a deficit with these things,
or else they would have been competed out of the gene pool.
The people that did the actions
that weren't effective at survival or reproduction,
are long gone.
And maybe some of those, I often think about this,
I often think what traits have arisen
and then been competed out that in 2022
would be amazing for people to have.
You know, like, like...
I've never heard that framing,
that's really interesting, I like that.
Well, think about, you know, like differences in risk aversion,
or differences in tribalism,
or different thresholds for making decisions,
or changing your mind, or holding on to, whatever,
all of that stuff.
There's been tons, tons, millions of different quirks
and changes in our psychological makeup,
and our physical makeup as well.
There'll have been somebody born at some point in history
who probably had 14 fingers.
And you go, well, I mean, with 14 fingers,
you'd be able to type really fast on a keyboard?
That'd be pretty useful.
But all of these things weren't useful at the time,
but I often think about what would have happened
if one particular strand had managed to sustain,
and then it would have become adaptive in the modern era.
Yeah, I like, that's great.
I like that.
I'm gonna steal that,
and I will attribute it to you as much as I can every time I do.
So, yeah, the thing that,
what happened in the Jim Jones massacre
was the result of things that were adaptive
over a long
evolutionary history being put into an unfamiliar, alien context all at once, and that created an unlikely
outcome. And that happens so rarely that we can absorb that kind of outcome as a species.
Every time we introduce these new contexts for us to behave in, we get something novel as an outcome,
but the behavior that's taking place is ancient every time. Clay Shirky, I remember, once
wrote that technology doesn't create behavior, it lowers the cost to exhibit behavior that was
already there, which usually is what takes place, right? So you see the way people act on social media,
even though it's almost become old man yells at cloud at this point, because, I mean, how long are we going to have social
media, how long does social media have to exist, before we stop considering it that new
thing that's weirding us out? It's almost like arguing about television at this point.
But we still haven't sorted out how to do it well. And eventually we
had, like, a president on Twitter doing weird stuff that made us really go,
maybe politicians shouldn't have Twitter. Like, we're still in that place, because there's nothing that
Donald Trump was doing on Twitter that politicians haven't done for a thousand years.
It's just that we were doing it in this new context, and there's nothing we were doing in response to it that we haven't done for a thousand years.
It's just that we were doing it in a new context that lowered the cost to do so many things, so we got different outcomes from it.
Even in the how minds change stuff,
people often ask, like, why do people resist changing their mind?
Because if you were to update when you shouldn't, you might become wrong,
or if you didn't update when you should,
you might stay wrong.
There's a real precarious tightrope there.
And in a different context,
that could lead to you being eaten
or not getting enough food to survive,
if you updated when you shouldn't. And so we're careful about it. Like, if your neighbor says, hey, I think
you should eat these berries, it's not much different than, you know, your cousin telling
you, maybe you should drink this bleach. And I don't know if I should change my mind
about bleach. But then again, wait, with COVID, maybe I should be careful. And so you
start to modulate it. So we walk that tightrope with everything
and that's very adaptive.
Is there a,
how would you say a set point or a current view bias
that people hold?
I was at this heterodox academy conference this weekend.
One of the guys said a quote,
I absolutely loved that said,
there are many ways to change the world.
Few will make it better, many will make it worse.
And the point is that the behavior and the beliefs
that you've had have led you up to this point.
And no matter how miserable and melancholic your existence
has been, you're still alive.
So you have some degree of proof that the current way
that you're behaving, the programming
that's causing you to move forward in the world, is effective.
I'm not dead because if I was dead,
I couldn't be thinking about whether or not to make this decision.
So does that cause people to have a safety net,
a degree of risk aversion for just thinking anything new,
simply because I know that what I do now has been effective so far.
Yeah, definitely. The short answer is yes. And the long answer, minding how much time we have,
is that there is a point called the affective tipping point, but
to make sense of that, we have to quickly talk about assimilation and accommodation.
Piaget put this forth. If you wanted to answer how do minds change, this is a pretty straightforward answer:
through assimilation and accommodation. Assimilation is when you encounter something
uncertain or novel or ambiguous, you disambiguate it, you arrive at some level of certainty, and
you make sense of it by assimilating it into your current model.
And the way I describe it in the book
is if a child sees a dog for the first time,
and you say, it's like, what's that?
You say it's a dog.
Something categorical happens in the mind.
It's like four legs, furry, non-human, wagging tail, dog.
Then they see a horse, and they've never seen one before,
and they just point at it and go, dog.
That's assimilation because categorically it seems like it fits into my model. Four legs, nonhuman, furry tail.
But if you say to the child, no, that's a horse, now something has to happen, which is,
there's a lot of cognitive load. It's a difficult process. You have to create levels of
abstraction. Neural networks have to do stuff. You have to literally expand your mind, because
you have to create a category in which both horse and dog can fit, which means you have
to create a new perceptual category, which expands the range of your understanding of
the world. That's called accommodation. Oh, animals can be named two different things?
That's crazy. We're doing this at all times.
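A minimal sketch of the dog/horse example, treating categories as feature sets. The feature lists are invented for illustration; this is just the assimilation/accommodation idea restated in code.

# Assimilation vs. accommodation, using the dog/horse example above.
# The feature sets are invented for illustration only.
categories = {"dog": {"four_legs", "furry", "non_human", "wagging_tail"}}

def classify(features: set) -> str:
    # Assimilation: try to fit the new thing into an existing category.
    for name, props in categories.items():
        if props <= features:
            return name
    return "unknown"

horse = {"four_legs", "furry", "non_human", "wagging_tail", "hooves", "large"}
print(classify(horse))  # "dog": the horse gets assimilated into the old model

# Accommodation: being told "no, that's a horse" forces the model itself to
# change, with a new category plus a broader abstraction both can fit inside.
categories["horse"] = {"four_legs", "furry", "non_human", "hooves", "large"}
categories["animal"] = categories["dog"] & categories["horse"]
print(categories["animal"])  # the shared features become the broader category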
Both things are happening at all times.
Some of our conversation makes sense to us
because we're assimilating it constantly
into what we already understand about things.
But at certain points, if I feel especially challenged
or I feel some dissonance,
I'm going to need to accommodate
to make sense of what's going on.
We typically resist the accommodation
because there's more risk in that.
Just exactly as you describe,
assimilation is way less risky.
Even if it leads me to be factually incorrect,
it doesn't lead me to being dead.
So I can be wrong about stuff.
It's better to think the rustling in the bushes
is a tiger than to go looking to see if it's,
maybe it's a rabbit, I don't know.
It's better to stay safe and err on the side
of being wrong about that.
But there is a point at which we can't do that.
There's a breaking point called the affective tipping point.
In the book, it's a long answer to what it is, but it comes down to, there is a level
beyond which counterfactual or counter-attitudinal information will cause you to feel so strongly
that you need to update that you will commit to an
update. And they call it the affective tipping point. It can be above 30%. For
raw, neutral, politically neutral information, it's around 30%: 30% of the evidence coming in has to
counter the evidence that you've already seen. But if you add social costs, if you add
motivations, if it's like my job depends on it or something, that number can go way higher than that.
But there is a tipping point; it's just that the number can go up depending on how many motivations
you have to not change your mind about something.
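A minimal sketch of the tipping-point idea: the roughly 30% baseline is the figure from the conversation, but the way social cost and identity stakes raise the threshold here is a made-up linear bump and cap, not the model from the book.

# Toy model of the affective tipping point described above. Baseline ~30%
# counter-evidence for neutral topics; motivations push the threshold up.
# The 0.4 multipliers and the 0.95 cap are illustrative assumptions only.

def tipping_threshold(base: float = 0.30, social_cost: float = 0.0, identity_stakes: float = 0.0) -> float:
    return min(0.95, base + 0.4 * social_cost + 0.4 * identity_stakes)

def updates_belief(counter_evidence_share: float, **stakes) -> bool:
    return counter_evidence_share >= tipping_threshold(**stakes)

print(updates_belief(0.35))                                        # True: neutral topic, enough counter-evidence
print(updates_belief(0.35, social_cost=0.5))                       # False: same evidence, but friends/job on the line
print(updates_belief(0.80, social_cost=1.0, identity_stakes=1.0))  # False: threshold capped near 0.95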
So that's how people's minds change: there is a degree of information coming in that is so overwhelming that they basically can no longer remain unconvinced
of that position. But the problem is that when you layer on top of it tribal bias, all
of the other reasons that people have to not change their mind, that threshold can
continue to go up and up and up and up.
I remember Gad Saad was talking about, he does marketing and consumer stuff, I think,
at wherever he is, Concordia or something. And he was talking about, he wanted to know what
caused people to decide when they were going to buy something. And I think
it was a similar sort of tipping point that people reached, like a threshold, right?
Yeah. And it's risk versus reward, like you were saying,
but it's like this big meaty algorithm of every risk, every reward combined. Because if I
change my mind about this, what are my friends and family going to say? That's part of
it. Like, how's this going to affect my finances? That's part of it. Or, well, I'd have to update everything I've ever thought about this thing. One simple example is,
if you have a friend who's really into Quentin Tarantino movies and they tell you,
hey, Quentin Tarantino has a new movie coming out, you'll be like, cool. They might be wrong about
it, but you just say, what's the risk? What's the harm? I believe you. However, if that same
friend tells you, do you know
that Quentin Tarantino isn't actually a real person, it's been Danny DeVito this entire time,
wearing a hyper-advanced mech suit? You're going to be like, there's a higher threshold for believing
that, even though you trust your friend and know they love Quentin Tarantino. But somewhere
in that meaty algorithm, you've determined there's more risk than reward in accepting that
information at face value.
You change that to a political concept or something that's deeply connected to your identity,
and you could imagine that you might have a similarly high threshold for accepting that
information into your model of reality.
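As a companion to the earlier sketch, here is a minimal Python illustration of that risk-versus-reward weighing, assuming a simple additive cost model. The cost categories echo the ones mentioned above (social fallout, finances, how much of your worldview you'd have to revise); the numbers and the acceptance rule are hypothetical, not anything from the conversation or the book.

```python
# A minimal sketch of the risk-versus-reward weighing described above.
# The categories and weights are hypothetical, chosen only to contrast
# the two Tarantino examples.

from dataclasses import dataclass

@dataclass
class Claim:
    trust_in_source: float   # 0-1, how much you trust the friend
    social_cost: float       # what will friends and family say if I adopt this?
    financial_cost: float    # how does this affect my livelihood?
    revision_cost: float     # how much of what I believe must be rewritten?

    def accept_at_face_value(self) -> bool:
        """Accept when trust in the source outweighs the summed risk."""
        total_risk = self.social_cost + self.financial_cost + self.revision_cost
        return self.trust_in_source >= total_risk

# "Tarantino has a new movie out" -- low stakes, easy to accept.
new_movie = Claim(trust_in_source=0.8, social_cost=0.0,
                  financial_cost=0.0, revision_cost=0.1)
# "Tarantino is secretly Danny DeVito in a mech suit" -- huge revision cost.
mech_suit = Claim(trust_in_source=0.8, social_cost=0.3,
                  financial_cost=0.0, revision_cost=0.9)

print(new_movie.accept_at_face_value())  # True
print(mech_suit.accept_at_face_value())  # False
```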
What did you learn about arguing?
Arguing, you know, it's good.
I like that we're arguing so much on the internet.
Arguing is how we change minds.
And we are set up.
We're biologically set up to receive and deliver arguments
in a way that gets us ahead in the world,
that changes things, that progresses the human condition,
that evolves our ideas, that layers on new abstractions for us to make sense
of things, that articulates the ineffable, my friend. This is all good for us. It's just
that we're not arguing in the same sort of context in which we evolved to do so. I think the frustration we have when it comes to people who don't seem to see things our way is like the frustration you would get if you were trying to reach the moon with a ladder, is what I say in the book. And then you say, well, the moon's unreachable, so what's the point in trying? We do the same thing with people. You say, look, those people are unreachable. There's no point in arguing with them. But the frustration shouldn't be with them. The frustration should be with the way in which you're trying to approach them.
And if you are attempting to argue in a way where you're trying to show that I'm right and you're
wrong, and you can assume that they're going to be doing the same thing, the debate is going to
end in nothing useful for either party. But come on, I mean, people have conversations where they try the most gentle, subtle,
most well-thought-out, softest introduction, and the person on the other side of the fence just impressively seems to be able to disregard, with vitriol, everything that they're saying.
Sure, but that's your fault.
That's your fault.
That's the fault of the person who's arguing with them, because we are all open to changing our minds if the conversation is presented to us in a way in which we feel like we can trust the other person, that they have empathy for us, that they are compassionate, that they are transparent, that they're not trying to affect our agency, that they're not going to shame us or ostracize us, and that we might get something out of it.
Does that not suggest that not engaging is the best option a lot of the time? Let's say that there is a large proportion of people on Twitter who, for certain subjects, will not change their mind. It is simply a medium on which they are unprepared to change their mind, given this topic or that topic or the other one. The threshold that you need to breach with them is higher than the bandwidth Twitter allows you to get across; it simply isn't going to allow that to happen.
You are absolutely correct. I love that you said that. You're the first person to ever put it that way in any conversation I've had with anybody about this, and it's really well put. It's a great point. Yeah, we can't do this unless we're in a good-faith environment, and sometimes we don't have a choice when it comes to creating that good-faith environment. There are some environments that are, by default, bad places to have that kind of conversation. And acknowledging that upfront is great. And you could take that
same person that beforehand it would be a bad idea to engage with and say, maybe we should take this somewhere else, or we should move this to a different context. And I'm not saying
move it to one of those stupid debates that people have on
stage between like an atheist and a creationist, which helps no
one. Everybody walks into that thinking the same thing that they do when they walk out.
Don't do that.
You need to sit down with someone, and that needs to be in a place where there's not an audience, where people are not concerned
about what's going to happen when their friends and family or their peers see what they have
to say about it, which is also what happens on Facebook.
People on Facebook are arguing while feeling like they're on stage, with a whole audience ready to go boo whenever they say the wrong thing. One-on-one in some way or another, whether it's email, old-fashioned
missives, or at a bar or a pub or a dinner, that's where you actually could connect with someone
using these old things. Until we create online environments that replicate that, you're right.
It may be a bad idea to engage people in places that by default
create a bad faith interaction between you and the other person.
From all of the research that you did, what is your favorite strategy to help people positively
change others' minds?
It blew me away. Motivational interviewing is probably the most powerful thing there is, but it comes out of the therapeutic models. What really blew my mind was finding that there were all these organizations around the world who had stumbled into the same insights as that therapeutic model, because they were just doing A/B testing on people's front steps, or they were doing it on college campuses.
In the book I talk about deep canvassing and street epistemology, but also motivational interviewing, cognitive behavioral therapy, and Smart Politics. All of them use pretty much the same technique in pretty much the same order, which is pretty much the Socratic method, but with a lot of added psychological insight put into it.
lot of added psychological insights put into it. You open up by establishing rapport and trust and assuring the other person that we're going
to just talk to each other and explore each other's reasoning in a way that helps both of us
maybe figure out why we disagree about this.
Where does it come from?
And then you ask for a specific claim, you ask for sort of a measurement of confidence in that claim, you ask for the reasons why you hold that confidence, and then you ask, well, how did you arrive at the conclusion that that was a good reason to hold this with such confidence? And then you just let the conversation spool out from there, and often all the work will be done on the other side, and they will come to see it in a new way. And seeing something in a new way that you didn't before the conversation, that is changing your mind.
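For reference, here is a rough outline of that shared step sequence, expressed as a small Python sketch. The step names follow what David describes above; the prompt wording is a paraphrase for illustration and is not a script from the book or from any of the organizations named.

```python
# A rough outline of the shared conversational structure described above:
# rapport -> claim -> confidence -> reasons -> how those reasons were reached.
# Prompts are illustrative paraphrases, not official scripts.

STEPS = [
    ("build rapport",       "Assure them you just want to explore each other's "
                            "reasoning and figure out why you disagree."),
    ("get a claim",         "Ask for one specific claim in their own words."),
    ("rate confidence",     "Ask: on a scale of 0 to 100, how confident are you?"),
    ("ask for reasons",     "Ask what reasons support that level of confidence."),
    ("examine the reasons", "Ask how they concluded those were good reasons to "
                            "hold the claim with that much confidence."),
    ("let it spool out",    "Listen, reflect back, and let them do the work."),
]

def run_conversation() -> None:
    """Print the outline; in practice each step is a real back-and-forth."""
    for i, (step, prompt) in enumerate(STEPS, start=1):
        print(f"{i}. {step}: {prompt}")

if __name__ == "__main__":
    run_conversation()
```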
At that same conference, that same lecture I was at this weekend, he explained how a lot of the students that he has are on the far left. And he puts this quote up on the board, using the Socratic method, to get them to be aware that always pushing for social justice is not necessarily the best way to do things. It's this big quote, and it says something to the effect that the inner health of the individual can never be justified until the external social justice of the society has been achieved, something like that. How many people agree with this? And it's the sort of thing that everybody would agree with. Then he presses one button and below it comes up: Adolf Hitler, 1920. So funny.
Wow. David McRaney, ladies and gentlemen, I really appreciate your time. If people want
to keep up to date with what you're doing and get the book, where should they go?
You can find me at davidmcraney.com or You Are Not So Smart. Of course, my podcast and all that stuff is under You Are Not So Smart.
How Minds Change, the new book: How Minds Change: The Surprising Science of Belief, Opinion, and Persuasion. I had to actually look at the book just now to remember what the full title was.
It's an actual, real thing that I can hold.
You can find it at davidmcraney.com or just anywhere that books are sold. I'm not going to tell you where to get it, but if you pre-order it, I've got all sorts of pre-order bonuses for you, including a workshop and a video of a bunch of persuasion experts talking to each other, so pre-ordering would be ideal, and that's how the algorithms determine my destiny.
But however you get it, the book is called How Minds Change.
David, I appreciate you. Thank you.
Thank you so much, man.