Offline with Jon Favreau - How to Talk Your Uncle Out of QAnon this Thanksgiving
Episode Date: November 20, 2022
Beth Goldberg joins Offline to discuss her work at Jigsaw, the misinformation-tackling team at Google that's been called "the Internet's justice league." Goldberg walks Jon through the dos and don'ts of drawing your QAnon cousins, election-denying uncles, and vaccine-skeptic grandmas out of their conspiracy rabbit holes this Thanksgiving. By pre-bunking, seeding doubt, and listening with compassion, together we can hash it all out. For a closed-captioned version of this episode, click here. For a transcript of this episode, please email transcripts@crooked.com and include the name of the podcast.
Transcript
You know, very last sort of tactic I've seen work really well is we can help people find alternative communities.
Right. The main thing they're seeking with these conspiracies is a movement to be part of.
Then let's help them join some sort of local movement that's actually doing good in the world.
Right. Let's have them, you know, foster dogs.
Or we had one woman go from running QAnon groups to running crystal groups.
And I was like, great, you're still getting into the magic of things.
I like that for you.
But this is totally healthy in terms of it's not harming anyone.
It's not spinning far-right conspiracy theories.
You're not doing any damage with crystals.
I'm Jon Favreau. Welcome to Offline.
Hey, everyone. My guest today is Beth Goldberg,
head of research and development at Jigsaw, which has been called the Internet's Justice League.
So this is our final show before Thanksgiving, which means we're all just days away from an annual internet tradition, talking about how to talk about politics with our relatives.
Just endless amounts of content on this topic. So we figured we'd try to offer our own Offline
contribution to the genre. A conversation with an expert who's actually studied how to pull
QAnon posting relatives out of their conspiracy rabbit holes. Beth is a behavioral researcher who
leads R&D at Jigsaw, a project at Google that helps develop solutions to fight the Internet's biggest problems.
Hate, harassment, misinformation.
Around Thanksgiving last year, Beth wrote a piece about how we can help relatives leave behind QAnon and other conspiracies.
She's also done a lot of research into how we can inoculate ourselves against conspiracy theories in the first place.
I talked to Beth about all of this.
She taught me why people fall down these rabbit holes, why it's hard but not impossible to pull
someone out, and how we can show compassion and empathy at the Thanksgiving table this week.
As always, if you have comments, questions, or concerns, please email us at offline at
crooked.com. And please take a moment to rate, review, and share the show.
Here's Beth Goldberg.
Beth Goldberg, welcome to Offline.
Thanks so much, Jon. Glad to be here.
So ever since 2016, I feel like one Thanksgiving tradition
is talking about how we talk to relatives who have different political views or believe in conspiracy theories or generally just post wacky shit on the Internet. So I wanted to talk to someone who's looked into how we might approach those conversations in a way that not only keeps the peace at the dinner table,
but potentially persuades family members to see the world a bit differently.
So I'd love to start with an excellent piece you wrote last year around Thanksgiving
on how to talk to loved ones who've become QAnon believers.
First off, is it crazy to even want to have these conversations?
No, I think we need to. You had a great conversation a month or so ago with Anand
where you talked about, you know, we need to be in the ring. We need to try to be persuaders.
And there is no one who is, quote, irredeemable, right? Everyone has that ability to change their
mind at some point. We just have to treat them like human beings still, even when they say really vile, upsetting things.
So really, the onus is on us to, like, keep coming back to that table.
So I want to focus on how people get out of the conspiracy rabbit hole.
But first, I'd love to know from your research, like, what are some of the main reasons people are drawn to
conspiracies? Are there certain social or psychological factors that make certain people
more susceptible? Yeah, when I started this work, I had all sorts of assumptions. Like, I think a
lot of us do that, oh, conspiracy believers are maybe not as smart, or maybe they come from a
certain type of background. Oh, they're all, you know, tinfoil hat wearing people in rural areas or, you know, some crazy stereotypes.
I have honestly been blown away over and over again by how diverse and how, quote, normal and prevalent conspiracy believers are.
They're just looking for community.
But in the U.S., and we found this over and over again in these polls, over 50 percent of
Americans believe at least one conspiracy at any point in time, and something like 30 percent of
Americans endorse these claims that, like, there's a secret cabal running things that we're not
informed about, right? So there's this constant sort of underlying hum of conspiracy
belief everywhere, so it's not really unusual. And when people are drawn to it,
they're honestly looking for people who are like-minded, who aren't going to judge them,
and who are maybe a little bit confused by some of the really complex happenings in the world,
like, I don't know, a global pandemic. And they're looking for these sort of easy answers.
And they're looking for people to share their grievances with and to sort of, like, commiserate together. Honestly, we all do it. For some folks, they're really finding that comfort
in these communities that are giving them those easy comfort food answers.
Obviously, people have believed in conspiracies since the dawn of time. And I've sort of seen like
mixed research on whether there's like a greater proportion of
people who believe in conspiracies these days. I'm always looking for sort of the larger,
you know, societal factors that are contributing to this. And of course,
because this is a show about all the ways the internet is breaking our brains. I wonder if
there's evidence that the internet and social media have led more people to
embrace conspiracy theories. You know, I've heard this before that we're in some golden age of
conspiracy belief. And I wish we had some really neat survey someone had done like 100 years ago,
so we could just finally sort it out, have a before and after. That data doesn't exist. But
I don't actually think we're seeing more conspiracy theories now than before.
From the little bits of data that we do have, we know that conspiracy belief was super duper high at various points in the past.
So I'll give you one example where there is data.
JFK's assassination.
After he was assassinated, some insanely high number, over 50% of Americans, believed that it was a conspiracy
and not the lone-gunman assassination
that we now have lots of evidence about.
So I think, you know, it's certainly the number of Americans who believe
or number of people on Earth who believe in conspiracy theories
likely ebbs and flows with these really big, complex events
that we can't fully wrap our heads around.
But it's always been a
little bit higher than we think. But what's making us think it's a golden age now is the internet.
The internet is amplifying the fringe, right? The fringe has never been more accessible.
If you think about the pre-internet days, how did people learn about conspiracy theories? It was
like, well, maybe they picked up a weird book at the library, or maybe their friend introduced them to some like super niche movie that like their parents
didn't want them to have. And they snuck it in, right? Like, but it was a little bit taboo,
and it was much harder to access. And so now we have this amplification and this accessibility of
more fringe ideas than ever before. And so that makes it feel like,
oh, we're in this moment where
there's more of it. You mentioned the pandemic. I've always wondered if times in history where
there's more social disorder, loneliness, alienation, whether it's caused by economic or political factors, that sort of leads people to, A, like you said, want to find a community, and B, sort of make sense of an increasingly disordered world.
Yeah, I want to pick up on the word alienation that you used. And I'll step back. My research team at Jigsaw has done
research now with over 100 conspiracy theorists where we really sit down with them at their
kitchen tables. We meet them at their place of work and we do these really deep get to know you
style interviews. And over and over again, you know, from meeting with flat earthers in Idaho
to, you know, people who believe in a great replacement of white folks in Tennessee and elsewhere.
You know, we're hearing these patterns, no matter what they believed in.
First off, isolation. They were feeling separated just from friends and family.
Right. These were folks who didn't have strong social networks.
Alienation from power. They didn't necessarily understand how decisions were made.
They didn't know why,
you know, the tax rate changed, why they had to wear masks. And so there was this sense that they weren't part of that decision making process. You know, who was up there in Washington making
those decisions and even more local decisions, or big pharma, right? They didn't understand, like,
how they made this particular vaccine, right? How did
it get done so fast? They're alienated from that whole system of power. And then lastly,
that breeds mistrust. And so you have these really popular narratives now around mistrusting
big institutions, particularly those that you feel alienated from.
I saw that former believers have called believing in conspiracy
theories addictive. Why is that? And how much does addiction sort of play into a lot of these
people's sort of reason for embracing conspiracy theories? Yeah. And I first want to say we wrote
about addiction in that slate piece. And I only use that terminology because actually a number of
the conspiracy believers that I met with used that term themselves to describe their own relationship
with conspiracy or misinformation content online. I was really blown away by that. And it does take
on the sort of negative connotation in part because people feel out of control. And so these conspiracies are sort of a bit of a solve on that need for control and
for answers that they have. And so, you know, why is it so addictive for some folks? Honestly,
it starts with social ties, right? If you're feeling isolated and alienated, and then you
suddenly have this whole group of people online who are sharing this idea. And then you're part of that
community when you share that idea. And when you start to like each other's content, you start to
share each other's content, suddenly you have human connection that you've actually really
been craving. And so community is the main thing. And the more that you build that online conspiracy
community, the more you become alienated from the rest of your previous life community, right? The
more, you know, your family members stop talking to you over Thanksgiving dinner
or your coworkers start to think you're a wacko.
And so there's that segregation that happens
where people really start to become reliant on that conspiracy community.
It's also addictive because it's a rush, you know?
It's a rush to be in on the secret.
It's so funny, though, that this biological need for human connection that almost every human being shares,
oftentimes we think about it as like,
oh, it can lead to all kinds of positive outcomes.
But in this sense, it also can lead to finding people,
finding a community that can be quite destructive.
Absolutely. You know, these communities can really fan the flames of a sense of injustice or a sense
that you've been wronged for some sort of whatever the crisis is. You know, you lost your job because
of the pandemic because you thought, you know, the lockdowns were too strict. That was a major
moment for a lot of people to turn to conspiracy theories
and these conspiracy communities
because other people shared that grievance.
Other people had just been laid off
from their places of employment as well.
And so you get this victim complex
and that can really fuel a lot of dangerous things
in the name of self-defense, right?
I mean, a lot of people that showed up on January 6th,
they thought they were legitimately defending their way of life.
That was coming from this sort of moral crusade of like,
I'm doing the right thing, right?
And it's conspiracy theories that help flip reality on its head
and they help you see yourself as a victim,
as part of this community of victims. So you spoke with dozens of Q formers, people who believed in
or were members of the QAnon movement. What did you learn about how people leave QAnon
and why they stopped believing it? Yeah, we spoke with dozens of Q formers.
And even before that, we've met with folks who were COVID conspiracy believers.
And just recently, we've done some more work with some election deniers.
So really gone as broad as possible to understand people's relationships
with conspiracy theories in the US and beyond.
And I'll first say, you know, this wasn't a homogenous group of people by any
means. You know, they came from all different age groups. I met some older folks who had been
Obama supporters. You know, this older woman in New Jersey was Latina, and she was telling me
all about how she'd worked on the Obama campaign. And then in the next breath was telling me that
he's part of the Illuminati. Right. So really, you know,
subverting some of those stereotypes.
But they fell into sort of two rough buckets of the types of belief that
they had.
And that impacted how they left the conspiracy.
So the two buckets,
you can think about it with a sports metaphor of like fanatics and followers,
right?
You've got those people who are the diehard fans.
They're going to be out there when it's three degrees out.
They're going to be screaming at the top of their lungs with face paint.
The fanatics.
And then you've got the followers that are like, yeah, I'll check the score on my phone
after the game.
They want to follow along and they want to be part of the crew,
but they really only show up for the Super Bowl.
Got it.
Yeah.
So we met with these guys and the fanatics are the ones who would be self-described
addicted to the conspiracy belief.
They're spending like eight hours a day online sharing the stuff, reposting the stuff.
For them, getting out looked like breaking an addiction.
Right. They had to actually have a moment of self-realization that what they were doing wasn't healthy for them.
It wasn't healthy for those around them.
And for a lot of them, it was being confronted
by a loved one. And, you know, there's a story of a woman we met with in California. She had
been a major Bernie supporter, had fallen down this sort of QAnon rabbit hole. And her husband
actually had to confront her, you know, gently, but say, hey, this is making you unhappy.
And it's taking away time that you could be spending with me,
you're spending eight hours a day online, like I don't even get to spend time with you anymore.
This is how this is affecting the people around you, you know, and sort of mirroring that back
to them can be really helpful. For the other folks, for the followers, there's no one path out.
But for a lot of them, they still have some tether to reality. And so you want to plant more
of these seeds that can become tethers back to reality, right? More little seeds of doubt,
if you will. Some examples were, you know, people who believed in QAnon had been told that there
were all of these prophecies about a certain date when something would happen. And then when they,
when that date rolled around and the prophecy didn't
come true, they started to accumulate doubts. And so I had a few folks that I met with say,
hey, by the third time the prophecy didn't come true, I was really having my doubts.
And it takes a couple of those moments, a couple of those seeds to really germinate before the
followers realize that maybe they've been duped. And so if you're going to be the one
delivering those seeds of doubt, you have to do it with a lot of gentleness and know that it's
never going to be just one that's going to totally get them out. They have to kind of stew on them
for a while. You can very gently sort of ask probing questions of, you know, maybe a loved
one at Thanksgiving and say, hey, you know, I heard that this was supposed to happen at the election. Did that prophecy come
true? I'm interested in the election deniers. What were some of the stories of how they
sort of got out of that rabbit hole? Not a lot of them have fully, at least not a lot of the ones
that I've met with. We didn't do a survey of absolutely everyone. So I'm speaking from a
small group of folks we met with. But for a lot of them, they came to election denial because they felt something had been
taken from them. And they were really angry, right? They were finding this community of people
who shared that grievance. And so when they no longer really felt that anger, they kind of slid
into an adjacent rabbit hole, right? It would be,
they would be really upset about abortion, or they'd be really upset about immigration,
or they'd be really upset about something else that they could still have community and purpose
and something to get them riled up and, you know, posting angry tweets about. But they were able to
sort of just replace whatever that focus was. Now, I think a lot of folks are still, you know,
believe something happened around the election here. And we did this in both places, talking to election deniers in
the US and Brazil. There's definitely still a lot of belief in both places that fraud happened that
hasn't totally been abandoned yet. But we anticipate, similar to say COVID denial,
that folks are eventually just going to slip into something else that fills that same need for them. I'm just thinking about how we just had a
midterm election. And so far, most of the election deniers, the big lie believers who lost,
did actually concede. Does that have an effect, do you think,
on people who believe some of these conspiracies?
Like if they start to see other people
who once believed what they believe saying,
okay, you know what, I'm going to concede.
Is there any evidence that that has an effect?
It might, especially for someone like a follower
who's doing it because their group of people online
is a big fan of, you know, Mike Lindell, or a big fan of whichever political candidate they were excited
about who lost. And if they no longer have that, you know, captain of the football team rallying
them, they may join a different team, right? Those are sort of more weakly held ties for some of
those folks who jumped on the bandwagon because friends of theirs were already on the bandwagon.
Those are the followers. For the fanatics, I don't think they're going to give up that easily.
Right. They're looking for these really big patterns because for them, these conspiracies are part of a worldview.
It's not just a one-off conspiracy theory that their particular representative had their particular House race stolen. For them, they're seeing that as part of this larger plan by the cabal, right?
And so for them, okay, yeah, maybe this one race wasn't stolen, but the cabal is still out there,
you know, machinating and stuffing other ballot boxes, right? So they'll find a way to hold on to
that. So you see a loved one go down this path, you want to help them. What are some
strategies that don't work? What are some strategies that do work? You mentioned sort of
placing seeds of doubt. I would love you to sort of just expand on that if you could. Yeah, well,
I'll start with what not to do. You asked, you know, what are some strategies? Yeah, facts don't
work. Debating people on the facts.
You know, there's that great line by Ben Shapiro who loves to say, you know, facts, not feelings.
Yeah, facts don't care about your feelings.
It's actually the reverse.
You know, people come to conspiracies because of feelings.
They're not coming because of the facts.
They're coming because they feel, right, alienated, isolated, and mistrustful. And so they need to feel heard. They have some sort of grievance
that's driving them. So the very first thing to do is actually just listen to someone.
And you're listening on two layers, like on the high level, what they're actually saying,
there's probably going to be a lot of garbage and you're just going to have to like, you know,
ride it out, be entertained by it. But you're listening at a deeper layer at the same time for
their underlying fears. What's actually driving them to believe in something like, you know,
that the vaccine is going to, to paralyze them? What are they, what are they actually afraid of?
Right. Are they really afraid of paralysis, or is there something else that they've been told over and over again about the way that big pharmaceutical companies are hurting
people who look like them, right? Is it tapping into some other deep-seated fear? And then the
second thing you can do is you can actually acknowledge some of the half-truths or some of
the tangential truths in what they're saying, because every conspiracy theory does have a
seed of truth, right?
There is some sort of little piece in there that you can find common ground on.
Especially, like, if people are really laying into, like, historical conspiracies,
man, there is so much stuff you can acknowledge and meet them on and be like,
yeah, the CIA really did mess up, you know, when they, you know,
when we funded the Iran Contras or the Nicaraguan Contras
or when we overthrew the Shah in Iran. Like, yeah, man, that was a pretty messed up conspiracy.
And so, you know, building some of that common ground and making them feel heard is absolutely
the bedrock to having any of these harder conversations where you're planting the seeds
of doubt. For planting the seeds of doubt, a few things that I've heard work pretty well is,
you know, you're pointing out these false predictions.
You're pointing out hypocrisies, you know, where they're trying to hold two things in tension that can't possibly both be true at the same time.
But one of the things that I was really intrigued by in some of the conversations we had was pointing out the relationship between the conspiracy that they believed in and some
really unsavory characters or some violence and saying, hey, you know, this QAnon thing,
that's actually motivated people to blow things up or that's motivated people to plot assassination
attempts. And, you know, have you have you read the story? You know, are you are you OK being
in cahoots with this type of character?
And that really makes people do a double take and kind of reflect and say, oh, wow, you know, I came to this because I wanted to save the children from the sex traffickers.
You know, I didn't want to support the, you know, the guy blowing up subways or whatever it is.
That's really fascinating.
Yeah.
I also think the point you make about fact checking is so important. We spend so much time trying to make sure that something is debunked or fact checked, or that a headline is written the right way, or that an outlet reports that Trump just lied, thinking that that will have this big impact.
And of course, it's very important that you do report the truth, that everyone does.
But I think in terms of making an impact and persuading people to believe differently, it seems like simply giving
people the facts is just not sufficient. Yeah. I mean, you've talked before here about how people
are so calcified right now, right? Like if you're entrenched in your position, you've closed your
ears. You are not open to what the other side has to say at all. If anything, you're listening so
that you can improve your own ammunition and your own rebuttal. Right. And so I think there's a little bit of that that can happen at a family
dinner table where people are listening only so that they can, you know, pick a fight. And I think
we really have to change that approach. Right. And really come with a lot more compassion because
people aren't turning to conspiracy theories when they're doing well. People are turning to
conspiracy theories when they're kind of down and out, right?
When you're really struggling with something.
And so even when it's exhausting, you have to just sort of listen with them.
The other thing we can do, you know, very last sort of tactic I've seen work really well
is we can help people find alternative communities, right?
If the main thing they're seeking with these conspiracies is a movement to be part of,
then let's help them join some sort of local movement that's actually doing good in the world, right?
Let's have them foster dogs.
Or we had one woman go from running QAnon groups to running like crystal groups.
And I was like, great.
You're still getting into the magic of things. I like that for you. This is totally healthy in terms of, you know,
it's not harming anyone. It's not spinning far-right conspiracy theories.
You're not doing any damage with crystals. That's very interesting. And I do think that's where
a lot of people who are involved in political organizing sort of pursue those kind of tactics
and strategies versus a lot of sort of the online posting wars. If you're actually trying to get
people to join your campaign or to join your movement or to join your organization, you're
out there trying to make it seem fun and joyful and interesting because you're trying to build a rival community to whatever community they belong to.
These are some incredibly talented movement builders, right?
They've actually built some really useful transferable skill sets.
They can put these on their resume.
No, maybe they shouldn't do that, but they should use them to build other communities, right? Whether it's, you know, online communities or offline communities, they can really repurpose this sort of movement building and sense of purpose and sense
of belonging. You've done some fascinating research on another strategy to fight conspiracy
theories called pre-bunking. What is pre-bunking? Yeah, pre-bunking is fascinating. It's this really
promising approach to get ahead of misinformation, right?
We spend so much time talking about debunking.
You know, you were just noting how we've been trying to fact-check
our way out of this for a long time.
Right, we have.
And we're a little bit too late when we're debunking, right?
In part because we know misinformation is really sticky.
It sticks in our brains really well, often because it's really sensational. And also because we have
this first impression bias. So if someone's first impression of the misinformation is the
misinformation itself, like that's going to have an advantage. If we can get to someone first and
warn them, a pre-bunk is this preemptive warning that says, hey, you're likely to encounter something
that looks like this next time you're online. This is manipulative. And here's why. It actually is
using the metaphor of a vaccine. It's like a vaccine for your brain. And as a fellow political
nerd, I feel like you'll appreciate the origin story behind pre-bunking. So there is this guy,
William McGuire in the 1960s, he's a
social psychologist. And he got tapped by the US government who said, hey, we have physical armor
for our GIs going overseas during the Cold War. We want them to have mental armor against this
communist propaganda. Can you help us come up with mental armor for GIs? And so William McGuire
comes up with this thing he calls
inoculation theory. And he actually sells it to the US government as this vaccine for brainwash,
which I love. Wow, that's literally what it's been used for. In the, you know,
many decades since it's been applied across environmental movements, it's been used in
marketing, it's used in a lot of public health campaigns.
And we're just now realizing the promise
of what this looks like in a digital age, right,
where we can put it into really short-form snippets
like videos and infographics and memes
and disseminate it really widely online
and get out ahead of that really fast moving misinformation.
Well, so you guys at Jigsaw created a couple of these videos and then you conducted a study where more than 20,000 people, I think, watched them.
Let's take a listen to just a clip of one of them.
You might think about skipping this ad.
Don't. What happens next will make you tear up. Kidding. You just got tricked.
So, fascinating video.
Can you tell us more about the study and what you guys found?
Yeah, so we teamed up with a number of academics who have been studying this technique for a while now.
We worked with folks at the universities of Cambridge and Bristol on this.
And we wanted to understand, can we get way out ahead of political misinformation in particular by identifying these common techniques that manipulators, propagandists use?
Right. What are some of the most common things that we can we can anticipate they're going to use every election cycle?
We can anticipate they'll use in, you know, generic political propaganda.
And so some of these like the ones in that video are like emotional manipulation, fear mongering, super common.
And so we wanted to know, can we teach people in just 30 seconds how to identify what that looks like and then spot it themselves when we show them examples.
So we did these tests first in a lab setting where we sort of paid people to sit down and answer our questions and watch our videos.
And we were really heartened by what we found because across different types of Americans,
we had a representative sample of old folks and young folks and different demographics and different
education levels. These videos worked. People were better able to identify when something had
emotional language. They were less trusting of it and then they were less likely to share it.
So we were like, okay, this worked in a lab setting where we paid you to pay attention.
Let's take this to social media. Let's go to the wild. And so we took that video
that you just heard, and we basically bought ads on YouTube and we showed it as an ad to people,
to over a million Americans. And then we followed up the next day with a quick little survey right
in the YouTube player to see if people remembered what they'd seen the day before. And we really were testing, you know, did they learn how to identify fear mongering? And we found that on average,
five to 10% of people were better able to identify those manipulation techniques,
which I know sounds really small. But if you think about it, that's a million people.
Yeah.
5% of them paid enough attention to an ad, which I never do.
No, I know.
They're very compelling videos.
I was watching them last night in preparation for this.
I was wondering, like, I noticed that there are some limits to this, to the effectiveness,
and that it's, I think I read that it was a little more challenging when it's about
politics specifically, and that the retention is, like, between a
week and a month, but then you kind of have to keep hitting people with this stuff or else they'll
forget about it. Yeah, I'll take the timeline limitation first. This is just like a vaccine
for your body, where the resistance you gain from that, the antibodies you build, they wear off over
time. Same thing here with your mental antibodies. You get them from these videos for a little while, but like with any intervention,
like any sort of mental educational intervention, it's going to decay. And so what we found from
these particular videos, they last for a solid 10 days, and then we're going to start to see decay
over the course of a month. But just like
a vaccine, you can give people a booster. So one really cool study we have coming out, and some
other folks have been testing this as well, is can we give you even like a 10, 15 second reminder,
a little booster of what you saw in that first video, and get you all the way back up to full
mental immunity. And we found that at least in the lab setting that that works really well. And so definitely some more research needed on that.
Now, does it work on political topics? When people have really strongly held opinions on
something, right, where if you've totally bought into this narrative, I'm really unlikely to
convince you in a single 30-second video that you're wrong. Right, it's supposed to be a pre-bunk, so it's supposed
to get out ahead of it. But if you've already bought into it fully, it's a little too late.
So that's one limitation of this approach. We're trying to get out ahead of things. So like we made
some videos with the Harvard School of Public Health and American University on vaccine hesitancy.
And for folks who were super vaccine resistant, this video wasn't going to do anything for them. So we really targeted the people who were more vaccine hesitant and whose minds maybe were willing to be shifted by a video.
So mistrust in institutions has been on the rise for the last several decades. And a conspiracy theorist probably won't trust
reputable, polished information sources. What kind of sources or language do you recommend
people use with friends and family who may be conspiracy theorists, maybe election deniers or
vaccine hesitant that won't sort of arouse their mistrust?
That's a really good question.
I'll answer it first by focusing on how pre-bunking does that and the language and the approach that it uses,
and then maybe offer some thoughts for all of y'all at the dinner table over Thanksgiving.
But with pre-bunking, we found that it's worked across the political spectrum in the U.S. and abroad.
And that's really rare for anything with misinformation.
I mean, you know
well how polarized things like fact checks are, right? So for this to work equally effectively across
the aisle was huge. And part of the reason why, we think, is the language it uses around self-defense.
You need to equip yourself, defend yourself against the manipulators and the
manipulation coming your way online. And it's not naming
names about who's doing the manipulating or what the specific claims are, but it's just saying
you have to defend yourself. Onus is on you to do some self-defense. That I think really taps
into this American individualism and it really taps into this sort of self-reliance narrative
that we like here. That could work at the dinner table too,
with family and friends saying things like, you know, if you're really concerned about protecting
your freedom, maybe you should think about this from this larger societal perspective of how,
you know, whatever, this policy is expanding freedoms for everyone. But I also think you need
to come at it with compassion and say, you know, I really acknowledge this has been a tough year for you, or this has been a tough year for all of
us. And yeah, the government has done X, Y, and Z that hasn't been great. But, you know, and then
you can point out really obvious limitations of, you know, big pharma, big government, whoever the
bad guy is and say things like, there's no way the government could coordinate every single physics teacher in America to all tell you the earth is round when it's really flat.
Like they have no ability to coordinate that effectively. So I think there's some playfulness
that you can bring into this too, when you're starting to drop those seeds of doubt.
The playfulness I think is important too. I mean, that just reminded me of, there's a lot of people
who think that like the DNC is
rigging everything or whatever else. And as someone who worked in politics, I'm like,
I don't know that the DNC can, like, run a one-car parade, let alone rig an election.
You know, there's a little bit of that that I think that sort of disarms people a little bit.
Right. I mean, if people are so alienated from these systems of power, they don't know how the
sausage is made. And sometimes it's worth lifting the hood and being like, the sausage making is terribly dysfunctional and messy.
And that means they could never pull off this conspiracy theory that you're so worried about.
Which is why, and a lot of people have said this, real politics is much more like Veep than it is House of Cards.
So true.
Yes, they just all need to watch it.
Yeah.
So back to where we started.
You're sitting at the Thanksgiving table and you've got your uncle, who posts on Facebook all the time, popping off.
It does sound like you want to listen.
You want to show some compassion and empathy with perhaps a larger struggle that that person's having.
And then you want to sort of maybe plant some seeds of doubt and then also try to encourage that person to equip themselves to fight off people who might be trying to manipulate them.
That's absolutely right. And you can also really build some points in there
both by acknowledging where really things do go wrong
and there really are some true conspiracies
or there really is that seed of truth
at the heart of why pharmaceutical companies do X, Y, and Z.
Really, you can meet them halfway
and don't try to win the debate
in one dinner. No, you're not going to change their minds immediately. This is something that
people have built identities around, right? You know, you're not going to change someone's
identity overnight. But when you've planted those seeds, you can check back in on them
every few weeks or something. And say, you know, hey, maybe let me share this podcast with
you. Let me share this article I just read, right? And you can sort of keep watering that seed that you've planted.
But it takes people months often to really fully let go of some of these deeply held beliefs.
So you can start it during Thanksgiving dinner and then maybe catch up with them again on New Year's.
Keep the work going.
Exactly, exactly.
Beth Goldberg, thank you so much for joining Offline.
This was fascinating.
Thank you so much for the research you're doing.
I think it's just unbelievably helpful, and I'm glad you're out there doing it.
So thank you.
Thanks so much, Jon.
I appreciated talking to you.
Offline is a Crooked Media production.
It's written and hosted by me, Jon Favreau.
It's produced by Austin Fisher.
Emma Illick-Frank is our associate producer.
Andrew Chadwick is our sound editor.
Kyle Seglin, Charlotte Landis, and Vassilis Fotopoulos sound engineered the show.
Jordan Katz and Kenny Siegel take care of our music.
Thanks to Michael Martinez, Ari Schwartz, Amelia Montuth, and Sandy Gerard for production support. And to our digital team, Elijah Cohn and Narmel Konian, who film and share our episodes
as videos every week.