Life Kit - Protect yourself from election disinformation
Episode Date: October 6, 2022. It's almost time to vote. NPR's Miles Parks explains what research can tell us about how to combat fake political news, and why it's so tricky to separate fact from fiction.
Transcript
This is NPR's Life Kit. I'm Marielle Segarra.
I just don't know what to believe anymore. That is a common thing I've heard and said,
honestly, over the past couple years. Whether we're talking about politics or COVID or something
else entirely, we are getting served up a lot of information all the time in the news, on social
media, from friends and family,
and it's just a lot to process. And often you read things and later find out they're not true at all.
But it'll be time to vote soon, and it's important before you do to know what information is reliable and how to filter out the rest. To help us do that, we have Miles Parks,
who covers voting for NPR. Hey, Miles. Hey, happy to be here.
Yeah, happy to have you. So covering misinformation is actually part of your beat, right?
It is. And it didn't start that way. You know, I started covering voting shortly after the 2016
elections. But I don't know if you noticed over the last couple of years, covering misinformation
and covering voting kind of feels like covering the same thing. And that's not just for me as a reporter.
I mean, election officials I talk to who run voting at the state and local level say a huge part of their time that, you know, 10 or 20 years ago used to be spent on just counting ballots and getting ballots out to people is now spent on trying to get the truth out to people about our voting systems. Do you have a sense of why misinformation has
become so common recently? Well, technology really plays a part in that. I mean, there's no question
that people have easier access to bad information than ever before. But there's kind of an ongoing
debate on whether social media as a platform is the cause of so much misinformation and polarization in the U.S.
electorate or whether it's just kind of the mirror showing what's happening in U.S. society,
but not playing into it. I think there's no way that you can have this conversation and not talk
about former President Donald Trump. I mean, he is the first U.S. president who used the presidency
to spread lies about the American election system. And so, you know, the most credible figure we have had in U.S. government was spreading a lot of those lies. And that has meant those lies have kind of continued after he left the presidency.
So it's kind of a perfect storm.
It is. There's a million different things and things we probably haven't even thought about, which is part of why, you know, no one has solved this problem yet.
Yeah, I was hoping that you'd have some solutions.
I wish I could say I did, right? Yeah, it's like you've been covering this thing for years and years.
But I'm not the only one. You know, there are researchers who have been thinking a lot about this for years and years.
And I was talking actually with an election official in Arizona, Stephen Richer,
who runs elections for Arizona's largest county. And I asked him straight up, you know, I said,
so what do you do to solve this? Well, if I said I knew, that would be a lie. We have ideas, and
we've been trying our hardest to employ all manner of tactics. But I don't think that anyone has cracked
disinformation as a societal challenge. Maricopa County, where he runs elections,
has been the target of a lot of conspiracy theories. And they've actually done a great
job of increasing transparency around their elections process about getting out in front
of voters and kind of debunking conspiracy theories around voting. But I think it's important to realize that this issue
seems to have gotten beyond just kind of picking out individual pieces of information
and trying to debunk them. And a lot of research nowadays is spent looking at the kind of systemic
issues that are driving misinformation as a problem. Yeah. And, you know, I want to ask a question because Stephen Richer, he mentioned disinformation
and we've been using the word misinformation. There is a difference between those two, right?
Yeah. So disinformation is all about intent. I mean, if you know for certain that a person is
spreading a piece of bad information and they are gaining politically, they are gaining financially
from spreading that information, we call that disinformation. When it's somebody at a party who just, you know,
is saying something they saw on Facebook that's incorrect, we call that misinformation because
that person isn't necessarily gaining anything from spreading bad information, but it's still
bad information. Right. So, okay, let's talk about the research. Were there any promising findings?
There are actually, surprisingly so. I mean, it's not all doom and gloom here.
One recent study that I reported on actually focused on people who feel U.S.
elections are fraudulent. I do think it's important at this point to say that is not true.
You know, there were audits, there were court cases, there were paper hand counts of the 2020
election. It was considered by both Republican and Democratic
officials to be safe, secure, and effective. But as we know, due to the former president's
campaign against it, there are a lot of people who believe the lies around the 2020 election.
This voting advocacy group, the Voting Rights Lab, kind of wanted to test a series of messages,
just try to answer the question of like, how do we bring
people back into trusting U.S. elections if they seem to be leaning towards trusting conspiracy theories instead? Does it help to just, like, tell them the facts? No, I mean, that's actually
one of the things that they found. If you kind of take them step by step through the things they believe in, trying to debunk each one of them doesn't seem to be the most effective way to change people's minds about the democratic system as a whole.
The thing that worked that they found was basically presenting them with this kind of unifying, nonpartisan, patriotic statement about America as a country and how important elections are to, you know,
America working as a country. And what they found is that presenting people with a statement like
that actually had a big effect, especially on Trump voters. I talked to Tresa Undem,
who led the research on this for the firm PerryUndem. And here's what was stunning,
that I almost never see in social science research or our own survey research: more conservative voters, after hearing that affirmative narrative, were double-digit points more likely to say they trust the process for counting votes in elections, compared to a control group.
So without that affirmative patriotic messaging about elections and their importance in this country, just 35% of Republicans said, yes, I trust the counting of votes in this country.
But after they were presented with this kind of patriotic narrative about the importance
of elections, 55% of Republicans said, yes,
I trust how elections are counted in this country.
Huh. So what does that tell us then?
Well, I think when we talk about improving this misinformation problem,
some of the field has moved on from just focusing on individual lies, you know,
fact checking the heck out of this thing saying, you know, here's why vote by mail ballots are
actually counted correctly. I think if this thing gets better from the researchers I talked to,
it's through actually decreasing polarization in U.S. society, which is what that statement
that these voters read kind of aims to do. So did the researchers also talk to Democrats?
Yeah, they did. But I think what's interesting is this sort of statement, this positive patriotic statement, didn't have much effect on Democratic voters, because right now, on the whole, almost all Democratic voters trust elections in the United States.
We know from research over the last few decades that voters who won the most recent election are much more trusting of the election system as a whole. So I think it'll be interesting to watch
and it'll be important to watch as we're kind of navigating this polarized environment, how
Democrats feel about U.S. elections in an election where they eventually do lose. Right. Okay. Well,
we are headed into a midterm election. So from your reporting, do you have any ideas or thoughts
on what voters might encounter when it comes to bad information? Yeah, one of the big things I'm watching for is what's known as gray area misinformation. So this is where somebody shares a factual news story or
a true data point or an actual photo, but uses it to kind of further their false worldview.
And so the example I go back to over and over again is a story I reported right around the
time that COVID vaccines were coming out. Well, people who wanted to push the idea that COVID vaccines were killing people, which again, there's no data to suggest that, they were sharing news articles about people
who had died after receiving the vaccine. Now, as we know, people die every single day for all
sorts of causes, and millions of people were receiving the COVID vaccine. So it stands to
reason that some people who received the COVID vaccine would then die from causes that weren't necessarily connected to the COVID vaccine, right? But people were able to share true news stories and plant those seeds of doubt about the vaccines. In those situations, there's very little that the social media companies can do to police it, because they're sharing a true news article. It just happens to be furthering this false narrative.
Right. It's pretty nuanced. So how do people protect themselves from this kind of thing?
I think the biggest thing that people can do when it comes to gray area misinformation
is be on the lookout when you're scrolling or you're reading and you see a piece of information that just fits so neatly
into your worldview and it makes you angry. It makes you upset. It makes you have a really
strong emotional reaction. We know that people are less inclined to think critically about the information they're receiving when that kind of anger, those strong emotions, come in.
And so people who are making money off of getting clicks and polarizing the American electorate try to get you upset, try to get
you angry. And so I think people, when you feel those emotions, instead of being an indicator
that you should kind of rush off to text that article or that photo to 10 people you know,
take that anger or that sadness or whatever you're feeling
as an indication that you should maybe double check the source or double check the actual kind
of key nugget of that information. Yeah. And I think we can all relate to that. Yeah. So if you're
upset, take a moment. That makes sense. Anything else that we should think about? I was actually
talking to Steve Simon the other day. He's the Secretary of
State of Minnesota. And I was asking him exactly that, you know, do you have any tips for people
ahead of the election when it comes to misinformation and thinking about the
information ecosystem? And what he said is that we as a public really need to distinguish between
the purveyors of misinformation and the recipients of
it. I distinguish, and I advise others to distinguish, between the super spreaders of
the disinformation who are doing it often, knowing that it's false, doing it for political purposes,
economic purposes, sometimes both. On the one hand, there's those folks. And on the other hand are everyday folks
who most of us know, friends, neighbors, co-workers, relatives, who might not know what to think
and are taken in by some of this. And making that distinction is important.
I think that point is the one that I take away from this. Even if you don't think you're somebody who is consuming misinformation all the time, if you think of polarization as the bigger problem here, and of the idea that we should just have empathy for the people who potentially are reading bad information or are susceptible to this stuff, then I think it's something that every single person can kind of work on in terms of taking the temperature down a little bit.
Yeah. Well, I wonder if like, let's say a family member shares something with you
that seems false. What's the approach? Do you look it up and see, okay, so this isn't true,
and then send them a link fact-checking it? I mean, I think it is kind of a case-by-case basis,
because I actually get that question a lot. If it's somebody who you're really close with, who you have that sort of relationship with, then sometimes being direct actually can help the cause, you know, potentially giving them a link or talking through it right then. Most cases, it's not like that. You know, it's a person who you might not be necessarily talking to on a day-to-day basis. And what researchers I've talked to say to
do in that situation is to draw it out more. Like, don't necessarily be so quick to yell or,
you know, send an angry email. Ask questions, you know, be curious. And I think if you can kind of
get to the root of the piece of information, whether that's by talking not necessarily about the piece of information itself but about the source of it, kind of talking about that website or that person, and having a broader
discussion. If you start lecturing somebody on this stuff, it's just not usually going to work.
I think thinking of it as a conversation and asking questions is probably the way that has
the most success. Yeah. Yeah. It seems like we're in a kind of a tangled mess here and
we're going to need to figure it out together with some kindness.
It's a really confusing time to try to get good information right now.
And I think bringing a little bit of that empathy and that understanding that like this could be you is helpful.
Totally. All right. Well, NPR's Miles Parks, thank you so much.
Thank you for having me.
For more Life Kit, check out our other episodes.
Miles has one about how to vote and another that goes deeper into detecting misinformation.
You can find those at npr.org slash lifekit.
And if you love Life Kit and want more, subscribe to our newsletter at npr.org slash lifekit newsletter.
This episode of Life Kit was produced by Clare Marie Schneider.
It was edited by Ben Swasey.
Our visuals editor is Beck Harlan.
Our digital editor is Malaka Gharib.
Meghan Keane is the supervising editor.
Beth Donovan is the executive producer.
Our intern is Jamal Michelle.
Our production team also includes Andee Tagle, Audrey Nguyen, Michelle Aslam, Summer Thomad, and Sylvie Douglas.
Julia Carney is our podcast coordinator, and engineering support comes from Ko Takasugi-Czernowin.
I'm Marielle Segarra. Thanks for listening.