Factually! with Adam Conover - Why “Critical Thinking” Can’t Beat Misinformation, and How to Fight It with Michael Caulfield
Episode Date: March 24, 2021. We're flooded with misinformation, and new research shows that cliches about "thinking critically" and "doing your own research" are counterproductive at best. Writer and educator Michael Caulfield joins Adam this week to explain his SIFT method for evaluating misinformation, why expertise is so important, and how we should approach unclear questions like the COVID-19 "lab leak" hypothesis. Learn more about SIFT at https://infodemic.blog/
Transcript
You know, I got to confess, I have always been a sucker for Japanese treats.
I love going down a little Tokyo, heading to a convenience store,
and grabbing all those brightly colored, fun-packaged boxes off of the shelf.
But you know what? I don't get the chance to go down there as often as I would like to.
And that is why I am so thrilled that Bokksu, a Japanese snack subscription box,
chose to sponsor this episode.
What's gotten me so excited about Bokksu is that these aren't just your run-of-the-mill grocery store finds.
Each box comes packed with 20 unique snacks that you can only find in Japan itself.
Plus, they throw in a handy guide filled with info about each snack and about Japanese culture.
And let me tell you something, you are going to need that guide because this box comes with a lot of snacks.
I just got this one today, direct from Bokksu, and look at all of these things.
We got some sort of seaweed snack here.
We've got a buttercream cookie. We've got a dolce. I don't, I'm going to have to read the
guide to figure out what this one is. It looks like some sort of sponge cake. Oh my gosh. This
one is, I think it's some kind of maybe fried banana chip. Let's try it out and see. Is that what it is? Nope, it's not banana. Maybe it's a cassava
potato chip. I should have read the guide. Ah, here they are. Iburigako smoky chips. Potato
chips made with rice flour, providing a lighter texture and satisfying crunch. Oh my gosh, this
is so much fun. You've got to get one of these for yourself. And get this: for the month of March, Bokksu has a limited edition cherry blossom box, and 12-month subscribers get a free kimono-style robe. And while you're wearing your new duds, learning fascinating things about your tasty snacks, you can rest assured that you have helped to support small family-run businesses in Japan, because Bokksu works with 200-plus small makers to get their snacks delivered straight to your door.
So if all of that sounds good, if you want a big box of delicious snacks like this for yourself,
use the code factually for $15 off your first order at Bokksu.com.
That's code factually for $15 off your first order on Bokksu.com.
I don't know the way. I don't know what to think. I don't know what to say. Yeah, but that's alright. Yeah, that's okay. I don't know anything.
Hello, welcome to Factually. I'm Adam Conover. Let's start with a couple pieces of housekeeping. If you are shocked and horrified by some of the events of the last week, I am right there with you.
And I think it's incumbent upon all of us that we do everything that we can to push back against the wave of hatred against Asian-Americans and other marginalized communities that we've seen over the past year. If you're looking for someplace to donate, I know that you've probably seen lots of lists of places to donate to help out.
I'll tell you where I chose to donate.
I chose to donate to Red Canary Song and to Asian Americans Advancing Justice in Atlanta,
two groups that support the rights of sex workers and of Asian Americans, respectively,
in Atlanta and around the country.
And just as a little addendum, there's been a lot of talk once again about sex addiction
this past week, as though sex addiction were an excuse for the hateful, violent, racist
behavior that we saw, and as though it were even something that exists in the way that
many people think it does. If you're curious to hear more about that, you might go back and listen to the interview I did with Nicole Prause a few months ago, titled The Myth of Sex Addiction.
you can find that in our archives wherever you get your podcasts. Now, on a different note,
I do want to remind you that we are doing a special set
of premium episodes for Stitcher Premium
called Questions and Adam.
It's a fantastic name, I know,
in which I and a comedian guest
take your questions and answer them.
If you want to listen to those episodes,
you can subscribe at Stitcher Premium.
And if you want to send us some questions for us to answer, please send them even if you don't subscribe to Stitcher Premium. I want to see your questions. You can send them to factually at adamconover.net.
I truly do read the emails and your question truly might be answered on the show.
Now, with that, let's talk about this week's show. We are flooded with misinformation in our society right now, from commercials to partisan media to straight up lies proliferating on social media.
It seems like there is just an avalanche of false information deluging over our eyeballs and into our brains every single moment of every single day. And the response to this,
what we're often told we need, is more media literacy.
Media literacy, we need to teach people
how to sift good information from bad information.
They need to be more literate
about the media that they consume.
Now, by any measure,
we are not doing a very good job of teaching media literacy.
A 2016 study found that
80% of middle schoolers couldn't tell the difference between a real news story and sponsored
advertiser content. We're teaching kids how to read, but we're not teaching them how to not read
bullshit, basically. And this is a problem because having the skill to figure out what is true,
to separate true from false, is incredibly important. In a democracy, it's
important that the populace believes true things and doesn't believe false things in order to make
educated decisions. If everyone believes that drinking clean water is bad for you and oil
spills are good for you, we might have some bad public policy decisions made that might be
detrimental to our society. We fundamentally need the inside of our heads
to match what is going on outside of our heads.
And critically, we need most people to agree
on what is going on outside of their heads.
If citizens in a democracy are operating
under completely different ideas of reality,
if they can't agree on even the most basic things,
you get an increasingly fractious
and conflict-ridden society.
You get pretty much the society that we're living in right now,
is what I'm trying to say.
And so there's a lot of talk about how to fight this,
and media literacy comes up again and again.
We should teach people to think critically and do their own research.
We hear that over and over again, don't we?
If we just teach people to think harder,
to think critically, to do their own research,
well, that could solve the problem.
Here's the problem, though.
We are starting to realize that those things
might not actually work and, in fact,
could be counterproductive.
Let's take that idea of doing your own research,
for example.
You know, when I was a kid, doing your own research meant going to the library,
finding a book, hey, maybe even asking a reference librarian
who could lead you to the most established experts on the topic,
who could give you a quick overview of the subject matter, all of that good stuff.
Right now, though, doing your own research for most people means opening a browser window,
going to a search engine and reading some random shit on the Internet.
You just type in those search terms and see what you get and start reading that WikiHow article or TikTok or whatever it is that comes up.
The cliche of telling someone to, quote, do their own research is an easy way, we now know, to actually
lead someone further into misinformation.
They might fall for something untrue or they might seek out information that confirms their
previous belief.
Right.
They search for "are vaccines dangerous" and end up on the vaccine conspiracy sites because
that's what they were looking for.
The fact is, if we just tell random members of the public who are not experts
in the field, or even trained in research skills, to, quote, do their own research, we can
end up sending them to more bullshit sites, giving the lies home field advantage. Or what about the
call for critical thinking, where we tell people when they're confronted with information they're
not sure about, they should look at it super closely and really try to figure out are there statistical errors or are there lies embedded in
the text? Well, that can be just as counterproductive and it certainly isn't always an antidote to
misinformation. See, what recent research tells us is that the more time you spend digging into
misinformation, the more time you spend reading it closely, really looking between the lines,
well, the more time you're giving it to worm its way into your brain. Misinformation wants access to your brain. That is what it is. It is basically a zombie that's trying to get in there and eat
your brain and turn you into a zombie too, okay? So combating it doesn't mean thinking harder or
better about misinformation. It means something else entirely,
controlling our attention so that misinformation doesn't get too much of it. But so how do we do
this? How do we deal with misinformation? And most critically, how do we teach others? How do we teach
young people to combat misinformation? Well, to help answer, our guest today is Michael Caulfield.
He's a digital information literacy expert at Washington State University,
and he did the research on critical thinking that I was just telling you about,
and he has a veritable cornucopia of misinformation-fighting techniques and ideas to share with you today.
I found this interview fascinating, and I know you will too.
Please welcome Michael Caulfield.
Michael, thank you so much for being here. Oh, my pleasure.
So you study and you write about misinformation. You write specifically that critical thinking
isn't enough to tackle misinformation. What do you mean by that?
Okay, well, that's a whole can of worms to start with.
Oh, we're jumping right into it, Michael. Okay. Well, I mean, let's put an important
qualifier on that. Critical thinking, you know, as currently taught, right? And to get into what
I mean by that, it might be best to just talk about how I got into this. Please. Back in 2010,
I was working for a small college and we decided that we wanted to have outcomes where students would learn what we were calling civic digital literacies.
And one of those was critical consumption.
Can they tell what's, you know, true and false on the Internet?
Can they tell what's reliable and unreliable on the Internet?
Can they do that? Right.
And so we decided to assess these.
And so we went through the normal training with the students.
So a model called CRAAP, which I'm not a fan of currently, but a model called CRAAP, two A's in that. And
at the end, we assessed them as to whether they got any better at it. And they didn't get better
at it. In some ways, they got worse, right? And when I say critical thinking isn't helping us with misinformation, I'm talking
about the sort of thing that's usually done in a university environment, a lot of times in a K-12
environment, and where critical thinking is associated with something washes up in front of
you, some sort of document, a video, something like that. And the idea of
critical thinking, as we're often taught it, is, well, look very deeply at this thing, like examine
it, turn it around in your hands, figure out, you know, does this document use scholarly language?
Does it have footnotes? What's the logical argument of it? These sorts of things. And we,
when we do that, what we're doing is we're immediately pushing
people to deeply engage with something when they have no idea what the provenance of it is, no idea where it came from. And additionally, they don't necessarily know where, in sort of the universe of discourse and the universe of claims, this particular claim stands. I'm not saying that
nobody should adopt a minority viewpoint. And that was one of the misperceptions of a recent article.
I have a lot of viewpoints that are probably minority viewpoints in a discipline, but I have
to know to start, hey, you know, the majority of this discipline actually disagrees with you.
Like, it's important for me to understand where I've landed, right? And so, when I say critical thinking doesn't help, this idea that we are going
to solve misinformation by getting people to pay deeper attention to every piece of information
that washes up in front of them, it's not sustainable because you don't have that much
attention. Or that much time, or that much
expertise. Right. And you're also giving disinformers what they want, which is your
deep attention. You're giving them a shot. You're giving them an audition, right? And so we have to
move away from this as sort of the first step. After you figure out where something has come
from, right? After you figure out, hey, this is where this claim sort of sits in the universe of discourse, hey, this is the
strengths or weaknesses of this particular source, then maybe you choose to go in deeper, right? But
too often we're skipping that first step and we're immediately reacting and engaging. And the way we teach students to, you know, quote unquote, critically think is actually telling them that that's what we want them to do. And we want to get away from that.
This makes a lot of sense. Like, if you're confronting for the first
time, especially if you're a young person, you're 17 years old, you find some website on the Kennedy
assassination that says he was assassinated by aliens or whatever.
And then you're like, OK, think critically about it.
Really look carefully at it.
And you actually have never read the mainstream history of JFK.
You don't know the general dialogue around the conspiracy theories.
All you're going to do is look at this one source closer that could actually pull you in a little bit.
OK, I'll look at it really close. Oh, wow. Well, I don't know about this part, but this part makes sense. And you might get sucked in a little bit.
So one of the things we do is we pre- and post-
test students when we run them through the different sort of training we do, which I guess
we'll talk about a little bit later. But what we see in the pre-test is a lot of students simply apply what we would call a
plausibility test.
Does this seem like something that would happen?
And while that is good if you have experience in an area, right, you're pretty good at
determining the plausibility of things that you're intimately familiar with,
it's not really good when things are sort of outside your realm of experience. So, I mean,
if I was to ask you, does it seem plausible that most vaccines take five, six, ten years to develop
and this vaccine takes one year to do it? Does that seem plausible to you?
Well, why would you think that you would have the ability to judge the plausibility? You have no experience with vaccines or vaccine development,
no deep knowledge of why one vaccine would take five years at that time.
And yet you're told, hey, critically think about this, critically think about this.
And you're simply not equipped for a question of that complexity until you get a little more basis in the reasons behind it.
But again, this is what we see, and students think that this is critical thinking.
They think, you know, one of the things we see from a lot of students is this, well, if this had happened, I would have heard about it already.
This is one of the big plausibility checks that students do. And that works for a lot of things.
Like, honestly, look, if a bomb had just gone off in New York, you would have heard about it. That's true. But it also causes students to discount
a lot of things that are true.
Right, right. You know, there are a lot of true, important things that happen that we don't hear about. That's one of the problems with the news.
Yeah. I mean, so, you know, you can apply that same logic and you can say, look, if families were really being separated at the border like this, obviously everybody would be talking about this.
And you can dismiss it in that way.
So there's sort of a double-edged sword here
with plausibility.
But the key to plausibility is
you're not really great at assessing
the plausibility of things
that you don't have familiarity with.
Yeah.
So it seems like a simple point. I mean, this is a great... I'm sorry, as we're talking about this, it's starting to overwhelm me: the magnitude of the challenge that we're faced with online, and not just online, in our entire media ecosystem, of being constantly confronted with false claims or dubious claims or angles, takes that have some ulterior motive behind them.
And the challenge for the average person to weed through them is enormous, especially when you still got to eat and brush your teeth and make a living.
Do you feel the same way?
I mean, yeah, absolutely.
Absolutely.
And I think it's made even worse by this sort of moral belief that we hold that we need to have an opinion on everything.
You know, that, you know, if you ask somebody, hey, what's your opinion on this?
And someone says, you know, honestly, I don't know. You know, that's somehow less than, but
honestly, a lot of times we don't know, right? And we're probably, and this gets into a whole
sort of different thing, maybe than critical thinking, but, you know, it's okay. It's okay not to know, right?
It's okay to realize, you know, that you don't know.
And so there's this idea that we should have an opinion
and we should be forming opinions on everything relatively instantly.
This is, of course, exacerbated by social media,
where social media is, you know, a B.F. Skinner-like process where a bunch of things are thrown in front of you and you're supposed to immediately sort of weigh in: do I like this, do I retweet this, do I ignore this, that sort of thing. And so you get into this cycle where we're supposed to have an opinion on everything, and we're supposed to somehow develop that ourselves.
We can never sort of defer and say, you know, on this one,
I'm just going to go with what Fauci says.
I mean, you know, whatever Fauci says.
There's a sort of weirdness people feel about that.
They want to feel like they've dug in, they've done the research,
they've looked at everything they can navigate, and they've completely rerun the math themselves.
And it's kind of a bizarre way of thinking about truth seeking and information seeking, if you think about it.
And there's this odd thing where, you know, I think your criticism of the traditional way that
we talk about critical thinking and media literacy, I think is fundamentally right.
Because the thing that you're always told is, hey, do your own research, think critically and
do your own research. And actually, what kind of advice is that? First of all, you're asking
people who are not professional researchers and are certainly not experts in whatever the topic is that they're researching to do their own research using what tools?
The Internet?
You know, I mean, I've only recently rediscovered in my 30s a good way to research is to go down to the public library.
And you still, you actually do get better information than you do on the Internet because so much good information is locked inside of books.
But, OK, let's say you do that. You do your own research on whatever the topic is.
Well, if I think about who is perpetrating the worst misinformation online, the biggest conspiracy theorists, the people who are spreading the most dangerous falsehoods, they're almost all non-professionals
who are doing their own research.
It's some like computer programmer
with a medium account going like,
well, hold on a second.
I ran my own numbers on COVID
and here's why I think it came from a lab or whatever.
And they write it very flashily
and you're looking at it,
you go, this person actually doesn't know
what the fuck they're talking about. They just sat around their kitchen table and found a way to get to the conclusion they wanted to come to. And so we're asking people to almost become spreaders of misinformation when we do this.
There's a lot of data out there. There's a lot of facts out there. There's a lot of events you can sort through that you can connect in multiple ways, right?
And it's overwhelming, right? If we look at all the statistics, all the numbers produced in the most recent election, and someone says, hey, look at this number and look at this number and look at how this changed this way. Just, you know, the sort of raw data-ness of that
is overwhelming to the average individual. And it's overwhelming to the scholar too,
or it would be overwhelming to the scholar, except for what the scholar has, right? What the academic has, what the expert has, what the professional in a professional community has (it's not just academics) is a field, right? People who help them figure out, hey, what is credible? What is important? What is normal? What is not, right? And so, you know, when you think about people that are doing their own research, it's not that research is wrong, but research happens within a research community, right? And that research community is hopefully a bunch of
people who have figured out, you know, what are standards of evidence, right? What are the
procedures through which we vet information, right? What does credentialing look like
in our community, right? And it doesn't have to be academia. If you look at a profession like plumbing or something like that,
right? There's a community that says, hey, this is a standard way of doing this and this is not,
and it helps focus your energy.
It helps focus the way you go about searching for solutions. If you kind of just do it yourself
without these sort of social structures that help us vet information, verify information,
set standards of evidence, figure out how to credential different people as having some
more authority to speak on an
issue than another. If you do it outside of that, you're going to become quickly overwhelmed,
right? And so one of the things that you do want to do, I do want people to think for themselves,
but I also want them to think with others, right? And thinking with others means figuring out, hey, what community actually has the expertise, the background, to actually be able to separate the signal from the noise
on a specific issue. Find your way to that community and at least engage with it. Maybe
at the end of the day, you don't agree with it, but understand, you know, the primary arguments within that community.
Understand what they think is normal and what they think is abnormal and why, right?
One of the biggest things an expert has that the novice does not have is an expert knows what they see all the time, right?
And a novice doesn't.
And we saw that in spades with a lot of the election misinformation and disinformation.
You know, the experts would say, hey, look, every election at about 2 a.m. in the morning,
you're going to see the big cities dump the votes because they process a lot of votes. They end up
getting put out at 2 a.m. And those votes are going to primarily be Democratic because they're big cities.
And so expect a big jump in Democratic votes at 2 a.m. from some of the major urban centers.
We have seen this every election since the dawn of time. The novice goes in and they're like,
why suddenly did the Democrats jump ahead at 2 a.m. after they went to sleep?
You know, so they can't necessarily separate out what is normal from what is not.
And then their natural human pattern seeking
and their desire to sort of overturn the outcome
lead them to go to the conclusion that,
oh, something nefarious was afoot.
There's a lot of it.
Motivated reasoning is absolutely a lot of it.
But I also, again, I think being overwhelmed
by sort of the raw data-ness of it plays a big part.
Because unless, again,
the reason why we have academic disciplines,
the reason why we have professional standards
is to help us
organize and get signal out of the noise, to get some sort of message out of the chaos. And if you
kind of enter it without that, you're going to be lost. And so when I say, when you approach
information and you want to engage with it, one of the things you want is someone that can kind
of give you the lay of the land
and someone that you trust to do that, right?
And it's not to say that you don't form
a unique opinion on it.
It's not to say that you don't disagree,
that you don't dissent eventually,
but you probably want to start,
if you're looking at election information,
you probably want to start and look at,
hey, what do people that study elections,
what do they think is normal?
What would they look for?
What would an election expert look for as evidence of fraud?
Right.
Well, what what does someone who's been studying this for 10 years, like if you're interested in the topic, the way I look at it sometimes is I'm getting I'm fast forwarding a little bit.
If I'm interested in the topic.
Well, I would want to read ABCD. I'd want to take this the topic, well, I would want to read A, B,
C,
D.
I'd want to take this class,
that class.
I'd want to read this book,
that book.
And then,
you know,
I'd want to,
you know, engage in this activity,
that activity.
And then I would have gained a base knowledge of the thing.
And I,
if I talk to someone who already has it,
they can fast forward me and say,
oh,
if you look into this stuff a lot,
you know,
ABC,
and that can,
there's that sort of general ground knowledge. And for lot, you know, ABC, and that can, there's
that sort of general ground knowledge.
And for the novice, that is worth so much more than someone linking you directly to
an Excel spreadsheet of the votes from Milwaukee, right?
Right.
And we have a hard time understanding or coming to terms with that.
Somehow we believe that diving directly into the Excel spreadsheet is a more noble endeavor
than trying to find somebody that says,
hey, if I was to look in this Excel spreadsheet
and there was fraud, what would that look like?
And what is kind of normal, right? Somehow you feel you're getting closer to the reality of things by diving directly into the spreadsheet, looking directly at the video. And this happens, of course, with misinformation and disinformation on both sides. You know, there's this whole term I really hate, this idea that,
oh, we're going to teach people to spot,
to spot disinformation, misinformation, right?
Because again, you know, this idea,
like you're going to get a video of some event
that maybe did happen or didn't happen.
And you're going to look at it closely
and you're going to figure out,
hey, does this look like it was faked?
You know, are there artifacts here?
Was this photoshopped?
You're not an image forensic expert.
The chance that you're going to do better at that than someone that has studied image forensics for their entire life is essentially zero.
Yeah.
I mean, there's a joke about this on Internet forums that popped up, I think, in the early 2000s, which was: oh, this is fake.
I can tell by looking at the pixels.
Yeah. And that's something that people would say to make fun of people who claimed that they could look at an image and tell, oh, this was Photoshopped.
Sometimes when you look at an image and say this is Photoshopped, you're right.
But sometimes you're wrong.
I love that you made a comparison earlier to plumbing. I think that's actually a really good comparison, because telling someone "do your own research" is a lot like telling someone "do your own plumbing." It's actually good to know a little bit about plumbing. I have a home, there's pipes in it, and I learned how to clean the little trap that the grease gets in under my bathroom sink. It was clogged, and I learned how that worked, right? And it's good to have that skill.
But while I was down under there, I started going, wait a second, look at this pipe. Should that be like that? Something's fishy here, because, now, let's see, when my house was built, did they fuck up the plumbing? So I'm looking at it and I'm going, I think they fucked this up, I think there's a mistake in this plumbing. And then I look at it some more, and I finally figure out, oh no, that's how it's supposed to be. I almost took this pipe apart because something didn't make sense to me. And you know what I should do? I should ask someone who knows. I should ask my neighbor who's a general contractor: hey, do you know? I sent him a picture: does this look normal? And he says, yeah, no, that's normal, that's supposed to be like that, don't touch that. Okay, thank you, I'm glad I asked somebody.
Now, that doesn't mean any plumber you ask is going to help you out; not every plumber is great. But general expertise does play a role here, as does having a general awareness of how the system works yourself.
And you bring up a good point there. Sometimes you get bad plumbers, right? Sometimes you get good plumbers. One of the things we try to teach people when we look at SIFT, right, that's
our model for it, right? Stop, investigate the source,
find better coverage, and trace claims, quotes, and media to the original context. So one of the things in that F, right, that find other coverage, is not necessarily that you're finding one person,
but you're looking and saying, hey, is there anything that represents the views of a bunch of experts in the field,
right? Because, you know, what you would like is you would like some advice that's not just
dependent on whether you happen to get the one good plumber, right? There are good doctors,
there are bad doctors, there are doctors that believe a lot of sort of ridiculous things,
and there's doctors who are quite good.
So one of the things we do teach students is that it's worth looking for something like the American Medical Association.
Why?
Because that's a large organization that attempts to speak for the common knowledge, the consensus of a large body of medical professionals. And they have to be careful, too, right? I mean, they're not going to say anything that disagrees with a significant amount, you know, of their membership.
They're going to say things that are supported by the evidence, that are broadly a consensus, at least among medical professionals. And when you find something like that, that's more useful to you than the view
of a single person. So you start to see how it, we almost have it completely backwards, right?
We think, oh, well, here's the evidence. If I went directly to the evidence, like then I would get
the best answer. And then maybe we step back from that and say, OK, well, if I can't go directly to the evidence, I will find the one expert, the one smartest expert, the person, the one expert who is right.
And then, you know... but that's flipped backwards. That's upside down. Really, you probably want to start by looking at what a body of experts says, right? If you couldn't find a body of experts, then maybe you'd want to resort to just finding, you know, a decent expert
on a subject. So, if you're doing plumbing, you'd want to say, hey, look, what is the standard in
here? Is there some place I can go that says, hey, this is the standard way you should be doing this,
that it's not just one person, right? You know, if you think about your house, right? There's standards for how the stuff
is supposed to be done in your house, right? What is the standard? What constitutes the standard?
If you can't find that, then maybe you want to find the expert. You say, okay, I'm going to find
an expert. I'm going to try to find an expert that I trust. And if you couldn't find that, then maybe you still, maybe you have to go deep into the specific problem itself. Maybe
if you couldn't find a standard on it, if you couldn't find a trusted expert on a particular
issue in your house, maybe you'd have to sort through it yourself and figure out, okay, no one seems to know what this is, you know, how do I deal with it? But really, that's way down the line from these other things.
So tell me again, what does SIFT stand for?
Yeah. So SIFT is a model that we use with students. It's an acronym. The S is Stop. Stop is just: before you share something or react to something, ask yourself, do I really know what this is?
Like, A, am I an expert in this area?
B, like, do I know anything about the credibility of this person who is sharing it or putting it in front of me?
Investigate the source.
We don't mean deep, you know, Pulitzer Prize winning investigations.
We just mean: what, basically, is the agenda of this source, right? Like, what is it they try to do? And a lot of times you can look them up on Wikipedia and see, oh, is there some incident in their past?
Yeah, is there an incident in their past?
Or even just, you know,
if you're looking for a first answer on something,
you might want somebody that, you know,
is an expert in a field,
but you might steer away from somebody
that's heavily involved in advocacy, at least to start.
I believe that you want to engage with advocates, right?
But I'm just saying,
as you start to orient yourself to a new question, you might not want to start with advocates, right? But I'm just saying, as you start to orient yourself to a new
question, you might not want to start with advocates, right? Because you might not want to
start with someone who works for a think tank that's funded by an industry.
Yeah, exactly. And it's not to say that the work they do is never useful. It's simply to say that
if I'm asking for sort of a map of the landscape of an issue, I probably want to start with somebody that isn't, you know, drawing a map to try to get me to where they want me to go.
Right, right. I want someone that's just interested in drawing maps, as much as possible.
And so you look at that. And this can be really simple.
You know, you see a story. It says, hey, look, there's a coronavirus outbreak, you know, at our local high school, right?
The COVID outbreak at our local high school.
Someone shares that.
It could just be hovering over like the profile and seeing, oh, actually, this is a local reporter.
They're going to have enough care with this issue that they're not simply going to report a rumor.
Right. Or, hey, you know,
this person here, this is a local comedy account, you know? We see this all the time, actually,
especially on the left. A lot of times you see literal jokes, Gorilla Channel, probably the
biggest example of this, jokes that people make going viral,
even though the person sharing it on Twitter,
the person sharing it on Instagram,
the person wherever,
the person sharing it says right in their account,
hey, comedian, right?
Yeah, yeah, yeah.
Comic tweets.
And people haven't even looked and said,
hey, this is a comedian.
That's a different context.
I'm slowing down here a little,
so let me get through the other pieces of it.
Yes. So, again, investigate the source, figure out, hey, is this a medical researcher? Is this a conspiracy theorist? Is this a local news reporter? Is this a local comedian?
Then, if for the issue that you're looking at, that level of authority and trustworthiness is not sufficient, then find better coverage. And we really, really encourage people, this is probably the biggest lesson in our curriculum, to not stick on the piece of information that comes to them, the source that comes to them. And this is one of the things that social media really pushes you
to do. You get a story from somebody and it's something you're interested in, but rather than
backing up and saying, hey, if I really was interested in this, where would I go to find a
good coverage of this? You end up
engaging with that story because that happens to be the one that came in front of you. Right.
Right. And this is just really a bad way to go about things.
Right. And so find better coverage means: okay, so I got this story. It's from someone that maybe I don't trust implicitly. Rather than sort of going through everything and trying to figure out,
hey, you know, what are they saying that's true? What are they saying that's false? Just ditch this
whole thing over here. Go find another story on it. Go to Google News search. You know, if there's a rumor that someone just died, the Keanu Reeves rule
is if you want to know if Keanu Reeves died or if it's a hoax on Twitter, you don't delve deeply into the tweet.
You go to Google News and you search Keanu Reeves.
I mean, if Keanu Reeves... this feels creepy now.
No, there's almost always other coverage of whatever the thing is. Go read a couple of news stories. Go look at the New York Times and the LA Times side by side and just see what else is out there at the same time. You know, there was a Keanu Reeves death hoax, that's the reason why it's in my head, a few years back that said Keanu Reeves died.
Wait, you're saying he didn't die? No, he didn't die.
Let's be really clear about this.
Okay, okay, okay.
I thought that was an imposter in Always Be My Maybe.
Okay.
So there was this story that he died while snowboarding that went viral, right?
And you get into that plausibility trap, right?
What do people do?
This is a really weird thing that happens with people psychologically, where the more detail something has, the more they tend
to think it's real. Like, people think that it's more likely for someone to die from lung cancer caused by smoking than it is for them to die from lung cancer at all, which is like a
logical impossibility, right? But Keanu Reeves dies by snowboarding. People start to delve into that mentally and they're
like, well, yeah, I do know that Keanu Reeves does snowboard. Snowboarding is dangerous. And so you
get into this plausibility trap. Don't do that. Just go to Google News, type in Keanu Reeves.
If something happened to Keanu Reeves, the whole world will be talking about it on Google News.
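(A quick note on the "logical impossibility" mentioned just above: it's the conjunction rule from basic probability, a general fact rather than anything specific to this episode. A more specific event can never be more probable than the broader event that contains it:

$$P(\text{lung cancer death caused by smoking}) = P(\text{lung cancer death} \cap \text{caused by smoking}) \le P(\text{lung cancer death})$$

Judging the more detailed story as more likely is exactly the trap being described.)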
Right. And then the last piece is just trace claims, quotes and media to the original context.
So we see a lot of times that people use real media, real pictures.
We saw this with mail dumping, photos of mail dumped back in 2018,
and then they would say, look at this, you know, massive fraud, right?
Right.
If you trace that photo to the source, or if you click the story even sometimes,
and look at the date, you're like, actually, this is from years ago, right? Very similar things.
Very often, you'll see a photograph or a video where maybe the beginning has been clipped out.
We saw this with the Covington video, right, where the beginning had been clipped off.
And then everybody got outraged about it. And then it turned out to be a much more nuanced situation once people saw the full video.
Yes. Right.
And so if people had said, hey, this is coming from someone I don't know.
I want to see the fullest version of this video to start.
Then, again, trace that to the source and see if that original context sheds some light on the nature of it.
So SIFT, stop, investigate.
Yep.
And then what were the other two?
Find better coverage and then trace claims, quotes and media to the original context.
I really love this acronym, but I do feel it's a little hard to put into practice in
our current media environment.
And I want to find out how you feel about that, but we've got to take a really quick break. Okay, we'll be right back with more Mike Caulfield.
So you talked right before the break about SIFT,
about this acronym for helping us avoid misinformation.
But it occurs to me that you used a couple examples that went super, super wide, even in the mainstream media.
You talked about the Covington video, with the kids in Washington, D.C., with the standoff with the Native American man beating a drum. And then there were all these stories about how the U.S. Postal Service is getting rid of blue mailboxes, that they're destroying mailboxes. And there were
these photos of like dumps full of mailboxes. And then the story came out, oh, wait, those are
actually very normal. Those are old photos of normal mailbox dumping grounds. It is the case
that, you know,
there are some cuts being made
to the postal service that should concern us.
However, these photos are, you know, not accurate.
But that went very, very wide, and not just, you know, rando kooks saying this on social media. This really touched mainstream media coverage.
And so we're in this media environment where this happens like daily as a matter of course.
Do you feel we're in a crisis situation with misinformation?
I mean, people say that we are, but you study it for a living.
Do you think it's really bad and that it's worse than it's ever been?
Well, you know, in the field of misinformation, people debate this all
the time. Is it worse? Is it just more visible? The school of thought that I subscribe to is
that at least the types of misinformation we're seeing now are more wired in to people in positions of authority and power.
Now, I got to make one little but immensely important footnote on that. When you look at
the history of racism in this country, you find that misinformation and disinformation about race
very often had its hands on the levers of power, too. Right. So I want to make that clear. But the sheer scale of people sharing it, being exposed to it, the way that it's affecting professions, this is, I think, new in that way. If you think about, for example, we're looking at vaccines, and there's some issues with nurses and EMTs and things like that. Some of them are actually getting most of their information through those really reputable and largely trustworthy channels, but because they're also on social media to share pictures of their kids, they're also exposed to this stream of things that is creating this sort of distrust in many places where distrust is not warranted. And so I think that
element, and the way in which you're also seeing this with, you know, police departments, right?
So nurses, police departments, policymakers.
They're all, you know, engaged in their profession, in their field, but they're also exposed to sort of this massive amount of misinformation and disinformation.
And I really worry about the way that that shapes policy, shapes the rollout of various initiatives.
I mean, I think that can be really, really destructive.
Yeah, go ahead.
Well, it poses a problem for your methodology, I think, to a certain extent, because you talk about, well, let's defer to experts, or at least let's consult experts. When we're trying to answer a question like this, let's go talk to an expert, or let's survey the AMA or whatever the expert body is. But these bodies are also subject to misinformation, right?
Yeah, as you say, we have folks in Congress who are, you know... And so I don't think literacy itself solves this, right? I think
it has to be part of, you know, it's our tool belt of different things that happen, right?
I mean, there are probably like eight legs to the stool, right? But,
you know, one of the legs is certainly
that some of these professional bodies have to be better at getting their message out,
communicating it, getting in front of people in the way that a lot of the people engaged in
disinformation are, right? Because the people that are engaged in disinformation are finding a way
to get this in front of people every other day and change their perception of things. And the people, you know, in these
organizations are going really through a lot of traditional channels. So communications
is a piece of that. You know, I will say when you look at the professions, it doesn't change my trust in something like the AMA or,
you know, a nurses' organization or something like that. Because they have the benefit of the professional knowledge, the level of erosion in a field like
that is actually less, right, than it is in other places.
But it is a very real concern.
And one of the things that I've been advocating is that these hey, there's a lot of things that kind of went far and wide and where there was a lot of confusion at first, whether that's Covington or the secret mailboxes or whatever.
You know, one of the things that does happen is if one of our superpowers is we can just wait.
We can just wait a bit.
So we can use SIFT and very quickly discover, hey, you know, I've got a short bag of tricks I try out, it takes 60 seconds. And if at the end of it I come up with nothing, you know, I can wait. Instead of sharing this, I can bookmark it and I can come back to it tomorrow.
And the truth is, usually by the next day, if it's gotten any sort of traction, you will actually find coverage on it. You will find someone saying, hey, actually these mailboxes have been around for years. With the Covington video, it wasn't very long until that second video emerged. It actually really wasn't that long. It was long enough that a lot of people didn't wait, but it wasn't that long, right? And so part of it is you go through this stuff and you say, hey, you know what? I'm
going to hold off on this for a little bit. And to kind of bring that into the talk about
professionals, I think that's important for professionals too. The way a lot of this stuff
works is you have this sort of constant stream of stuff. And even if you don't process it, it sort of builds, and
over time, it erodes your trust, right? It creates a feeling of unease, right? It's the sort of,
you know, I don't know precisely what, but there's something fishy there, right?
And the idea of using something like
SIFT is when you feel that strong emotional reaction, when something really has affected you,
that you actually don't leave it, right? That you actually come to a conclusion: either, hey, this is worth my attention, or it's not worth my attention, or maybe it's outright false.
Or actually, it turns out that this thing I just got really upset about,
people don't really know if it's true or not yet, right?
So maybe I should chill, right?
So I think that over time, if people do engage in that sort of activity
and realize, hey, I'm actually not going to get all upset
and disturbed about this until I know a little more,
that it will change the emotional disposition of people to some of these things. And I think it can stop some of the erosion of trust.
And I think it's particularly important. I teach this stuff at the university level, but I ultimately want to see this sort of training put into all the different disciplines that are dealing with misinformation and disinformation on a daily basis.
Now, you know, you teach students how to do this. It strikes me that a lot of the people spreading, and who are frankly victims of, misinformation are not students. They're often older, people who are long out of school, not going to go back to school anytime soon.
They're grandmas and grandpas with iPads.
And how do we get the message to those folks?
Well, I mean, there's multiple ways, right?
So people at the University of Washington's Center for an Informed Public are looking at ways of using public libraries to get some of this message out.
I think we can get some of this out through professional organizations that people might be involved with.
You know, I've talked to, done interviews with, Consumer Reports, to reach the sort of Consumer Reports reader.
Every old person's favorite magazine.
And old people that would like to feel like they are good at figuring out that sort of
thing, right?
You know, and so there's that, right?
But I also think that there's this value.
One thing, so when we think about professionals, and I'll come back to why this relates to
students in like 10 seconds.
When we think about professionals, one of the things when we look at what we would teach
nurses, for example, is the majority of nurses consume good information, but they are confronted with patients that often have encountered misinformation.
And the patient says the misinformation.
What do they do?
Right.
How do they figure out what the heck the patient is talking about and how do they address whether that's true or false?
Well, one of the things is you can give people in a profession like that the skills to
not only check this, but because SIFT is a methodology, they understand, hey, this is how
you go about it. And they can show that person, this is how I look at an issue like this. And
here's how I'm finding out that this is largely spurious, right? The same thing is true with students. One of the things we see at the end of most of the courses, the ones that the faculty I teach teach to students and the ones where I've taught students directly, is that the students want to know, hey, how do we teach this to the adults? A lot of
them are actually motivated, right? So, one thing that struck me: I was having a conversation with a person about SIFT, and they said the thing that had just struck them was that every few months their mom shared one of these fake Facebook
posts of this person that had supposedly gone missing, but hadn't.
You know, this is a big thing to get likes as people pretend someone's gone missing.
Share this, please.
She hasn't been home for three days.
Sometimes it's true.
And the person sharing that really wants to help, right?
It's not a bad motivation.
And if the person really is missing, it might provide help, right? So it's a dilemma, right? And so she would go
and she would check and she would find out, actually, mom, this is a hoax. This is a hoax.
And she'd say to her mom, this is a hoax. Mom, this is a hoax. And every few months, mom, this
is a hoax. And after going through SIFT, she said, you know, what I realized about this is: I have to tell my mom this is a hoax every couple of months, but I could show my mom how to do this. Because it's not that hard, right? With this particular sort of misinformation, you take the name of the person who supposedly went missing, you throw that into a Google News search, and if all the articles that come up are, like, you know, Snopes and Truth or Fiction and Hoax-Slayer, that person did not go missing, right? If you find that there is something from a local paper or something like that, then that person is indeed missing, right? It's not hard. It literally takes
five seconds, but she was sort of bailing the boat out without dealing with the leak, right?
Yeah.
And so I think that that's a model that I'm interested in. And one of the reasons why we
try to make it as simple as possible and as methodical as possible. We have
these moves like just add Wikipedia to a URL and things like that. They're just sort of dirt simple
moves. And we try to do that because it's not only that we're teaching students how to do that,
but we want students to teach other people in their lives as well.
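(For readers who want to see what one of these "dirt simple" moves looks like in practice, here is a minimal sketch of the Google News name check described above, done in code rather than in a browser. It is not part of the SIFT curriculum itself; it assumes the public Google News RSS endpoint and the third-party feedparser package, and the list of fact-checking domains is purely illustrative.)

```python
# A rough sketch of the "five second" missing-person / death-hoax check:
# search a name on Google News and see whether the only coverage comes from
# fact-checking sites, which suggests the circulating post is a known hoax.
# Assumes the public Google News RSS endpoint and `pip install feedparser`.
import urllib.parse

import feedparser

FACT_CHECK_DOMAINS = ("snopes.com", "truthorfiction.com", "hoax-slayer")


def news_check(name: str, max_items: int = 10) -> None:
    """Print recent Google News results for `name`, flagging fact-check sites."""
    url = "https://news.google.com/rss/search?q=" + urllib.parse.quote(name)
    feed = feedparser.parse(url)
    if not feed.entries:
        print(f"No news coverage found for {name!r}.")
        return
    for entry in feed.entries[:max_items]:
        # Google News items usually carry the publisher URL in entry.source;
        # check it (and the item link) against the fact-check domain list.
        source_href = entry.get("source", {}).get("href", "")
        haystack = f"{entry.link} {source_href}".lower()
        label = "fact-check" if any(d in haystack for d in FACT_CHECK_DOMAINS) else "coverage"
        print(f"[{label}] {entry.title}")


if __name__ == "__main__":
    # The "Keanu Reeves rule": a real event shows up as ordinary coverage,
    # while a hoax tends to surface mostly as debunks.
    news_check("Keanu Reeves")
```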
That's really great. That's really great advice. And I am really on
board with this mission. I do have a question about when I think it gets most complicated,
because again, you are saying, you know, a big thing is let's go look at the mainstream sources.
Let's look at the large professional groups and things of that nature. One of the things that I do that I focus on in my own
work is finding the places where the mainstream opinion is wrong or where the professional group
is wrong. And sometimes, you know, you mentioned the American Medical Association. We don't need
to go through a list. The American Medical Association has had wrong positions many times. And there's been some other doctor saying, oh, the AMA guidelines here, the AMA advice, is incorrect. And then, you know, maybe after some decades it moves, right, and we learn the truth
about whatever this condition is. I mean, I assume at some point, you know, we used to lobotomize people regularly in America, and presumably there were professional groups saying this is a great treatment, and there were a couple of people saying, oh my God, we need to stop doing all these lobotomies.
Right.
And here's the problem.
Now, I look for people who are saying that where it's true, where there are enough dissenters saying this that I can trust them and believe them.
However, misinformation also quite often takes the form of someone saying, ah, the mainstream is wrong. The mainstream is
incorrect. And how do you evaluate that? A very good example of this, one that I still don't quite have an opinion on, is the lab leak hypothesis about COVID-19, because it's gone back and forth about, you know, whether these are cranks saying this or not. There's a group of doctors who say that we should take this more seriously.
There's also another enormous group of people saying that this is a dangerous conspiracy
theory.
And it's very, very difficult to weed out one from the other.
Do you have any specific tips or tricks in that sort of instance?
Yeah. So part of what we're doing when we look at Find Other coverage is we're trying to discern not just like what idea wins, right?
What claim wins?
That's not what we're looking for, right?
We're trying to discern, hey, how does – again, it's the lay of the land.
It's the map of this issue. And there are different ways that can play out.
You know, it could be something like global warming where there's a consensus.
Right. You know, global warming exists. The consensus is it's manmade.
Right. That looks one way, right?
It could be a majority-minority sort of thing, right?
Like there's a majority opinion, right?
And then there's this minority opinion.
And the minority opinion could end up being correct.
And what you would want to look for in a majority-minority opinion
is whether there is any certain split, right, that is indicative.
I'll give you an example.
With the mask guidance we got, right? There was not a consensus that we shouldn't wear masks in, you know, in March, right? We had, in Western countries, a lot of people saying we shouldn't wear masks. We had a lot of experts in Asian countries saying, yeah, wear the mask. That's a really interesting split, because it turns out that the Asian countries have dealt with these pandemics more.
And so even though there's a split there, you look at that and you say, you know, if you were looking at, for example, recommendations on mastectomies and you found, hey, look, female researchers tend to lean this way.
Male researchers tend to lean this way.
That would be a really interesting split.
Right.
And on the lab thing, there's another thing going on, which is just uncertainty. And you have a lot of experts constantly trying to tell people repeatedly, we just don't know. We just don't know. And you have a lot of people trying to blot that out and turn that into a yes or no. And the case has been, on the lab situation, that it's a theoretical possibility, right? There's a scattered amount of information that
could be read one way or another way. And the majority of experts say, we actually,
we just don't know, right? And people are not willing to take uncertainty for an answer.
And unfortunately, however this turns out, it's going to translate into, well, you said one thing. And then we go back and look at the comments, and it's like, no, actually, if you look at the experts in the area, there was a lot of uncertainty at the start.
But we can note that.
So, you know, again, I've read some of the debate on this issue, and I find that it slightly bothers me when I read it, and I'm trying to figure out why.
I also don't really have a stake in it, you know. But I
think part of the reason it starts to look like misinformation to me, and I start to put it in that bucket, is because among the folks pushing the lab leak piece of it, there seems to be some amount of wanting to be punitive against China or against the medical research establishment. They say it was a lab leak, it was these people's fault, and we need to acknowledge that.
And then scientists say, well, we actually don't know.
And they say, oh, but it could be, though. It could be.
And you're ignoring it.
Let's be specific about that.
There's a whiff of motivated reasoning there.
I think what you're seeing is that people that are very motivated, that have a punitive reason, are expressing a level of certainty about the yes, it happened. The people who are not motivated are expressing a level of uncertainty. And what I would watch in a situation like that is those people that seem to be unmotivated by that. I would watch whether their uncertainty starts to trend one way or another. That would be a really important signal to you, right?
But you get that, I think, again, by sort of looking at, hey, of the people that hold this opinion strongly, what do they look like? Of the people that are uncertain, what do they look like as a group? What is their background? Of the people that believe it didn't happen, and we can show that it didn't happen for various reasons, what does their background look like? And that's an incredibly complex
issue. And part of what we're talking about with SIFT is really dealing with the simple issues
very quickly, so if you want to give more attention to a question like that, you can. I'll give you an example of the way an issue like the lab leak gets exacerbated by things that SIFT does deal with. Just after we started getting a lot
of coverage of COVID-19 in this country in like February, a Harvard professor, I think, was
arrested for, you know, illegally working with the Chinese government. That is, they were taking
some money for a bunch of projects
that they hadn't disclosed and so forth. And he worked at a lab in Harvard. And this was blown up
as, right, this guy was working with Wuhan. He has a background in chemical warfare, you know,
all this stuff. None of that's true, right. What's true is that he works on a number of
things, none of which relate to COVID-19. If you click the article and you actually read the
article being shared, you can find that out. And all of this stuff was sort of built around it,
right? You are going to make better decisions on an issue like this if you have a standard of evidence, right? You can immediately say, look, I don't want to be distracted by that. I want to be filtering out all that stuff and coming down to what the basics of the issue are. And then, of course, when you see the phrases scientists use for uncertainty, understanding that they really mean they're uncertain, and understanding what that uncertainty means. That's a more complex issue, but I think one that we've got to get better at, because otherwise people feel like science is constantly whiplashing back and forth. And in a lot of cases, science is actually moving forward in a fairly measured fashion.
Yeah.
Right? It's just that the clickbait headlines that we're sort of exposed to create every new study as a new consensus.
Yeah, I mean, we need to be able to separate out the science from the media coverage of the science to a certain extent. Like, I think another good example of that is at the very beginning of the pandemic, when the medical advice was very much about wiping down surfaces, surface-based. And then the media kept repeating that. That became our cultural understanding of how to fight coronavirus.
And then within a couple of months, we started to realize,
scientists started to realize,
oh, this is actually spread via aerosols through the air.
But that message wasn't getting out yet.
It wasn't permeating the media. So you had all these scientists saying,
hold on a second, aerosols, aerosols, aerosols.
There was like a contact tracing study
of like a restaurant, I think in Taiwan,
there were multiple infectious disease experts saying this,
if you look at the pattern,
there was a case with a choir,
there was this, we covered this on this podcast.
And then at that moment,
it was sort of like this dissident group, because these were folks who were counterposed against the mainstream media, but they were scientists.
So one of the things is, a lot of conspiracy theorists want you to confuse a fringe idea with a minority idea, right? And those are two different things,
right? A minority position is one which is not the majority position within a discipline or a field,
but which is engaged with the people in that field trying to make the case,
right? And the people in the field are engaged with dealing with the very real
evidence and objections that that minority position raises, right?
A fringe position is just that.
It's fringe because it actually doesn't engage with the field at all, right?
Yeah.
And it doesn't want to, right?
There's not a desire to engage with the field, because the field has certain standards of evidence, certain processes, certain things which are not conducive to the specific fringe idea. And they feel like their case is better made by taking the fringe idea directly to the public, which does not understand what those standards of evidence are and so forth.
And so these are two very different
things. And I think what you find is there are a lot of people invested in making you think that
they're the same thing. And usually, if an idea has some legs, right, usually, not always.
I mean, like, as you said, you go into a lot of cases, you know, on this program, on your TV show, where people got things very wrong.
And nutrition is like one of those. Take a look at this.
Maybe not everybody.
And there is.
There is.
I mean, we know that the culture of science can be resistant to new ideas sometimes. It absolutely can. But there's also a
huge benefit to people that end up breaking through that in showing the new idea, right?
So there are different incentives in that. And I think what you do find is in most cases,
a person with an idea where there's good evidence can actually at least build a small community within a discipline.
You know, it's weird.
In some ways, my approach to media literacy is a minority position in terms of my profession, right?
The profession actually engages in this thing they've done since the 1990s.
I'm saying that thing doesn't work.
Yeah.
You know?
Hold on a second.
Hold on a second. I need to stop. Okay, this is too meta for the program.
Yeah. Okay, Michael Caulfield. I've got to investigate the source. Well, I did learn about him from a piece in the New York Times, so that's pretty good. Yeah. But I need to go find other sources. Okay, I'll do that right after I'm done interviewing.
Okay, so here's what, you know, if people were looking into my minority idea here, I would encourage them to look at the number of professionals, of, you know, librarians that teach this stuff all the time and constantly assess it, the uptake in that.
Some of the recent research that has come out on lateral reading, which is, you know, one of the underlying principles of this, something we call lateral reading. You'd look at that and you'd find it's small, right? But actually, when I started going out and proposing this stuff, I did not have a hard time gaining, relatively quickly, a group of professionals who understood, hey, what we're doing is not working. Let's try this, assess it, find out it did work, and grow that. Even though this is fundamentally opposite of the way that we've done things since the 90s in higher education, because the idea was valuable and useful within the profession, I was able to get a group of people around it relatively quickly.
Now, if after four years, every single librarian I had talked to had said, you don't know what you're talking about, I tried this, and the students are worse at this than ever; if I could not find any librarian in the country, or any set of librarians in the country (librarians often do the info lit at a college level, I assume everybody knows that, maybe people don't), if I couldn't find that, I think you'd be right in saying, like, you know, why are we listening to this guy? The people that do this, nobody actually finds this useful, right? And so it's important for us
to do this. I'm defending myself as a minority idea here, but I'm absolutely sensitive to these issues, because I am in that position. But I'm out there, and I'm not just taking my case to non-professionals, non-academics. I'm out there trying to engage with the academic community, engage with the professional community, make the case, and change how we think about this.
Yeah, you're not the person who's been pushing an idea
that has been tested and failed and nobody liked
and is no longer invited to the conferences
and everyone says, oh my gosh,
this person is embarrassing themselves.
That's what everyone in your field says,
but you're able to sort of swindle podcast hosts
into letting you come on and talk to their audiences.
Exactly, exactly.
There's plenty of podcasts that do that.
They have the fringe people on
and they present them as though they're,
I won't name any names,
but they present the fringe ideas
as though these are reputable people
when in fact they're the folks
that couldn't get invited to a dinner with other people in their field. And that's not to say that those sorts of social relationships are more important than anything, but they are an indicator of whether someone's ideas have legs or not. Because, yeah, if you're talking about the entire community of the people who do this, you should be able to get at least, you know, a fair hearing and some friends in the community if you're going about it in the right way and your ideas have legs and are coherent. If you could be the person that proved that climate change was not man-made,
right, if you actually thought you had the evidence for that and you could show
it to a group of professionals and gain any sort of following on that. And I want to be careful
here. I want to just state again and again, like climate change is man-made. It's very serious.
There's an absolute consensus on it. But let's say you were a person and somehow you came across
evidence that it's not man-made. If you did that successfully,
you would be in the history books, right? You would be in the history books. The incentives
for that, there are heavy incentives for people to just continue what they're doing
in the dominant paradigm. But for a small group of people, especially people that are maybe not
invested too much in the older institutional
structures, but who are in the actual discipline, there are also amazing incentives for people to
contest that. And the reason why we trust things like science is not because we trust scientists,
and it's not even that we trust the scientific process as, oh, I come up with a null hypothesis,
and then I put X into test tubes and do this and so forth.
That's not why we trust science.
There's a great book by Naomi Oreskes, who wrote Merchants of Doubt. She has a great book called Why Trust Science?
And she lays out, I think, a convincing argument.
The reason we trust science is that we've built social structures that, in many ways, when they are effective, split the difference between making sure that people make informed decisions that do not ignore the history of the past, and having incentives for people to produce new ideas. And we have systems, you know, whether they're journals, whether they're conferences, whether they're particular statistical models, we have systems to resolve these debates. We have, again, a system of credentialing to help us more easily recognize people who other people have at least thought were worth listening to, right? And so it's the system as a whole we trust. And does the system fail? Absolutely, it fails. But over time, it does spectacularly better than any one individual person at discerning what is true and what is false, and that's because it's rigged up as this system of sort of competing goods, right?
Yeah.
And occasionally we have to say, hey, you know, on this particular issue, it's not working. Like, you can make the argument that in the institution of climate change science, there are incentives to toe the line of what everyone else is saying, right? However, at the big climate change conference, of which there are many, people give talks, and the incentive is to give a talk that brings a new idea in, right? If you bring in a brand-new big idea that blows everyone's mind, you're going to give a lot of talks. Everyone's going to swarm you at the drinks function afterwards. You're going to get your shit published. You'll write a book, you'll go on the news, et cetera.
And so there is an incentive to bring new ideas in. But then, as you say, there's also an incentive just to have a fringe idea like, you know, people who are critical of climate change say, oh, you know, it's this big institution where, you know, everyone is forced to toe the line because that's what, you know, you don't get tenure unless you say climate change is real or whatever.
But the people saying that are usually being paid by the oil industry.
There's an incentive to be that person too. When you're looking at the lay of the land regarding an issue, what you want to say is, hey, this is where people fall on the issue. Are there any attributes, right? Are there any attributes that people falling in this area of the issue share in common? Think of a map as sort of a two-dimensional space, and we place everybody in it. We have maybe two axes, right? One is sort of certainty, and one is, you know, yes or no on a particular issue. If you look and you say, hey, everybody up here in the right upper quadrant is funded by oil money. Yeah. Yeah, that's a big warning sign, right?
That's a huge warning sign.
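A rough sketch of that "map of the land" check; the data, field names, and thresholds below are hypothetical, just to show the shape of the idea:

```python
# Place each source on two axes (position on the claim, stated certainty) and
# check whether the most certain "yes" cluster shares a telling attribute,
# like a funding source. Values and thresholds are illustrative assumptions.

from dataclasses import dataclass, field

@dataclass
class Source:
    name: str
    position: float      # -1.0 = "no", +1.0 = "yes" on the claim
    certainty: float     # 0.0 = very uncertain, 1.0 = very certain
    attributes: set[str] = field(default_factory=set)  # e.g. funding, affiliation

def shared_attributes(sources: list[Source], min_position: float = 0.5,
                      min_certainty: float = 0.7) -> set[str]:
    """Return attributes shared by everyone in the high-certainty 'yes' quadrant."""
    quadrant = [s for s in sources
                if s.position >= min_position and s.certainty >= min_certainty]
    if not quadrant:
        return set()
    common = set(quadrant[0].attributes)
    for s in quadrant[1:]:
        common &= s.attributes
    return common

if __name__ == "__main__":
    sources = [
        Source("Think tank A", 0.9, 0.9, {"oil-funded"}),
        Source("Blogger B", 0.8, 0.95, {"oil-funded"}),
        Source("University lab C", -0.2, 0.4, {"peer-reviewed"}),
    ]
    flags = shared_attributes(sources)
    if flags:
        print("Warning sign: the confident 'yes' cluster shares:", flags)
```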
If you look at something, journalism isn't immune to this either.
There was a long time where journalists did not take seriously the missing Indigenous women and murders among Indigenous women.
It simply was not being covered by the mainstream press, right?
But it was being covered by Indigenous reporters.
Now, if you looked at the lay of the land there and you said, hey,
is there an epidemic of missing Indigenous women, right?
And you looked up here and you said, oh, okay, well, actually,
none of this stuff is being reported there.
And then you look down here and you said, actually,
the people, the reporters in these actual communities, the local Indigenous reporters
are reporting this. You say, that's an interesting place. That's an interesting division. I mean,
it turns out to be a really horrifying division, given what we now know. But that sort of pattern
where you're seeing not just, hey, what wins the vote; it's not like everybody takes a vote and we just go with what the vote is. You want to get a very quick lay of the land, understand where people fall, and understand, like you were saying about the lab leak thing, one of the things that you're noticing is that the people that are expressing, you know, highest on that axis of certainty, and yes, it happened, are also a bunch of people who tend to be engaged in a sort of political gamesmanship, right? And tend to be a certain group of people, right?
Let's say anti-China people would be what I feel I've noticed.
Yeah.
And so what you would say is, look,
the certainty here seems to be associated with a particular political
valence, right?
Yeah.
Which makes me think that maybe a level of uncertainty is warranted here, right?
If it turns out that as this field, as this sort of,
you know, again, this sort of two-dimensional map of certainty and validity of claim pans out,
if you start to see the people that are not just the anti-China people moving into a different
quadrant relative to certainty or relative to, you know, yes or no.
As those people move, that would be a really important signal to you.
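And a small sketch of that shift signal, with made-up snapshots and a made-up threshold, purely to illustrate what "watching the uncertain group move" could look like:

```python
# Compare two snapshots of expert stances and flag when the group that was
# previously uncertain shifts markedly toward certainty.
# The snapshot values and the 0.3 threshold are illustrative assumptions.

UNCERTAIN_CUTOFF = 0.5   # below this we call a stance "uncertain"
SHIFT_THRESHOLD = 0.3    # average certainty increase worth flagging

def shift_signal(before: dict[str, float], after: dict[str, float]) -> bool:
    """True if the formerly-uncertain experts are now markedly more certain."""
    uncertain_then = [name for name, c in before.items() if c < UNCERTAIN_CUTOFF]
    if not uncertain_then:
        return False
    deltas = [after.get(name, before[name]) - before[name] for name in uncertain_then]
    return sum(deltas) / len(deltas) >= SHIFT_THRESHOLD

if __name__ == "__main__":
    march = {"Expert A": 0.2, "Expert B": 0.3, "Pundit C": 0.95}
    later = {"Expert A": 0.7, "Expert B": 0.65, "Pundit C": 0.97}
    print(shift_signal(march, later))  # True: the previously uncertain group moved
```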
Right. And so, one of the things, you know, I deal mostly with students; we're trying to teach them the first two minutes they encounter a document, right? But my dream is to teach a course that is really just on social epistemology, right? How do we look at the sort of opinions expressed by a variety of people on claims and read that like a map to tell us, hey, this is what I see that's interesting about it? And one of
those things would be if you have a bunch of people who have expressed uncertainty,
but then you see that group and they're suddenly shifting into certainty on an issue, that's a
sign, right? That's a sign something is going on, and paying attention to
that is important. If it's just the certain people getting louder, right, that's not as much of a
sign. And so, that again goes much deeper than SIFT. But yeah, I take the map metaphor, I should
say, from Sam Wineburg, who did some of this stuff on lateral reading down at Stanford that inspired a lot of my work. And he has this analogy in this paper called Lateral Reading: Reading Less and Learning More.
And I love it.
He says, imagine you're just dropped by parachute into the middle of this unfamiliar landscape.
And then you're like, I don't have any food. I've got to get to a town. I've got to get out. But you don't know where the heck you are, right? He said, the way that most of us act is we just start going in a direction, you know. But what we want is to understand where we are, right? And then start to be a little methodical about that. And if you extend that metaphor, I think what you really want is to kind of have a mental map of, I'm trying to think of an example, you know, people argue over statin drugs, right? Are we prescribing too many or too little? And, you know, what's the optimum amount of statin drug prescriptions, right? Well, you know, again, if you land in this issue, just understanding, hey, how is it laid out? What is sort of off the map in terms of fringe? What's sort of a minority? Is the minority research associated with the pharmaceutical companies or not, right? Yeah. And, you know, take your statin drugs if your doctor tells you, by the way. I'm not saying anything; I just pulled that out of my head, and now I'm thinking people are going to think I'm saying something about that. But, you know, people have to make these decisions ultimately, right? And understanding this as something that's a little more than a one-dimensional thing. Understanding this as something where you just want to sort of read the map of the field.
Yeah.
Like there was this, to bring it back to the lab leak hypothesis, there's a big story about that.
I have to say, I did not anticipate when I came on the show that we'd be spending this much time talking about that. I would have prepped very differently.
Oh, I'm sorry.
Well, no, no, no, no. But that's
OK. I think it's a good example of where there has been a lot of uncertainty in the expert
community, but it's not been well communicated, because the loudest voices have been certain voices.
Yeah.
Well, it's one that I haven't prepped that much for either, but it's something that I've read and been trying to make sense of myself in the last, you know, four or five months. And there was a great big long article, I think it was in the Atlantic, I forget, on this, that made a bunch of waves and people were very angry about it. It was a really long article, and it was all about making the case for this hypothesis.
And to bring it back around full circle to your recommendation, it would be like, OK, well, maybe instead of reading every bit of that article with a fine-tooth comb... I mean, read it. But rather than dive into it and look at every single one of its claims, the more important thing to do is scope out and say, OK, this is one position that somebody holds. This is one set of arguments. How does the entire field feel about this?
And where are they clustered and how are the people clustered?
Like, what are their incentives? Where do they come from? How certain are they?
So you can get that sort of broader sense before you go and make your evaluation.
And then I love this.
You don't necessarily need to have an opinion yet.
You can just know that now.
Now you just know what the consensus of it is.
And I love that because I think so much of the time we are too focused on getting to the conclusion and getting to the what then, you know?
I mean, this happens with like, I'm going to bring it to a spot that you'll be even more uncomfortable with.
This happens with Me Too allegations, right?
People are like, I don't know.
Did Woody Allen do it or not?
And I'm like, you know what?
You don't need to be a judge and jury here.
You can just know the whole, you can just know, right?
You can just hear,
this is what this person says.
This is what this person says.
And now you have all that in your head.
And now it's up to you
if you want to watch Annie Hall or not.
I don't give a shit.
You know what I mean?
But like, you have a sense of like
what the overall picture is.
You don't need to,
you don't need to come down
and say, here's my opinion on it.
You can just have a survey of, here's the dialogue.
And that can be really valuable.
I think we underrate how valuable that is.
So let me sort of take, yeah, bring it to an even more uncomfortable place.
But let me take that example because there's something relevant in that example.
One of the things we saw early with early Me Too allegations is, you know, an allegation would come out.
And then suddenly Hollywood would race to dissociate themselves from this person.
Right. And everybody would say, look at this, look at this: just one person says something, and then everybody is just throwing this person overboard.
And what happened almost every time, right?
What happened almost every time?
You learned that actually, no, it wasn't this one allegation.
That the allegations had been circling around in that community for years and years and years and years.
Decades in some cases.
Decades, right?
And so when we think about,
now this is a kind of a weird environment because no one's being public
about where they stand on the issue.
But when you actually look at some of the people
that are very quickly dissociating themselves from it,
one of the things you might be thinking about is,
hey, if Tom Hanks is throwing this person overboard,
like, you know, maybe Tom Hanks has some information that I don't, right? You know,
you know what I mean? So I think, and I'm not bringing SIFT to that sort of thing, it's not something I'm going to promote. But one of the things I notice now when these allegations come out is, you know,
when a bunch of people move very quickly on this who have knowledge of this person, you
know, I read that as a signal, right?
I read that as a signal.
But to your point, too, one of the things you can say is that you can simply have the position: look, everybody stand back. Let this person tell their story.
Yeah.
No.
And do not get in the way of the story.
And maybe your position should be that when people try to shut down that person telling their story, stand up and say, no, take a seat.
Listen to what this person is saying.
That is what we do during this period.
Yeah, because that doesn't mean you have a position on that.
Yeah. People have this tendency to jump to, oh, are you going to pronounce them guilty? Are you the jury? Are you going to ban them from the entertainment industry? And it's like, no. And they say that in order
to shut down the conversation, in order to say, don't listen to the allegations. And what I'm
saying is, and I think what you're saying is, no, let's listen to them and hold them in our minds.
And then we can listen to what the person says in their own defense. And guess what? We're not
sitting on a jury. Now we just know these things and they can influence our behavior.
We decide how they can influence.
But hearing it out and understanding, you know, and really holding, this is what this person says. Because there's a trick being done there, right? There's a trick being done, and it's being done actually at the expense of the women and men bringing forward the charges, right? It's being done at their expense. The trick is,
because we're immediately moving to a decision, that we have to enforce these rules of absolute evidence, right? I mean, that's the trick that's going on there. And of course,
we don't have all the evidence yet. We haven't had the conversation. We're trying to have the
conversation to find out, you know, was this a pattern of behavior? You know,
did other people experience similar things? And then you have this other side that's immediately saying, well, what you're saying is this, right? You're trying this person and sentencing them, you know. And so it's important to think about what you're trying to accomplish with these things. I think you may
do yourself a disservice sometimes if you start engaging with people on their terms like that.
Yeah. I mean, you'll have a situation where supposedly a pattern of behavior has been happening for 10 years, and people just want to have a discussion about it for a few weeks.
Yeah, like that is not, that's not an absurd claim, that's an absurd request, right? You know, so I do think that, yeah, and I really want to stress this is not what SIFT is about at all, but I think you're right. This idea that to even express a sort of, well, you know, that seems fishy, that you have to have ultimate evidence for that at that point, right?
Yeah. As
though having the discussion were sentencing someone as guilty
and therefore we shouldn't have the discussion.
No, we're gonna have a discussion about it.
Like, let's do it and let's listen and let's observe.
And it's okay for that not to result in some sort of,
like there's a demand for let's go guilty
or innocent at the end.
Or to take another example, with any kind of lab leak hypothesis, anything like that.
It's like, OK, now that you've read the article, decide: which do you think it was, or not?
It's like, no, you can hold some uncertainty and still have learned something and still have, you know, that impact your behavior in an interesting way.
OK, let me end with this question.
I do want to know, again, it feels like a minefield,
our current media ecosystem, social media, the internet.
However, there's so many good things that came with it.
We have so many new voices that are heard
and there's, you know, there's debunkings and kinds of information getting out that were not in the past, when we had a more gatekeepy, you know, mass media landscape. So do you feel generally
optimistic or pessimistic about our new communications world that we live in? Do you
think that it's rife with misinformation, or do you think that, you know, the internet and all of our digital communication gives us better tools to combat it? Or do you not
feel one way or the other? I don't want to pin you down.
Well, I mean, so we talked about uncertainty and, you know, here's one where I'm uncertain. I mean,
we do, you know, people who look over the course of history look at the introduction of other major technologies, the introduction of print being, you know, one of the big analogs. And, of course, after print was introduced, there were an awful lot of possibilities. There were an awful lot of problems. It took a while for people to figure out, hey, what sorts of institutions, right? What sorts of skills, what sorts of oversight and gatekeeping, if you want to call it that, do we need around print?
And so you start to get, you know, publishers.
You know, publishers start to have reputations, right?
People learn to read the reputation of publishers as a possible indication of the reputation of the published person. You start to have scholarly societies who print, you know, scientific tracts, but then develop ways of having public conversations about the thing the person put out, right? And doing that in public, back and forth. So,
you develop all these social mechanisms,
and some of them are institutional,
and some of them are educational, right?
Helping people navigate, and you need both.
And I think there's still a lot of potential, I mean, in the level of access to the knowledge that we have and our ability to get different stories out, to tell different stories. But the technology has run ahead of the institutions, and it's run ahead of our intuitions about how to approach it. And in that gap, we're in, you know, we're in a pretty tenuous
place. Yeah. And so I think the idea here is not, again, to come back and say, oh, well, this set of solutions solves it all or whatever. But I do think that we have to try to move as quickly as makes sense to try to get people the sort of education they need to navigate this new environment, and try to figure out what sort of institutional changes, oversight, even just accountability we need. What's a person's accountability in this new environment
where everybody is suddenly a publisher, but is not generally held to publishing accountability,
right? We need to get to some of those answers as quickly as we can while still, you know, not shutting
down the discussion.
Yeah, well, and educating people and creating a culture of, you know, of people who are
able to think through these things and have, you know, the right defense mechanisms and
the right habits to help them sort through the incredible amount of information is going
to be incredibly key.
And I thank you for doing that work and for coming on to talk to us about it.
Okay. My pleasure.
Well, thank you once again to Michael Caulfield for coming on the show. I hope you enjoyed that as much as I did. If you did, hey, please leave us a rating or review wherever you subscribe.
It really does help us out. And more importantly, tell a friend or family member about the show.
Share it with a loved one.
That is the way that you can pay it forward, both to me and to the friend who you'll be giving the gift of factually to.
I want to thank our producers, Kimmy Lucas and Sam Rodman, our engineer, Andrew Carson, Andrew WK for our theme song.
The incredible folks at Falcon Northwest Gaming PCs for building me the wonderful gaming PC that I'm recording this very episode on.
You can watch me stream video games and whatnot at twitch.tv slash Adam Conover.
By the way, right now, and maybe still when you're listening,
if you listen to this in a couple months, maybe I'll still be doing it,
but right now as this comes out,
I am hosting a live stand-up comedy show every Thursday at 6 p.m. Pacific.
You can find it at twitch.tv slash Adam Conover.
I bring some of the greatest comics from around the world straight to my Twitch stream.
You can find me anywhere else you get your social media at Adam Conover.
You can find my website, my mailing list at adamconover.net.
And that is it for us this week on Factually.
Thank you so much for listening.
We'll see you all next week.
That was a HeadGum Podcast.