Your Undivided Attention - The Spin Doctors Are In — with Renée DiResta
Episode Date: May 7, 2020
How does disinformation spread in the age of COVID-19? It takes an expert like Renée DiResta to trace conspiracy theories back to their source. She's already exposed how Russian state actors manipulated the 2016 election, but that was just a prelude to what she's seeing online today: a convergence of state actors and lone individuals, anti-vaxxers and NRA supporters, scam artists and preachers and the occasional fan of cuddly pandas. What ties all of these disparate actors together is an information ecosystem that's breaking down before our eyes. We explore what's going wrong and what we must do to fix it in this interview with Renée DiResta, Research Manager at the Stanford Internet Observatory.
Transcript
I was looking at some of the Michigan protest activity.
The armed standoff had happened yesterday.
One of the interesting things was the extent to which those protests had been co-organized by the anti-vax movements, because they've been holding protests at state houses for five years now.
That's Renee DiResta, a disinformation researcher with the Stanford Internet Observatory.
We've had Renee on the show before.
She's seen all of the things that Russia did to manipulate the elections in 2016.
I highly recommend you check out that episode.
Today we have her on talking about coronavirus conspiracy theories and what different state actors and non-state actors are doing to amplify them.
Anytime there's a new vaccine law that pops up, they have a whole caravan system set up and all these things are in place.
And a lot of their ability to kind of put on a spectacle is front and center right now.
And it's really remarkable how the press just kind of keeps falling for a lot of the same stuff.
I'm Tristan Harris.
I'm Aza Raskin.
And this is Your Undivided Attention.
So, Renee, a lot has changed since the world has turned upside down.
You've been tracking how the information environment has gone totally haywire
and especially manipulated by foreign actors, bottom up, top down.
Give us a map of how to think about what's been happening in the coronavirus landscape.
Sure. So I'm looking both at overt media as well as the more kind of covert troll farm stuff that people think about when they think, you know, disinformation research. I've been looking at top down versus bottom up. So that's the difference between what comes out of state actors, how do state actors communicate, how do those with access to massive audiences kind of comport themselves, versus the more bottom up grassroots clusters of people behaving in unison and spreading information amongst themselves?
You know, I actually got my start doing this work looking at the anti-vaccine movement,
which is, of course, all of a sudden, highly relevant again in very, very material ways.
This is the topic that the entire world is talking about, you know, everyone from anti-vaxxers to the government of China lately. We're trying to understand what's going on in the information environment.
What trends are you seeing that are most sort of concerning to you?
I mean, because we're on this podcast because we're trying to meet a real existential threat.
And the anti-vaccine movement, instead of being, you know, this sort of niche issue that you're an expert on, has become relevant to the entire world's attention. You know, I've been hearing from some of the platforms that there are actually preemptive conspiracy theories about a future vaccine, getting people to avoid taking it. What most concerns you about this, and what
are the trends that you're seeing and how it's different from the kind of previous ways that
information's been spread? So I would say the anti-vaccine movements have been very well organized
on social media for years. A lot of the large pages had been on Facebook since 2009. And then in
2015, there were a lot of new upstarts that began to call themselves
medical freedom and health choice. So medical freedom and health choice are euphemisms that
they've been using for quite some time. And what that translates into is I have the medical
freedom to refuse vaccines, you know, just purely by virtue of like my own free will as a person.
The way that governments have traditionally come down on that is like, well, then, you know, there are certain affordances that you don't get, right? You can't send your child to school. That's the social contract, the social balance. With COVID-19, of course, we are hearing from so many highly reputable voices that a vaccine is going to take quite some time; they are working as fast as they can. What the anti-vaccine movement has chosen to do
in a very, very coordinated and deliberate way is begin to undermine confidence in the vaccine program
in general. So they see this as a fantastic opportunity because if you search for something about a
vaccine or a treatment or something else, you're going to encounter nothing for a while, right?
You're going to encounter very, very thin search results. And so this provides an opportunity
for determined individuals to begin to try to lay claim to the information landscape so that
when you search the name of, say, a researcher who all of a sudden, you know, inadvertently becomes
a public figure just by virtue of the work that they're doing or a new drug name, you know,
how many people knew what hydroxychloroquine was two weeks ago or a month ago? And so what you
see is there's an opportunity for people to create content to kind of fill those holes. And
that is something that the anti-vaccine movement has prioritized for quite some time. I wrote about
this in 2018 in the context of the very routine vitamin K shot, which is not a vaccine at all. It's just a vitamin shot that kids get to prevent brain hemorrhaging after they're born, because we're born deficient in vitamin K. And so unfortunately, for a long
period of time if you searched Google for vitamin K, the results that you would find were not from
the Mayo Clinic or Children's Hospital of Philadelphia. Instead, you would get The Healthy Home Economist, some random woman who just writes a blog post about how she's rejected the vitamin K shot, as if this is a noble freedom-fighting thing to do as opposed to an incredibly risky
decision. But that content becomes popular. It's widely shared on Facebook. It's widely shared on
Twitter by the sort of networked anti-vaccine movement. They all link to each other's stuff in a very
coordinated way. And so that was dominating search results for that term. Unfortunately, what we see
there is those articles being shared in groups where children do, in fact, actually die of brain
bleeds because there are consequences to these things. And so that was something where Google and
the social platforms really began to pay a lot more attention to this in late 2018, early 2019. And so the policies that have been put in place to try to surface authoritative health information really came about as a result of a number of these extremely tragic situations, and then also the Brooklyn measles outbreak and the Samoa measles outbreak in 2019, the latter of which had, I think, about 80, 81 children die. And so this is
where, again, the consequences, the stakes are pretty high. So they're trying to prevent that from
happening, but you know, you're fighting with people who are absolutely determined to create content,
whereas the pro-vaccine side, the pro-public health side, particularly the institutions, are just really bad at fighting that battle.
They're really bad at understanding how to create compelling, authoritative, interesting content that's going to perform well on YouTube's algorithm or Facebook's algorithm, because it just doesn't inspire people to share it.
You know, there's no natural peer-to-peer virality there.
Nobody's like, wow, this is the best meme I've ever seen from the CDC. Let me go share that.
And so that, as silly as it sounds to think that the CDC needs to understand how to create content, you know, made for Facebook or Instagram, unfortunately, that's where people look for information.
And so there is a need to get our institutions out of long-form scientific white papers, perhaps, and make them realize that they have to be filling this void before the other side does, unfortunately.
Just trying to help people understand the through line over the last five years, I find it hard to do that without pointing back to the fact that it was so obvious that this was happening.
And there was just no precipitating event that really made it, you know, a crisis-level thing that needed to be fixed, until Samoa, maybe Brooklyn to some extent last year, but then really, really this pandemic.
There's something you said on the Lawfare blog that just, I just want to quote it,
and that is, this is not a narrative issue.
This is a systematic manipulation issue where anyone, anywhere,
with any kind of message can manipulate a series of amoral but effective and dumb algorithms
to create a perception of widespread popular belief and consensus.
You're saying that state actors, more so than individuals or even terrorist groups, are using this to warp what we believe to be true, how we make sense, and how we ultimately act.
A small fraction of a group, even in networked activism, even in state media, creates the content. And then the question becomes, how do you enable it to achieve massive reach? And that's one of the things that the internet fundamentally transformed, right? That, I think, is the answer whenever anybody asks, like, what's different now? It's that participatory nature.
It's that process by which people are actively engaged in the dissemination process. So,
anytime you see a meme that you like and you forward it, you are participating in that process.
And so I'll talk briefly about the China stuff, the top down piece, because I feel like that's
also a really interesting phenomenon in which bottom up and top down have kind of come
together in some ways. So the original Chinese content about the novel coronavirus, beginning in
January, they were calling it Wuhan pneumonia. And this was their official state media,
you know, CGTN. We watch a universe of Chinese state media pages. They're on Facebook; anybody can go read them. They spend a lot of time producing English language content because they want to reach the world. If I have the stat right, CGTN has 96 million page likes, while CNN, I think, only has 32 million. So their reach is actually massive. Yeah, with CGTN in particular, they've got about 99 million followers at this point, I think. And so this is just, this is Chinese state
media, the same way RT is Russian state media. So there's no subterfuge around what they are; it's what we would call academically white propaganda. The attribution is quite clear.
Well, the one thing, not that I mean to interrupt you, but can you explain for people that the country of China is not on Facebook, and yet they're actually massively on Facebook for the rest of the world? This is a very important fact I think people don't actually know. So can you just explain that briefly, too? So Facebook is banned in China, but the Chinese government has had, since about 2013 to 2015, a real strong demonstrated commitment to ensuring
that it is telling the story of China to the world, right? Facebook pages are a great way to reach the
world, especially since you can ad-target on Facebook. And so what many of these pages do is they boost their posts or they create ads. I started looking at these about a year and a half ago. For a long time, the ad content was like really cute, like pandas; they ran kitten ad campaigns; "explore the world with us" was the tagline of one of them. And then gradually what we started to see
is as coronavirus became the thing that the world was talking about, particularly as it became
clear it was not going to stay confined to China, you started to see them use that apparatus
to push out boosted posts related to their handling of coronavirus.
And so, again, these are English language outlets.
They do have regionalization, so they have various other languages that they support as well.
There's an Arabic channel in particular.
So what we see from CGTN and these other Chinese media outlets
is that they run ads to boost their content about coronavirus to English speakers,
and they're doing this worldwide.
So we see them in Southeast Asia, we see them in Africa.
A lot of it is the kind of spin that you would expect from any state media outlet. You know, China did an excellent job of handling this. The president showed up to ribbon cut at this new hospital. We have no new cases. "China bought the world time" was one of the key narratives. China's expert handling of the situation ensured coronavirus didn't spread. Then, of course, it did. Sending PPE to Italy, right, sent with a notice that says: we in China support you in Italy, we're with you. Yep. When the World Health Organization, you know,
unfortunately, there's been some politicization there, of course, as I'm sure many people have seen. As that began to happen, when the World Health Organization made favorable comments about China, China immediately took those, turned them into headlines, and repurposed them, right? So here's the World Health Organization, this global body, confirming, in fact, that China has helped the world here and has done everything right. And so they ran content and they ran ads based on that. So what you wind up
having is these very, very large audiences, but their own people are not permitted on the platform. So it's an interesting dynamic where China has virtually kind of a parallel internet. There is very heavy censorship. The stuff that we see in a democracy, you know, committed to free expression does include these types of content. We do see content from foreign state media. And typically there's a law called FARA, the Foreign Agents Registration Act, where we do say things like there should be disclosures of state media so that people do know that they're receiving a communication from a foreign government or a foreign agent. The purpose of that is to help people have an informed picture of where they're getting their information from.
On social media, it's really interesting because there are just certain design decisions
that impact how clear that actually is.
So if you go to some of the Russian pages, you can see on Facebook there's a little kind
of column on the right-hand side that will disclose to you where the funding is coming from.
On YouTube, they actually put a banner down at the bottom that says this is paid for in whole
or part by the Russian government or whatever government it is.
Interestingly, if you share the YouTube video to Twitter, that banner disappears. So just an interface design challenge, right?
Right, so the way the page is designed on YouTube, that flag, that visual marker indicating that this is a state communication, is lost. And so if you consume the video on Twitter, you don't have that disclosure. It's actually an interesting example where the tech companies could make a small intervention, which is: when Twitter or Facebook try to grab the thumbnail of the video that you're sharing, YouTube could specifically change the thumbnail to add in a little red notice or something like that. That's one tiny thing that they could practically do to help clarify that it's state media, correct? Yeah, I think so, and that's something that, you know, I've personally kind of complained about before, because it makes it hard for people
to know what they're consuming. We saw China begin to do this. So state media interacts in interesting ways. RT will pick up stories of unrest in various parts of the world: the yellow vests in France, the Hong Kong protests, the U.S., of course; I'm sure they're going to be covering what happened in Michigan yesterday. There are a lot of these opportunities that they have to kind of amplify the appearance of chaos. And it's interesting seeing kind of which side they come down on. So during the Hong Kong protests, they were creating video about the unruly protesters who were clashing with police. Not China, RT was creating this content, on one of its sort of millennial video-making pages. I think In The Now was the one that did this. And then they were being tweeted out. And so if you were just looking at Twitter,
you would see this thing that was sort of like, you know, it was like a nested doll of bullshit.
It was, you know, this account run by a Russian registered agent, not declared in any way, with a video that's on YouTube. You can tell it's Russian on YouTube, but not on Twitter, and it's going viral on Twitter. People are sending it to me,
like, hey, I was just in Hong Kong. This is totally false. Like, where is this coming from?
You know, and I'm replying back, like, well, that's literally Russian propaganda. Someone was like,
I bet it's Russian trolls. I'm like, no, it is. It is. It's exactly what it is, in fact.
Not trolls in the surreptitious Internet Research Agency sense of the word, but there's a real question of how you enable people to understand that only part of online manipulation is coming from the fake accounts and the bots and the trolls, because there is this whole other side to it, which is overt in the nominal sense, but isn't necessarily as widely understood as it perhaps could be.
And Twitter has taken some action here. I think Chinese state media was running ads against the Hong Kong protests, sort of disrupting them. And Twitter actually banned those ads, but I don't think Facebook or YouTube has, even though they're financially benefiting from them. Right. Facebook and YouTube do still accept money from state media to run ads.
Twitter decided not to. I personally think not to is the right call. I feel like there's a difference between allowing someone to operate on your platform, where, again, the audience is proactively going and looking for it, as opposed to it being pushed at an audience that has indicated no interest in opting in by doing anything other than clicking in a certain way or living in a certain place. With the specific stuff that's going on,
I realize I haven't even explained the narratives. The battle going on right now is around the
origination of the coronavirus, which as I mentioned in January, they were calling Wuhan pneumonia.
That was because it was a new novel type of pneumonia emerging in Wuhan. So before they
sequenced it and understood what it was and before it had its name and everything, if you were
looking at the posts that these media outlets were creating, they were trying to, you know,
they weren't covering it up. It was, you know, the extent perhaps was covered up, but it was really
being spun in a very positive way. There's this new emergent form of pneumonia in Wuhan,
but here is how we are protecting our people. Here is how we are treating this. As it became a bit of
a disaster, they began to build hospitals. Again, even the spin on the hospitals was that this was
Chinese ingenuity and engineering.
Everyone remembers these videos.
It's like you have a hospital going up in what like days.
Yeah.
Look at the miracle of Chinese engineering and capacity that they can make this happen.
Look at our governance model.
Don't you wish you lived in an authoritarian country like ours where we can actually
build stuff like this without any hurdles, building codes, signoffs, et cetera.
No red tape.
We just get it done.
And everyone looks at that and says, gosh, in the West, our democracy is just not
producing anything.
That pothole's been there for five years and no one's done anything about it. Meanwhile, China's building something in just, you know, a day. Can your country do this? So there is a lot of that kind of positive spin. But then what became really interesting is the "some people are saying" phenomenon of propaganda. Around the same time, in January, of course, there are conspiracy theorists in China as well, and there are these message boards and, you know, various online communities where people share theories. And shortly before the virus emerged, there had been the World Military Games, which were held in Wuhan. And there was an American delegation, including a handful of people, one of whom, I think a cyclist, worked at Fort Detrick. And Fort Detrick has a, I'm not going to overreach on my knowledge here, but a sufficient bioresearch program, a medical containment level four type environment. And so the theory in the fever swamps of China's internet was that the U.S. soldiers had in fact brought it over.
This was, in fact, assisted by a Western conspiracy theorist, whose name I'm not remembering
before my coffee, but who was also saying, like, I bet this is a bioweapon.
Anytime there is any emergent disease anywhere, the bioweapon conspiracy comes out.
That's something people have to understand.
This is not new.
It happened with Zika, Ebola.
I mean, you name it, SARS.
There has been a bio-weapon story about it.
And most people don't know that the origin story of AIDS as a U.S. bioweapon came from which country?
That was Russia. Yeah, it was Russia. The CIA created AIDS in a laboratory and it escaped. They ran that story in just regular, routine newspapers, right? That was narrative laundering through newspapers, because this was during the 1960s through 1980s. They kept that one going for a while. When AIDS emerged in India and Pakistan, the same papers that had alleged it had been created by a U.S. government lab then turned that into "and then it was released there," right?
And so it became the gift that really kind of kept on giving for them for a while, as they just inflected the narrative each time for each new country to serve their purposes.
So what we saw with this particular situation, so the early emergence of the bio-weapon theory is coming out of Chinese fever swamps.
And then you start to see Chinese diplomats on Twitter, blue-checked Ministry of Foreign Affairs guys, saying things like, people are wondering if it's true that the U.S. released this virus in Wuhan, right?
Meanwhile, contemporaneously, you have Senator Tom Cotton asking the same questions about the biolab, the Wuhan laboratory, so now that same narrative is percolating here too. And so, you know, we're inclined to think, like, oh, the Fort Detrick thing is just garbage, right? You know, we live in the U.S., and that's just not a thing that would have happened or that we would have done. But we take very seriously the idea that it was, in fact, a bioweapon that escaped from a Chinese laboratory. And that kind of inclination is flipped, of course, in China, where they'd be inclined to think that it's a U.S. bioweapon. Of course it wouldn't be ours; you know, we couldn't possibly have made something like that.
And so that phenomenon of just asking questions, absent any evidence, it's really hard to challenge
that because when Tom Cotton was challenged, as you saw play out in the U.S., the response
is, well, why shouldn't we be asking these questions?
I mean, these are reasonable questions.
We should, you know, we all have a right to know and we should want to know.
How much should elected officials with strong, influential voices be speculating about these things absent any evidence?
But what wound up happening with the China bioweapon story, the one where we created the bioweapon, is that besides the kind of blue-check Twitterati, and this is, again, a very controlled environment, so for that guy to just kind of go off the rails like that was interesting to see, Chinese state media picks it up also and begins, as they're pushing out their content, to frame the origin of the virus as a matter of debate. So no longer is it the Wuhan pneumonia from January. Now it is: did it come
to Wuhan from outside? Early on, the question was more like, this is a zoonotic disease,
what animal did it leap from? There were a couple articles about that debating whether it was a
pangolin or a bat. And then gradually, now it's, did the U.S. bring it here? And so there's this
complete shift in tone, you know, if you look at the headlines from January versus the headlines
from late March and mid-April, real demonstrable difference in the acceptance of the emergence
of the virus as being from China.
And so that's a sort of remarkable shift in which they gradually socialize it through subtle changes in headlines and framing over time, until it gets to the point where the past has been rewritten.
They're sort of like rewriting the present in real time.
There's just this constant steady stream of content that you can push out that just nudges
people a little bit each time in a different direction. So that's one of the interesting dynamics
that we've been seeing is the state media outlets have phenomenal reach, the random guy on the
message board and God knows where doesn't, but the process by which these things are sort
of picked up and legitimized through what, you know, we've been calling blue check disinformation,
right? Or state media propaganda, again. What are the ways in which platforms respond to that, right? What does Twitter do with the Minister of Foreign Affairs from China when he begins to speculate wildly? Speculating wildly is not saying this came from the U.S.
It's saying people are saying, people are asking, did this come from the U.S.? That's a different
thing in terms of certainty. And so it does tend to kind of bump up against enforcement loopholes.
And I think that's been a real interesting challenge for the platforms.
It's just a question. I mean, how harmful can a question be? It reminds me of hypnosis. You know, part of working with people is that you embed suggestions in questions.
To even get people's minds to go in the direction of a question is setting up a bias accumulation
machine that says, hmm, maybe I should be looking out for that anywhere in my sort of
attentional field, anything that looks close to that, there's sort of a snap to grid.
Oh, yeah, that kind of confirms something that was in that question area.
I think what's hard about this is that this is an incredibly complicated topic.
There may be reasons why, you know, something was leaked from a Chinese lab, not a bio weapon,
but that there was research going on there.
There's, you know, maybe Tom Cotton did have, you know, intelligence briefings on that.
Maybe he didn't.
However, I think what's so confusing about this moment is that all of the cues that we would use to know what has legitimacy and what doesn't, all our normal mechanisms, are so confused.
And I think, you know, amplified by the fact that we're actually all socially isolated and at home.
So now, as Aza says, you know, when we're looking out at the world, no one's actually on the streets, you know, walking around or in the hospitals.
Like we're all kind of looking through this tiny telescope that's our screen or our phone, you know, while stuck at home. We don't know what's actually going on out in the world.
Even journalists who are sort of out there in the world, most of them are actually at home making phone calls.
So we're all disintermediated from the direct raw evidence of what is happening.
So we're even more easy to manipulate because we're all stuck in our pajamas.
Yeah, I think that's absolutely true.
I think there's also, you know, Western media is not immune, even the non-state media, right, even just our regular media, where you see the headline framed as a strategic question: did so-and-so do this, did so-and-so say that? Somebody told me once, I'm trying to remember who it was, I think it was a journalism professor, that if the headline is phrased in that way, the answer is almost always no. Did so-and-so do this? No. Because if he did it, we would know, and we would have written the headline as "so-and-so did such and such," right? But the question is much more successful at getting people to click on it. Exactly.
You're producing a curiosity gap, which is to literally make someone aware of a gap in their knowledge that they don't actually have filled. Once you make them aware of it, you increase the motivational spike in the nervous system to want to close that gap, because I don't want to know that I don't know something; I want to know that I do know something. And so it's sort of a one-two punch of curiosity gaps. As the pressure against platforms goes up because there's so much costly misinformation, where you have people in Iran drinking, I think it's methanol, and dying because they were told false information, I mean, literally hundreds of people died and thousands got sick and went to hospitals because of fake information that was spreading,
as the pressure on platforms increases because these conspiracy theories are killing people.
Another example is 5G.
The conspiracy theories that coronavirus was linked to 5G isn't just this kind of fun kooky thing.
It actually caused real people to blow up cell towers in the UK.
And so once physical violence is actually triggered in the real world as a result of it,
you have this dynamic where tech platforms are actually forced to respond.
So they actually have to start taking down things like the 5G cell-tower conspiracy theories, which is what YouTube did. But when that happens, you have this kind of blowback effect where now
the conspiracy theorists say, see, there is this kind of distributed idea suppression complex
that's taking down these real things. That even proves our point of how right we are.
And then you have platforms who, you know, not wanting to be the moral arbiters of authority or
truth, say, okay, well, we're just going to follow whatever the WHO says. But then you have this
dynamic where the WHO actually flips its advice. So one week they're telling everybody, yes,
definitely wear masks. Everybody should wear masks. The next week they're saying, no, no, no,
don't wear masks. Secretly, it's because we need to make sure those masks are going through to hospitals.
This is more the CDC than the WHO. And so the weird dynamic is that platforms are forced to defer to authorities. They say, don't look at us, look at whatever the CDC says. But then the CDC is also vulnerable to political things. And then no one knows what to trust, and people are rightly skeptical, kind of everyone looking around. So it produces this mess. And you see those kinds of dynamics.
What do you think of that?
Yeah. I mean, that's really the challenge for the platforms, I think: curation. And so what I mean by that is, we've talked about, you know, content creation, filling the void, dissemination, people becoming part of the sharing process. In the early design for deciding what information we see, you know, there's such an information glut, which of even our friends' posts do we see in the feed, there was the idea that you could derive signal from engagement, right?
What did most people like? What did most people engage with? You were, you know, doing this bubbling up of stuff that people should pay attention to, right? There was some value in that signal. And the thing that was never incorporated into that signal was any kind of sense of authoritativeness. And indeed, that was sort of anathema, right? If you remember, back in 2015 there was something Google debated very briefly; they wrote a paper. It was an academic paper; it was never worked into search. But it was the idea that you should incorporate some sort of authoritativeness into certain sources.
And Google did eventually develop a framework for search through human review
called Your Money or Your Life,
in which it began to say,
for health and financial related queries,
we have an obligation to return something with a higher standard of care
than just what's most popular.
Right.
And interestingly,
that applied to Google search,
but it wasn't applied to YouTube for some time after.
And that's because YouTube was seen as an entertainment platform.
So there's a difference in where you go. You know, in the olden days of like 15 years ago or even 10 years ago, the idea that you would go to Facebook to get your news would have been crazy, right? You would still read your newspaper to get your news, or you would go to a news.com type site.
When the platforms became these sort of all-encompassing,
when it moved from like entertainment and friends post and weddings and babies to also,
here's how you get your health information, right?
Like that's a pretty profound shift.
Ranking which of your photographs I see based on how many people have liked them is probably still a reasonably good heuristic for which of your photographs I should see. Ranking a news article in that way, it's not as clear; really, I would say it's quite clear in the opposite direction at this point. So the curation problem became,
how do you decide what to surface for people? And that's the problem that the platforms have
today. And so some of the things that we developed around the measles outbreaks in 2019
was the idea that you could point to the World Health Organization and the CDC, right. And there are a lot of kind of nuances to that decision. You know, I thought at the time that pointing to more local pediatric hospitals made sense: they produce better content, they're more trusted in the local community. You have to surface information that people trust. The problem is the accumulated erosion of trust in media in general, and, in problems that are bigger than tech platforms, the erosion of trust in authorities. Really, a lot of this is rooted in the exposure of the fact that the authorities didn't behave well in the past. The Vietnam War is kind of the canonical example of this, right? That what you hear in a
government press conference and what you see with your eyes on your television are two different
things. And so the question became, how do platforms decide what to curate? How do you do
that in a time when institutional authority and emergent authority are not necessarily the same
thing? So you have the CDC and the World Health Organization, which as you noted, the latter
unfortunately really got quite political in this particular situation versus the kind of
Blue-check scientists and frontline doctors who are hanging out on Twitter writing tweet storms,
explaining the news to people every night. How do you surface those people? How do you find those
people? How do you validate and vet which of them are worthwhile to surface, which of them are
offering authoritative content? How do you amplify those voices in such a way that you are
making affordances for emergent authority or timely authority rather than just pointing to
institutions that, you know, just circling back to the very beginning of the conversation,
aren't communicating in the way that people have become accustomed to communication.
So when the whole mask-gate thing was going on, you know, I'm not in any way an expert on any of the epidemiological topics at work here today. So I thought, oh, well, okay, let me go look. At the same time, I'm very suspicious when people are saying, like, they don't want you to know, they're keeping this from us because they want to save the masks for workers. That also is overly simplistic and too conspiratorial. I believe in the "never attribute to malice what can be adequately explained by incompetence" kind of view of institutions, right? And so I went and I googled around for a while.
And I found on CDC.gov the guidance for SARS, which was also no masks.
And so in 2012, they have this guidance.
And they very meticulously, it's an extremely long write-up of transmission modalities,
why they think masks are not the most effective thing, you know,
why they are still going with hand washing and a variety of other prescriptions.
None of this has anything to do with a shortage of PPE.
This was their guidance, and they didn't update their guidance.
They gave the same guidance as back in 2012.
And I thought, you know, I wonder if this is more a function of the COVID transmission
mechanisms are still so new, right?
There are unknowns here.
How do you expect an institution to move that fast? In a scientific body, they're waiting for months for new, validated, kind of peer-reviewed information that meets a certain standard of scientific rigor. That's the timeline that the CDC is operating on.
The timeline that Twitter is operating on is like, well, 30 seconds have gone by. Like, you know,
where's my new information? I'm sitting here hitting a refresh. I'm not finding anything new.
And so again, you have that interesting challenge of the data void, which is that somebody somewhere will step in and tell you what you should think about masks,
because some percentage of them
will have actual knowledge about masks
or will have digested the research
and can articulate the mask policy.
And then some percentage of them
will just be random people chasing clout
looking to collect their likes and retweets
and new followers.
And so this is where you start to see
the kind of popping up of things that go viral
because they just appeal to people's prior biases.
They decide that for some reason
this was retweeted by someone I trust,
ergo I should trust it,
ergo I should retweet it.
And that's how you would see these nonsense Medium posts written by, you know, people with no more knowledge or authority than me talking about these scientific topics based on, like, well, I'm a growth hacker, right?
Well, it's been very interesting to go into this topic because, you know, there's a famous article, The Hammer and the Dance, and also the original Flatten the Curve article, written by Tomas Pueyo, who is a French Silicon Valley tech engineer who I think had worked at Zynga in marketing and saw exponential curves. He knew what exponential curves look like, and he knew that most people don't have a good intuition for exponential curves.
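To make that lack of intuition concrete, here's a toy illustration of my own; the numbers are assumptions for the example, not from Pueyo's article. If cases double every three days, a seemingly small outbreak becomes enormous within a month:

```python
# Illustrative only: exponential growth with a doubling time of 3 days.
cases = 1000
for day in range(3, 31, 3):  # ten doublings over 30 days
    cases *= 2
print(cases)  # 1000 * 2**10 = 1,024,000 cases by day 30
```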
And so he wrote this article that became the canonical way we think about it, followed by this other article, The Hammer and the Dance.
The idea is that the hammer is the way you sort of do the lockdown, and then after that you have this sort of dance.
You're going to have little brush fires of infections coming up
and you have to kind of dance with it.
Very viral, very powerful communication.
There are many people in the tech industry who were saying this virus is going to be a bigger deal, when all the institutions were saying it's not a big deal, and Cuomo and people in New York and governors, you know, were saying keep going out, keep going out to bars and restaurants. And so there's this weird thing: well, who is the authority? You know, why do we have these tech people chiming in, assuming they know the answer? They don't know anything about epidemiology. Meanwhile, they were some of the few people who were actually calling this correctly. And so that's where this kind of crisis of trust, I think, really comes in: who do you know to trust when it's so confusing? And in fact,
Jeremy Howard, who is another tech person, from fast.ai and Kaggle, was also one of the founders of the Masks for All movement.
He went through and he did the sense making and really got deep into the literature.
Yeah, so exactly, where do we find this seat of authority?
Yep. And then on the flip side, you had the ones that were woefully wrong, right? You know, this bleach cure thing that popped up recently because of the president's comments: there were groups dedicated to consuming this form of bleach, called MMS, Miracle Mineral Solution, as a treatment for autism for quite some time. And it took a really long time. Yeah, MMS is,
I don't even want to go into the, it's gruesome. It was actually bleach enemas, but it was the theory
that autism is caused by gut parasites and you could use this bleach to, you know, solve your child's
autism. And I mean, it was kind of tantamount to child abuse, actually. It's a really, really,
really terrible situation. For a long time, the platforms didn't know what to do about it. One of the guys who is,
you know, one of the grifters who sells this stuff,
reorganized himself as a church, right?
So now it's a religious, you know, treatment, on par with using any other sort of altering substance for a religious experience; that's how they tried to reframe it.
And that, of course, then if you take it down,
you're violating somebody's religious beliefs and, you know,
there's all sorts of loopholes for this kind of shit.
And so...
So that's an example of trust hacking.
Now I'm a religion and now it's all sanctioned.
And it's kind of the same frame control, you know?
Is it the CDC? Is it an official institution? Is it a religion? And these are all ways of
reframing what is not trustworthy. Well, in that particular situation, Brandy Zadrozny, I think that's how she pronounces her last name, over at NBC, had done some really great exposés on the MMS groups on Facebook. And they did eventually begin to come down.
Amazon used to sell you not only the book on, like, what the stuff was, but, you know, you'd get the referral to recommended products: and here's the bleach right here, buy it with one click. Basically, that used to pop up on Amazon, and British health authorities, I think, went after that.
But there were a lot of these sorts of things where, again, it became like, what do you do about people exploring this?
Of course, after the president says it, you start to see a rise in searches for that topic, searches for, you know, what is the bleach cure? And so there's just this immediate spike in interest.
And then the MMS people pop back up to say, you know, see, we had the cure all along.
We were being censored by big tech.
we were being, you know, disparaged in the media,
but the president of the United States realizes
that there's some benefit to ingesting disinfectants.
And that's where we are today?
So looking ahead, like, where is your attention going to go?
How are you, where are you going to be looking?
What are you watching for?
What worries you?
And what solutions should we be adopting?
Because I think also, you know, we're here
because we want to change this for the better
where we're not just trying to complain about the present.
What also have you been recommending to tech platforms as you think about going forward, including with the 2020 election?
Yep. So with the platforms, Alex and Vanessa and I wrote this op-ed on how to think about state media and not forgetting that particular vector for the transmission of disinformation. And with that, you know, it was really advocating for better disclosure, honestly, everything from getting that interstitial to show up on Twitter to maybe saying, hey, state media doesn't run ads; maybe we apply Twitter's policy to the rest of the social ecosystem. I think it's a good policy. With the bottom-up stuff, I mean, the thing I keep telling them is: as the vaccine development progresses, as the treatments progress, it's not just vaccines, it's anything remotely related to a pharmaceutical product. Remdesivir, I don't know how to pronounce that one properly, but the one that Gilead produced is in trials right now. Any research organization, any drug name is going to be immediately seized upon, and they will absolutely try to ensure that the top-rated content for it is whatever kind of popular conspiratorial crackpot stuff they can get to the top of search results. So they, the bottom-up conspiracy groups, are working as hard as they can to kind of own the narrative space, and the platforms have to be cognizant of what those developments look like. There's also going to be a real
risk of harassment to the people who are working on that particular, you know, working on the
vaccines or working on the implementation. Those people are going to be doxxed. They're going to have their faces blasted all over the internet. You know, it's going to be the same truther kind of thing: so-and-so is connected to such and such by six degrees of separation, and then George Soros is behind it all, right? You know, that kind of thing, Gates is behind it all. That dynamic is absolutely
going to happen also. So there's these things that we have seen happen in enough other
information crises that we should be able to preempt at this point. As far as other work that we're
doing at Stanford Internet Observatory, we are looking at how does state media communicate about
these things. And that includes not only China, but Russia and Iran and Brazil and a range of
big, prominent global players for whom this has become a geopolitical battle, right?
A battle that they have to fight for their status and respect, maybe that's the word, in the outside world, but also with their own people; you know, kind of shoring up their reputation as leaders domestically does require them to communicate in certain ways internationally as well.
And so looking at the dynamics that are taking shape internationally as state media begins to talk about reopening and things that are going to begin to pop up.
I think with the 2020 election, it is going to be impossible to divorce that election from the reopening process.
And so I think that the groups that have laid the groundwork for being more highly visible and proactive in kind of in-person protests around reopening are going to be used, and are going to kind of rally their audiences to behave in certain ways during the political campaign.
We're going to have to see what specific precipitating events, things that have yet to develop
are going to play out to see how that becomes part of the campaign.
But I think that we should absolutely expect it to be part of the campaign.
And then, again, there's just the basic feature-level manipulation, the kind of run-of-the-mill stuff that we were seeing during the Democratic Party primary, just ways that weird things still trend, stuff that the platforms have to get a handle on. So I think there's kind of a range of problems, to end on that highly optimistic note.
The image in my head that I have is,
did you see in Planet Earth 2?
There's this very iconic scene where the iguana babies are
racing past all the racer snakes
and they're coming from every angle.
I love that one. Yeah, yeah, yeah.
I feel like the baby iguana is truth.
And it's getting attacked on all sides and in every way by these snakes.
And it's like we're in this mad dash to see, can we make it?
Well, and the baby iguana does make it, right? Where my mind went when you said that was actually the other scene, where the walruses go careening off the cliff. So let's hope that we wind up with, you know, the successful baby iguana, the intrepid baby iguana that does, in fact, make it to the top of those rocks, as opposed to the horrific walrus scene.
Thank you so much for coming on.
We could talk to you for hours longer.
There's so many more aspects to this.
But, you know, I think the fact is that there are no quick, easy answers where you can say, well, look, it's obvious, the platforms should just do blah, and that would just fix all of it. You know, I think everyone has this kind of desire to simplify the solution. And it's very complex and nuanced. And I think it reveals just how much care we have to put into our epistemology of how we know what we know. How can we strengthen each of our sense-making instruments? How can we all not be just eager to contribute with certainty about what we know to be true because we retweeted that guy, when we actually don't know anything about science or epidemiology or how viruses spread? I think if we calm down our information environment so that only those who actually have the expertise are sharing, and we create incentives for that, you know, the question is how we have a stronger bottom-up sense-making environment for a complex problem. Because I think the unfortunate reality is there is no easy top-down authority that can take broad information inputs and then, like World War II hierarchical command and control, push truth right back down the stack, because it's very hard to make sense across such a vast terrain of information inputs. Yeah, I absolutely agree.
I think that the top-down control model is over, right? That's just not the information
environment we have anymore. And so it's thinking about, in the age of decentralized media and personal participation in the process, the kind of creation, dissemination, and then curation pathway, and asking, at each different point, what is the optimal state for that particular process? I think that's where we need to be right now.
Renee, thank you so much for being here. Thank you. It's good to see you.
I know. We have to get breakfast when this is all over, you know, when we can actually leave our apartments again.
Yeah, for sure.
Thank you so much, Renee.
Thank you.
Have a great rest of the day, guys.
All right, you too.
One other area to consider in the intersection between the viral information environment and our viral biological environment is that we're creating the conceptual frameworks inside of people's brains to think about a virus.
So, you know, how many people before coronavirus had heard of the phrase R-naught, or R-zero, which is the number of people who get infected for every person that's infected?
If you think about our information environment
in this epidemiological way,
then information has an R-naught.
A meme has a certain viral load,
it has a certain viral rate.
And for every person who's infected with that idea,
who believes it,
they will infect a certain number of other people.
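As a rough sketch of that analogy, and purely illustrative (nothing here comes from any platform's actual systems), you can model a meme the way epidemiologists model a pathogen: a branching process where each "infected" sharer passes it on to R-naught others on average.

```python
import random

def simulate_meme_spread(r0: float, generations: int) -> list[int]:
    """Toy branching process: each sharer passes the meme to r0 new
    people on average. Above 1 it explodes; below 1 it dies out."""
    counts = [1]  # one initial sharer
    for _ in range(generations):
        new_sharers = 0
        for _ in range(counts[-1]):
            # Crude draw with mean r0: integer part plus a biased coin
            # for the fractional remainder.
            new_sharers += int(r0) + (random.random() < r0 % 1)
        counts.append(new_sharers)
    return counts

print(simulate_meme_spread(0.8, 10))  # usually fizzles out
print(simulate_meme_spread(2.0, 10))  # doubles each generation
```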
And you can go into the kind of wash-your-informational-hands metaphor by saying, hey, instead of just spreading everything, being a super spreader, or being a high information, what do they call it, shedder? Like, you know, when you're shedding a virus. You're shedding biases as you click on things, as you like things. You're shedding biases onto other people, and you're screwing up the information environment for everybody else, the more you contribute when you actually don't know what's true.
I want to push this even a little further, because you're implicitly, once again, putting the onus back onto the individual. But how do you actually flatten the curve? We flatten the curve by staying home. Do we control whether we stay home on online platforms? No, we don't really. We can control how much we say, but it's Facebook, it's Twitter that controls how many people we're in contact with, how much it gets spread. What would a mask look like for this world?
An informational mask. An informational mask looks like there is no one-click sharing. You have to click and then wait. And if you haven't actually clicked through and read the article, you have to wait longer. That's the equivalent of an informational mask. And in fact, it would lower R-naught. It would flatten the curve.
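Here's a minimal sketch of what that sharing friction could look like. Everything here is hypothetical; the delay values and function names are invented for illustration, not any platform's actual behavior:

```python
import time

# Hypothetical "informational mask": no one-click sharing, and a longer
# wait if the user never actually opened the article they're resharing.
BASE_DELAY_SECONDS = 5
UNREAD_DELAY_SECONDS = 30

def share(post_id: str, user_clicked_through: bool) -> None:
    delay = BASE_DELAY_SECONDS if user_clicked_through else UNREAD_DELAY_SECONDS
    print(f"Share of {post_id} queued; it will post in {delay} seconds.")
    time.sleep(delay)  # the friction that lowers the meme's effective R-naught
    print(f"{post_id} shared.")

share("article-123", user_clicked_through=True)  # read it first: shorter wait
```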
Yep, it's funny. And I think about, you know, a virus infects you more if your immune system is down, right? So if I think about fear and paranoia, the kinds of things that conspiracy theories amplify in you: when I'm operating with fear and paranoia, my informational immune system is down, because I'm more likely to have bad stuff come into me, because I'm operating with these kinds of bad biases. And more fear means I'm more likely to be looking out for anything that could be, oh my God, it is a bioweapon; oh my God, China is doing the worst possible thing; oh my God, this was all deliberate. You know, that means the informational immune system for my own brain is down. And then you think about, okay, well, what's the equivalent of, like, loading
my brain with vitamin C and with zinc lozenges, which like increase my immune system
resilience? How can tech platforms distribute the vitamin C for information, that gets us all to be more thoughtful? Here's one very specific example. When a piece of news or information later gets
retracted, instead of just issuing that retraction: Facebook or Twitter, they know when you have seen that piece of news. And they could sandwich it with the retraction, the better information. Every time there's a conspiracy, every time there is a debunked myth, the better information gets shown both above and below it, to take advantage of the fact that the more we see something, the more we believe that it's true. You give it context. That's like one example of an informational antibody.
Yeah, I love that.
And we were talking earlier, Aza, about how Facebook and Twitter are, you know, for better or worse, kind of like the governments of the attention economy, because they set the incentives for what gets, you know, boosted through the feed. And one of the things
they could do is actually kind of give a subsidy, an attentional subsidy, like a boost, a signal
boost, to any publisher who participates in a corrections program, meaning if you actually issue
corrections and updates to things that you were wrong about, and you're the kind of publisher that does
that, you're participating by, let's say, adding a tag to your pages. So every single time there is an update, there's an automated system where you're notifying Twitter and Facebook. Now, Twitter and Facebook can go back and say, hey, we know everybody who clicked, liked, shared,
or even saw that initial piece of information.
We'll make sure that their brain sees the correction more often
than they saw the original information.
So they have a count.
They say, hey, you saw this conspiracy theory about 5G and cell towers and coronavirus, which we know in some official way is not true. You saw it this many times, say two times. So we're going to actually make sure that you see the correction four times.
And they could actually implement that through their system.
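A sketch of the bookkeeping that idea implies, with hypothetical names throughout: the platform already logs impressions, so a correction scheduler only needs the per-user exposure counts.

```python
from collections import defaultdict

# Hypothetical impression log: post_id -> user_id -> times seen.
impressions: dict = defaultdict(lambda: defaultdict(int))

def record_impression(post_id: str, user_id: str) -> None:
    impressions[post_id][user_id] += 1

def schedule_corrections(post_id: str, multiplier: int = 2) -> dict:
    """When post_id is debunked, show each exposed user the correction
    multiplier times as often as they saw the original."""
    return {user: seen * multiplier for user, seen in impressions[post_id].items()}

record_impression("5g-towers-hoax", "alice")
record_impression("5g-towers-hoax", "alice")
print(schedule_corrections("5g-towers-hoax"))  # {'alice': 4}: saw it twice, sees the correction four times
```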
You know, another interesting idea, and this is borrowing from an idea we talked about a couple podcasts ago for what platforms could do to combat deepfakes. And the idea was a kind of attention quarantine, or an attention jail. The idea is that if you retweet or share a deepfake and you do not label it appropriately, the platforms de-platform you: you can't share things for 24 hours. If you do it again, then it's 48 hours; if you do it again, it doubles, and it keeps doubling. The same thing could be used for misinformation or for conspiracies. If you are found to be sharing something that later gets retracted, well, by retweeting it you are tacitly, maybe not even tacitly, endorsing that piece of information. And so you then get put into a little bit of an attentional jail: you can't tweet now for one hour. But if you do it again, a few hours. If you do it again, four hours. It makes a really interesting kind of incentive that makes you stop and think more about what you're going to do.
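That doubling penalty is essentially exponential backoff applied to sharing privileges. A minimal sketch, with the base duration as an assumption:

```python
# Hypothetical "attention jail": each confirmed offense doubles the
# suspension, like exponential backoff in network protocols.
BASE_JAIL_HOURS = 1

def jail_hours(prior_offenses: int) -> int:
    """One hour for the first offense, doubling with each repeat."""
    return BASE_JAIL_HOURS * (2 ** prior_offenses)

for offense in range(4):
    print(f"offense {offense + 1}: suspended for {jail_hours(offense)} hour(s)")
# offense 1: 1h, offense 2: 2h, offense 3: 4h, offense 4: 8h
```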
It reminds me of just the way that countries are dealing with,
you know, if you come into the country with a virus,
then, you know, you can be here.
We're going to put you in a hotel for two weeks
so we can make sure that you're quarantined.
It's like you can be, you know, part of these platforms.
But if you're contributing things that spread an information virus,
you know, we're going to quarantine you.
Obviously, this is more nuanced.
And I want to make sure that our audience recognizes that we recognize that it's not as if there is just this true thing
and we know what the truth is
and here's the thing that was false,
but when we do know that something was incorrect,
we know that, let's say, a study was flawed
and it was importantly flawed in a way that was knowable.
That's a correction.
It's a concrete piece of information
that could be updated to everyone else.
And just to sort of name that we're not trying to create some arbiter of truth, a Big Brother-like infrastructure.
It's about creating decentralized ways to make sure that important updates and corrections to how we make sense of the world reach the people they need to reach.
This conversation is incredibly complex and nuanced
because for any approach you can imagine doing good, let's say limiting the spread of information, there's a cost: then you wouldn't get the kind of viral growth of a Me Too or a Black Lives Matter.
And there's always these sort of gray, you know,
I think this problem reveals the gray zones
and the complexity of these issues,
that there isn't some clear answer.
There's a stat from social movement theory
that you only need three to three and a half percent of a population
for a movement to get off the ground.
This is the foundation for climate movements like Extinction Rebellion, for civil rights.
But the negative side is you only need
three to three and a half percent
to get a really bad movement off the ground.
Like, let's say you want to get, you know, armed protesters to show up at state capitols because they're led to believe that they're about to be put on permanent lockdown by the U.S. military, the National Guard, conspiracy theories or something like that.
You know, the fact that it only takes three to three and a half percent before you get this tipping point into mass movements can be both for good and for bad.
And then on the game theory side,
it's often the case that the bad actors out-compete the good actors,
that fear out-competes the positive in an attention economy
because the positive isn't viral.
It's not sensational.
It doesn't arouse your whole nervous system,
but negative emotions spread faster and activate people's emotions more intensely.
So how do we deal with this?
This is really the kind of challenge of, you know, looking at the curvature and geometry of human emotions and saying, how do we actually try to have the better angels of our nature win? And that takes a real conscious design process.
Your undivided attention is produced by the Center for Humane Technology.
Our executive producer is Dan Kedmi and our associate producer is Natalie Jones.
Noor Al-Samarrai helped with the fact-checking.
Original music and sound design by Ryan and Hays Holladay.
And a special thanks to the whole Center for Humane Technology team for making this podcast possible.
A very special thanks to the generous lead supporters of our work at the Center for Humane Technology,
including the Omidyar Network, the Gerald Schwartz and Heather Reisman Foundation,
the Patrick J. McGovern Foundation, Evolve Foundation, Craig Newmark Philanthropies,
and Knight Foundation, among many others. Huge thanks from all of us.