Making Sense with Sam Harris - #378 — Digital Delusions
Episode Date: August 2, 2024
Sam Harris speaks with Renée DiResta about the state of our information landscape. They discuss the difference between influence and propaganda, shifts in communication technology, influencers and closed communities, the asymmetry of passion online and the illusion of consensus, the unwillingness to criticize one's own side, audience capture, what we should have learned from the Covid pandemic, what is unique about vaccines, Renée's work at the Stanford Internet Observatory, her experience of being smeared by Michael Shellenberger and Matt Taibbi, Elon Musk and the Twitter files, the false analogy of social media as a digital public square, the imagined "censorship-industrial complex," the 2024 presidential election, and other topics. If the Making Sense podcast logo in your player is BLACK, you can SUBSCRIBE to gain access to all full-length episodes at samharris.org/subscribe. Learning how to train your mind is the single greatest investment you can make in life. That's why Sam Harris created the Waking Up app. From rational mindfulness practice to lessons on some of life's most important topics, join Sam as he demystifies the practice of meditation and explores the theory behind it.
Transcript
Welcome to the Making Sense Podcast.
This is Sam Harris.
Just a note to say that if you're hearing this, you're not currently on our subscriber feed,
and will only be hearing the first part of this conversation.
In order to access full episodes of the Making Sense Podcast,
you'll need to subscribe at samharris.org.
There you'll also find our scholarship
program, where we offer free accounts to anyone who can't afford one. We don't run ads on the
podcast, and therefore it's made possible entirely through the support of our subscribers.
So if you enjoy what we're doing here, please consider becoming one.
Well, did you see Trump's appearance at the National Association of Black Journalists?
That was spectacular. As you probably know, it went off the rails at the first question,
which in Trump's defense, it was a very hard-hitting question. I didn't catch the journalist's name.
She was from ABC News, but she was great. And Trump performed like a robot that had had its racial software updated somewhere around 1972,
about the time that Archie Bunker was the most famous character on television.
His cluelessness and probably actual racism was just leaking out of his pores in that context,
and it was fascinating to watch.
I would point out, however, that the man spoke with reasonable fluidity,
despite the absolutely bizarre hand gestures. He was a very different near-octogenarian than President Biden. We should be very grateful that Biden is no longer in the
race. And from what I've seen, Vice President Harris has responded well. I'll remind you,
this was an event where, in front of, I assume, an exclusively black audience,
Trump questioned whether the vice president was actually black,
and in fact claimed that she had only just turned black,
having previously identified as an Indian her entire life.
Of course, none of that's true, but true or not, it was an amazing thing to allege in that context.
Anyway, it seems like Harris has responded well to this by just letting surrogates respond. She
has just laughed it off, which I think is the right move. I published a piece on Substack
yesterday talking about how I think Harris should pivot to the center. I really do think this is
necessary. She's just trailing so much video and audio where she, in the 2020 campaign, played
connect the dots with bits of woke sanctimony and delusion. She has to perform an exorcism
on that stuff. If in an interview or debate she gets led back onto that terrain
and is asked about defunding the police,
or the new gender identity law in California,
or what she thinks about the epidemic of teenage girls
who apparently want double mastectomies
so that they can transition,
unless she can show that she has her head screwed on straight
amid those kinds of topics, there is just a nuclear bomb waiting to detonate for her at the
center of democratic politics. And I just don't think she's going to be able to ignore it. It'd
be great if she could just talk about Trump's corruption and reproductive rights and gun control and uniting the country.
But unless she finds a sane and honest-seeming path through the minefield that was patiently laid
by progressive fanatics on the far left of the Democratic Party, it is just a disaster waiting to happen. So anyway, in this
piece on Substack, I argue that it would be very easy to pivot here, and there's not much to explain,
right? It does not need to seem like hypocrisy. And I even scripted how I think she could do that,
for better or worse. Okay, and now for today's podcast. Today I'm speaking with Renee DiResta.
Renee was the technical research manager at the Stanford Internet Observatory,
and she generally focuses on the abuse of information technologies. Her work examines
rumors and propaganda, and she's analyzed geopolitical campaigns created by foreign powers, such as Russia, China, and Iran. She worries about voting-related rumors and the integrity of our
elections, health misinformation, and conspiracy theories. And she has been widely published in
The Atlantic, Wired, Foreign Affairs, The New York Times, The Washington Post, Politico, and elsewhere. And she also has a new book, Invisible Rulers: The People Who Turn Lies Into Reality. And we talk about the book.
We discuss the state of our information landscape, the difference between influence and propaganda,
shifts in communication technology, influencers and closed communities, the asymmetry of passion
we see online and the illusion of consensus, the troublesome unwillingness to criticize one's own
side, audience capture, what we should have learned from the COVID pandemic, what is unique
about vaccines, Renee's work at the Stanford Internet Observatory, her experience of being
smeared by Michael Shellenberger and Matt Taibbi, Elon and the Twitter files, the false analogy of
social media as a digital public square, the imagined censorship industrial complex, the 2024
presidential election, and other topics. And now I bring you Renee DiResta.
I am here with Renee DiResta. Renee, thanks for joining me again.
Thanks for having me.
So you've written a new book, which speaks to the insanity of the moment, which is obviously quite important. That book is Invisible Rulers: The People Who Turn Lies Into Reality. There are more than a few of these people,
but we'll discuss a few of them. And maybe you've been on the podcast before and we've
touched some of these topics. I think the most recent was about a year, year and a half ago, something like that.
Yep.
But maybe remind people of your areas of focus that have culminated in this book.
Yeah, I study influence and propaganda. And for about five years, up until June of this year,
I was the technical research manager at Stanford Internet Observatory,
where we study adversarial abuse online.
Right, right. And we'll talk about the controversy around the Internet Observatory. But let's just focus on the big picture to start here. One thing we think about and talk about under various guises
now is propaganda. But propaganda is a bad word for a certain kind of influence and persuasion. Obviously, there are benign and even beneficial forms of influence and persuasion. How do you differentiate the problem from the unavoidable and even good variants of just the spread of information online?
Yeah, I think propaganda didn't use to be pejorative, right?
So prior to the 1940s, it was just a word that meant the kind of desire to propagate information or the need, rather, the need to propagate information.
It comes from a word used by the Catholic Church.
After World War II, it becomes that kind of bad information that those people over there do, right?
So it becomes the sort of information that your adversary puts out into the ether to manipulate people. And it becomes,
you know, it takes on that particular tone. So I think roughly speaking, you could define it as
the systematic and deliberate effort to influence the attitudes and beliefs and behaviors of a
target audience, but in a way that often involves biased or misleading info to promote a particular agenda. So information with an agenda, and oftentimes that agenda is
unclear. So a lot of the time it's differentiated from persuasion in that persuasion is seen as
making an emotional appeal, but doing it more ethically. Persuasion kind of respects the
autonomy of the audience. It doesn't necessarily aim to manipulate them. It isn't using fakery. It isn't selectively leaving out significant
pieces of information. It's always been, I think, a fuzzy term and one that people
kind of quibble around. So in the book, I really tried to differentiate it in part by this very active systemic effort to shape public opinion as opposed to something that is more organic.
And how have our online lives and the various platforms and tools changed this problem?
I mean, in your book, you go back to various cases, you know, a hundred years ago and beyond.
People have drawn analogies to the printing press, and you talk about the alarming case of Father Coughlin, which is now almost a hundred years old.
What has changed for us?
So in any new media ecosystem, anytime there's a shift in technology, communication technology, you have new means by which you can reach large audiences.
Social media was pretty distinct in that you could reach very, very targeted audiences.
So it really enabled propaganda to go from something that was seen as a function of mass media, right, manipulating large numbers of people or creating a national narrative to something that became very niche.
So I think that's particularly different.
You could point back to maybe the era of the printing press and the pamphleteering wars where there were niches then,
but there was a significant trend over time towards the mass, the mass media, the mass narrative, and now we've reverted again to the niche.
Just to focus on that for a second. So what is the significance of it being a niche? Is that one, your targeting can be that much more effective and bespoke, but there's also this
feature that the rest of the world can't really see what the niche is seeing, right? Right, exactly. So the messages, the memes, the things
that resonate really come up not only in media, but in closed groups. So a lot of the time,
one of the things I talk about in the book is that there's always been this perception that
media messages reach the public, and that is how public opinion is shaped. And
that's actually not really true, right? For a very, very long time, since the 1940s, we've had
these research studies that show that the media reaches certain groups of people and then those
people, they're called opinion leaders, really sort of socialize and communicate with people who
are like them in trusted communities. And that's
how opinion is formed. So it's sort of moderated through these opinion leaders. This is called the
two-step flow theory of communication, right? And sort of communication theory. And so there's that
piece where it's not just that you've seen the thing somewhere, it's that your community accepts
it. Your community is talking about it. You are talking
about it with them. So the interesting piece about social media niches is that you have both the
fragmentation of media, but you also have these closed communities that are global yet simultaneously
niche, right? So there are people from all over the world in them. It's not people you know in real life.
You are brought together because you share some sort of alignment. And then in
those closed communities, you talk a lot about the media that you see in your ecosystem. So,
you know, almost like a double niche, if you will, right? You have the niche communicators,
the niche influencers, the niche content producers, and then you also have out of the field
of view or out of the broader, you know, zeitgeist, if you will, you're kind of discussing the things
that you see in the niche within the niche.
And so that structure of social media
is interesting both from a content perspective,
but also from like how we form opinions
and who we talk about things with.
And then there's this effect where the niche
suddenly becomes a perceived majority,
even when in fact it remains a minority.
You have this,
this is something you discuss in the book in various places, this asymmetry of passion,
which masquerades as consensus. And maybe you can say more about that.
Yeah. So one of the things that, I guess, another thing that's fundamentally different now
is that you have a very participatory environment, right? So
we all can go out there and shape public opinion through our own posts and our own
means of contribution. And one of the things that you see, and it really starts, it starts to become
very visible in 2015 on Twitter in particular, is that small groups of people who become very,
very activated all decide, okay,
on this day, at this time, we're all going to talk about this thing, right? It's very,
very coordinated in the early days. There were even apps that were made to help you do this.
There was one called Thunderclap where you could register your social media accounts and one kind
of activist organization could essentially co-opt your social media account to send out the same
message at the same time in
hopes of triggering a trending algorithm, which makes it look like a whole lot of people are
talking about something all at the same time. That's not necessarily actually a large number
of people. It's just a large number of accounts. And one person can control thousands of accounts
potentially. So you have this interesting phenomenon that happens where the perception of a majority opinion or a significant area of interest also becomes something that is a function of, you know, algorithmic curation, surfacing content to people that makes it look like a much bigger deal than it actually is.
I guess there are variants of that where it doesn't have to be enabled by any gaming of the technology. It's just the fact that the loudest people online are the voices that you hear,
especially when they're unpleasant and they use kind of trolling tactics, so that they not merely become more salient,
but actually just actively silence and block the participation of
more moderate voices. So you just get this sense that everyone agrees. I mean, I would have to be
a moral monster to even have a divergent opinion on this topic because the opinions I'm seeing
online are so strident and the reputational damage done to anyone who traduces any part of this new orthodoxy is so complete, it's just you completely lose sight of the fact that most people don't actually think that way.
Right, and that's part of the long history of, you know, demonizing your opponents in propaganda, right? This is, you know, anytime you see a conflict, of course, the, you know, referencing World War II, as we did earlier, those people over there, right, the Nazis, the, you know, pick your kind of adversary. There is that phenomenon of demonization that becomes very effective,
you know, sometimes justified, sometimes not. And what you see, though, on social platforms is when
you go through that demonization process, you can essentially push people out of the conversation
entirely. Because as you note, you can make it seem like it's a social liability to have an opinion that aligns with something that is,
you know, that belongs to them over there. And so you see that the group gradually becomes more and
more insular, more and more strident, and people who are seen as deviating in some way are either
sort of, you know, they self-censor or they are pushed out. And so you do see those
groups become more homogenous. You see them become often more combative. And that, you know, that
comes to create the phenomenon that we see today of like social media as the gladiatorial arena,
right? It's not the public square at all. You're not there to debate and dialogue and be in
communion with your neighbors. You're there to, you know, own and destroy your enemies. And you're going to do that, you know, using some very particular
types of tactics to intimidate them out of the conversation or harass them out of the conversation
or, you know, or just create an environment where nobody in their right mind wants to participate
when the cost of participation is, you know, is what you see in front of you.
Hmm. Yeah. One other aspect of this which drives me fairly crazy is
the unwillingness to criticize one's own side, right? I mean, that is just a clear filtering
function which increasingly radicalizes the various sides of any issue. And it is, you know,
among perhaps a few other variables, in my mind, it is the thing that more or less guarantees a
race to the bottom. I mean, it's just you get less and less honest and more and more strident,
and you're willing to defame the other side by any means necessary. And it's just very quickly,
no one is having an honest conversation about real facts.
Right. And I think what you see there is there's actually incentives that drive people
to do that at this point. And one of the things I tried to do with the book was, you know, was talk
about it in terms of like, what are the incentives that lead us to this place? Like why? There's,
of course, there's social psychology, there's crowd psychology, there's human nature reasons
why this happens. And I try to go into that. You know, Cass Sunstein's had a
whole body of literature on how crowds trend towards the most extreme over time. But one of
the things that happens on social media is the most extreme voices are rewarded, right? And
they're rewarded in terms of clout within the group, which has always been the case. That's
the sort of social component. But there's also an interesting financial component that's also very,
very different today, which is not only is propaganda participatory, but it can be quite profitable.
And so you have that phenomenon where the influencer has to appeal to the niche and there's a finite amount of money that's going to be spent in the form of, you know, patronage or Substack subs or, you know, attention, various forms of attention.
And, you know, in terms of Twitter, it's rev share,
you know, depending if your tweet is seen by a lot of people, there's that potential for rev share.
If you're on YouTube, it's who gets the sponsorships, right? So that accruing some
attention becomes like a flywheel effect for getting more and becoming, you know,
developing more profit from it as well. And so when you're catering to a niche,
this is the phenomenon you see with audience capture, where the influencer becomes, you know, it sort of feeds the crowd, right? The crowd gets
more extreme, so does the influencer. And that phenomenon is happening in tandem. And so it does
become very much a race to the bottom, in part, you know, motivated by the ideological and social
reasons. But also there's a, you know, a profit component to it that I think is important to
highlight because it's really unique and different in this particular media environment.
The idea, the figure of the influencer really comes out of social media in a very distinct way.
Yeah, this is a phrase which I think Eric Weinstein coined, which many of us use now, the phenomenon of audience capture, where a creator or influencer
begins to notice signals in their audience that they cater to, and there's this effect where,
I mean, we will probably talk about some of them, you see people who've
just grown radicalized by the act of pandering to their own audience.
Well, I think one thing that happens is if you are not doing that pandering, somebody else will
step in and do it for you, right? And so if you are not selling those subs or, you know, doing those,
you know, capturing that attention in that moment, somebody is going to be there to do it. And so
you are going to lose in a sense. And there's, you know, kind of ego
components to that. There's financial components to that. You know, you do see a pretty short
lifespan for people who become influential in only one thing. And then when that thing ceases
to become the thing of the moment, they have to find ways to make themselves relevant.
We saw this with, you know, people who became highly influential during
COVID as COVID has waned, right? As COVID is not the kind of all-encompassing attention grabber
that it was in 2020 or even 2021, you see them kind of pivoting and going off into other adjacent
kind of culture war-y areas where they can continue to monetize their audience, engage with
their audience and remain relevant to the public conversation. Yeah, well, I want to get to your
personal story and just, you know, all the drama that has been kicked off in your life of late.
Perhaps COVID is the bridge, and we can talk about COVID and perhaps what you've learned from
dealing with the anti-vax community long before COVID,
because that's really where you got inducted into this network phenomenon.
As a student of computer science, you had the tools to respond to what you were seeing there,
but the first part of conspiratorial culture you got blindsided by was just as a parent,
dealing with vaccine requirements or lack thereof in
your kid's school. But to take COVID for a second, what do you think happened during COVID? What did
we get wrong? I mean, when the history of that moment is written by sober people who have not
been deranged by their own bad incentives, what do you think the verdict will be of what happened
to us and what should we have learned? What should we not do again when you're talking about the
attempt to influence societies in good ways in response to a moving target of a global pandemic
that we did not understand in the beginning and
did not understand in the middle, but understood differently and, you know, understood differently
again in the end. And, you know, the evolving scientific conversation was at every point
entangled with the political needs of the moment and just a wilderness filled with bad actors and grifters and, you know, actual lunatics,
what happened and what should we have learned? You know, it's such a complicated question.
I'm trying to think of how to break it down. I think first, for me, the very first
inkling we had that there was going to be a very, very serious outbreak of something was in December of 2019 or so.
And I was paying attention to it, actually, because I was looking at state media narratives at the time.
I was doing a bunch of work on Chinese propaganda and Chinese influence operations over the years, and they began talking about it and their state media began focusing very heavily on their response, right?
The incredible propaganda operation that began out of Chinese state media about their response.
And then interestingly, ways in which they had these sort of online influencers,
they're sometimes called the wolf warriors, these Twitter accounts that began to reach for
conspiracy extremely early on, right? Well, yes, people are saying that this came from China,
but what if the U.S. military brought it
during the World Military Games in Wuhan
a couple months earlier?
Some people on the internet are saying
that this is actually a bioweapon
that came out of Fort Detrick, right?
And so I thought, okay, this is gonna be a thing, right?
This is gonna be a narrative battle.
And this was before it, you know,
before it reached American shores
and before it became politicized
through the unique lens of the American culture war, what was very interesting to me was that
the anti-vaxxers were on it, right? They were on it. They were like, this is, you know, they
started making videos. This is fantastic for us because if it does come to the U.S., they're going
to rush out a vaccine and then people are going to see how shoddy vaccine science really is, right?
They're going to come and we're going to convert
them. And they really saw this as an incredible opportunity because they also didn't believe that
it was real, right? So it was simultaneously not real, but also a thing that was real that
might get a vaccine and then the world would realize how corrupt vaccines were. One of the
things you notice is, you know, like in quantum mechanics, right, two completely conflicting
states can be true simultaneously until you have the observation. I think about that a lot when I
watch kind of conspiratorial narratives evolve. But what happens with COVID is you have the
anti-vaxxers and the people who are well-networked and very well, you know, kind of well-connected
early on that are preparing to mount a response before it even gets here. And then you have,
meanwhile, the public health
authorities who I talk about in the book, my dealings with them back in 2015, 2016,
during some of the measles outbreaks, they do not understand modern communication.
There's this phrase that I've never forgotten, these are just some people online, right?
And that was something that, it sounds very patronizing, and it is very patronizing, of course. But what they meant by it was that, yes, there were anti-vaxxers. Yes, they had large followings. But ultimately, people would vaccinate because they trusted their doctors. They trusted their experts. And, you know, in the toss-up between experts and just some people online, they thought the experts would continue to win. And I did not. And I thought,
okay, somebody at some point is going to be responsible for modernizing communication
within this institution or any other institution. And that turned out not to be the case because
there was nothing that was really urgent, right, that really would galvanize them into recognizing
what they had to do until all of a sudden it was in front of them and live and they could not cope.
And one of the ways that you saw this play out very early on was in the conversation about masks, where you had influential people with large followings on social media.
Balaji Srinivasan comes to mind, saying, hey, this is a big deal. People should be wearing masks. People shouldn't be traveling, even as you don't see the health institutions kind of coming down on that
side. They're reticent. They're not communicating. So you have a situation where there is an
information void and it's being filled by people who, in this particular case, turn out, you know,
to be, we thought, correct. Now the anti-maskers are arguing that they were never correct.
But you see this incomplete information and nobody knows what is true. But in the meantime,
the health officials are not speaking. The other people are. And so when they finally do come out and say, yes, you should be wearing masks, they appear to be leading from behind. So they take
kind of a credibility hit. And one of the things that you see is scientists who are waiting until they're absolutely sure of something to come out with commentary.
Even as the conversation is moving, the public is forming opinions, the influencers are driving the narratives, and the health officials are still very much sitting on the sidelines.
So that's one phenomenon.
But then the other thing that you see is it quickly becomes politicized, right?
This is an election year after all. But it's also, you know, the anti-vaccine movement did move from being kind of crunchy lefty crazies in the, you know, Jenny McCarthy, greener vaccines era to being much more of the, you know, the sort of right-wing conspiracy theory world. That shift starts to happen around 2017. And so it becomes an identity. And once it becomes an identity, you have influencers who politicize
the vaccine, who politicize the treatments, and everything becomes adversarial. You have to be
communicating about how evil and terrible the other side is. And that becomes the sort of
dominant means of engaging. There is always some form of dominant means of engaging. There is always
some form of aggrievement. There is always some complaint. And so you have both real institutional
problems, real institutional shortfalls, and then engineered and exacerbated anger and aggrievement
at institutions because it is profitable and provides attention to the people who become
the contrarians who are offering an alternative point of view. They begin to develop large
followings and then they double down by constantly implying that anything that comes out of an
institutional communication is somehow compromised. And moreover, any effort to surface good information
and downrank bad information
is some sort of horrific act of censorship. So that becomes part of the discourse around that
time as well. Yeah, that's just a larger point we may come back to, but perhaps we should just
touch it briefly now. We're going to talk a lot about the reaction to perceived acts of censorship. But one almost never hears the people who are most exercised about this issue entertain the question, should the government or should
institutions try to amplify any message preferentially under any conditions, right?
Here you have the condition of a
global pandemic in which it is believed that millions will die very likely if we don't get
our act together, or rather that many more millions will die unnecessarily than otherwise would if we
don't get our act together. And, you know, the tacit assumption of all of these people for whom
the Twitter files is the biggest story of the decade is that any attempt to steer
the conversation, any attempt to flag misinformation, any attempt to amplify genuine signal and deprecate actual noise is sinister, right? No one can be
trusted to do that. And that is, I mean, I think if you take 30 seconds to think about that,
that as an algorithm for dealing with any possible global emergency, that is just about
the stupidest thing anyone has ever thought, right? So then what are we arguing about? We're arguing about specific cases of influence and whether or not they were ethically deployed, whether they in fact were, you know, based on facts or misunderstandings. But it's a little bit like the claim to free speech absolutism online of which, you know, no one with a straight face can actually defend it when you look at the fact that, you know, even a place like 4chan has to have a moderation policy.
Right. Alex Jones, yes, has one of the best, yeah.
Yeah, you point that out in your book. InfoWars has, I think you quote their terms of service,
which are, you know, as seemingly normal as any other terms of service.
Right. Well, I mean, it's, you know, there has to be some kind of, you know, guardrails. And one of the ways that that manifests, sometimes it's about, you know, harassment and that sort of thing. One of the things that was interesting about COVID was the rapidly evolving policies. And this is where you do see the platforms recognizing that, hey, this stuff is all going to be going viral on our site,
and we're going to have to think about that. And it is treated in some ways, I think, by people
who hadn't been following the conversation. It's treated as like a novel thing that just emerges
with COVID, but it's actually not. And one of the reasons why in the book I try to draw the
through line is that there had been, for example, a measles outbreak in Samoa, and there had been a measles outbreak in Brooklyn. The one in Samoa killed
about 85 kids. The one in Brooklyn hospitalized several hundred kids. And what you see is the
platforms beginning to come up with policies for trying to up-level good health information
very early on. It's not something that comes up during COVID. They build on the policies that they've pulled together for these other outbreaks and
these other situations. And what they try to do is they try to amplify good information. And one
of the things that's interesting about that, and I talk about in the book, having conversations
with them about this, this was where the freedom of speech, not freedom of reach kind of model of
thinking comes into play, which is, you know, you allow these groups to stay on
the platform. They're not taken down. But what you see the platforms do around these other outbreaks
is they stop accepting ad dollars, right? They stop putting anti-vaccine targeting in their
sort of list of targetable criteria. They no longer proactively recommend these groups.
That was something that happened to me in, you know, 2015. I'm sorry, 2014. I had had a baby and all of a sudden these anti-vaccine groups were being
recommended to me, not because I searched for them, but because they were being proactively
pushed. And so you see the platforms begin to think about, hey, ethically, what should we
proactively amplify? Maybe this is not something we have to proactively amplify. If people want
to search and go find it, they can, but we're not going to proactively boost it. So these are the sorts
of frameworks and questions they've already been asking for four or five years prior to when COVID
happens. But one thing that they constantly emphasize in conversations with, you know,
researchers like me is, unfortunately, the institutional health authorities produce
really boring content.
Nobody is picking it up and sharing it, right?
Nobody is like, hey, this is a really great, you know, PDF by the CDC.
Let me go and boost it.
That's not happening.
So what you see is physicians groups who know this, right?
Like normal people on the Internet who are extremely online know that nobody is sharing that stuff. And people who are engaging with patients all day long actually begin to also say, hey,
I'm a doctor. I have something of a following, not very big, but I understand what's happening.
How can I get my, you know, my experience as a frontline worker during COVID kind of out there into the world so people understand what's happening? Or when the vaccine rolls out,
how can I explain, like, how can I contextualize a side effect? So what you start to see is people
who have never worked to be influential on the internet, they unfortunately don't have very big
followings, all of a sudden realizing that the institutional communicators are not doing that
great a job. The government is putting out messages, but the government is distrusted by,
you know, half the people at any given point in time in the U.S. these days. So can they try to counterspeak? Can they try to put out content? And they do, and they are, but they're not getting any lift. They're not getting any amplification. So this becomes a question of how should platforms up-level good content and good speakers and accurate information to the, you know, best possible extent that we understand what's accurate at a given point in time. And it becomes very much
a, you know, they make mistakes as they're doing it. You do see policies come into play, like the
decision to throttle the lab leak hypothesis, right? Which is a weird one. It's kind of an
outlier if you look at all the other policies that they pulled together, because most of the others relate to some form of harm, right? A false cure can actually hurt you. Misinformation about a vaccine that increases hesitancy can actually hurt you. But the lab leak, like that one sort of a, you know, like it's just not a...
No, this just fell into the woke bucket of, you know, this is quasi-racist, right?
Yes, that was how it was justified. But even so, it was one of these things where, you know, it was a perplexing choice. And unfortunately,
then it became a cudgel to sort of undermine or to complain about every policy that they
tried to implement, many of which did have, as you know, like very real reasons for existing.
Yeah, yeah.
Is there something unique in your mind about vaccines here?
Because I mean, I just noticed that there's,
I mean, maybe there are, you know,
activist groups around other categories of medical error, or perceived medical error,
that I'm just not aware of.
But I don't see people get,
I mean, this really does have a cultic, you know, slash
religious quality to it.
This level of activism, advocacy, and immunity to any kind of, you know, fact-based correction.
And so, like, I mean, to take a, like a personal example, like, you know, like I think I've
said this before on the podcast, I tried
to take statins at one point and got side effects and then discovered that something
like 5% of people just can't take statins because they get muscle aches and they can
get torn Achilles tendons.
And statins are great for millions and millions of people, but they really suck for about
5% of people who try them.
And I'm in that group.
But so having had that experience, it would never have occurred to me to have become an anti-statin activist, right?
Or to find a group that could kind of ramify my personal concerns about statins.
You know, was I harmed?
And, you know, was this experience, you know, even if I had a torn Achilles tendon, which I happily didn't,
it would never
have occurred to me to organize my life around the dangers of statins based on my personal
statin injury. It occurs to me now I have another example of this. I had some hearing loss that just
came out of nowhere about 20 years ago. There are all kinds of drugs that are ototoxic, right? I mean, perhaps I took a course of antibiotics or, you know, some other drug that destroyed, you know, some of the hair cells in my cochlea, right?
I don't know.
But just again, it would never have occurred to me to then go on a kind of quasi-spiritual quest to figure out the connection here. And yet vaccines, and I guess I'm in the
process of starting to answer my own question here, that I guess because they relate to,
you know, something we're putting into the bodies of healthy children. But even there,
there are all kinds of interventions and drugs that children get exposed to that I just don't
think draw this kind of energy. I mean, do you have thoughts about this?
So it is, I think, you know, kids is a huge piece of it, right? You know, everybody, particularly after
you've had a baby, you know, you have a, first of all, you're deluged with information about how
you should take care of the baby, how you should deliver the baby. You know, the anti-epidural
crew is very much, you want to talk about people who make it their life's work to scream at you on the internet.
Talk to anybody who's been in a, you know, mom board and is debating whether to have medication when they deliver or not.
But the thing that's interesting is that anti-vaccine narratives are very old.
They go back to the 1800s, you know, to the advent of variolation for smallpox, right, which is seen as something akin to witchcraft. And then there's the idea that you should be compelled to do something for the public good. At various
moments in history that has been seen as, you know, something pro-social that we do to help
our neighbor versus something that, you know, authoritarian tyrants demand of us.
There are some actually terrible stats coming out now about how Republicans feel about school vaccines and just the precipitous decline in support among people who have the strongest degree of support for Donald Trump.
So it's very heavily correlated to that.
And the stats are just beginning to come out now as it's become part of political identity.
But one of the things that happens, and I call it asymmetry of passion in the book, is that you have people who sincerely believe that vaccines cause autism. And, you know, it is something that parents are very afraid of. So a lot of the narratives around vaccines connect it to autism, or to SIDS, which is the other big one, sudden infant death syndrome.
And so it creates a degree of hesitancy because these are not small side effects. These are life-altering, you know, potentially fatal,
in the case of SIDS, risks that the anti-vaccine propagandists are telling you you are taking
when you don't need to. The argument is your baby is healthy. Why would you do this?
So the cost of, you know, the cost is what makes people very afraid, I think.
And you have most people who go and vaccinate, and nothing happens. And they
don't talk about the many, many, many positive experiences or the neutral experiences.
How fun it is not to get measles, yeah.
Right, exactly. And so what you hear instead is only the people who either do have a legitimate
documented vaccine reaction,
and that is, you know, you have a higher chance of being struck by lightning, or things that they
attribute to a vaccine themselves, all evidence to the contrary, like autism. And that narrative,
even though it's been, you know, debunked over and over and over again, people have to trust
the debunkers, which means they
have to trust the scientists or trust the health authorities. And as distrust in government and
health authorities has increased, you're going to see, and we're already seeing a rise of distrust
in childhood immunizations as well, that's not rooted so much in the actual facts and evidence,
but just in what is the information you're seeing and who have you decided is a trustworthy source. Yeah, there's an asymmetry
here between committing some harm and that triggering, you know, loss aversion or harm
aversion and balancing that against this hypothetical risk that one is avoiding,
but one will never really be confirmed to have avoided it, right? So the
idea that you could do your child some lasting harm or even kill them based on an effort to
avoid something that is in fact abstract, you know, that's just the worst possible emotional optics.
All right, so we've kind of set the context for your own personal adventure or misadventure here. What's the right place to enter here? I mean, actually one, so the last time you
were on the podcast, you were on with Bari Weiss and Michael Shellenberger. Bari has kind of
dropped out of the, you know, this conversation and this controversy. So I don't know that we need to bring her into it.
But Michael, in the aftermath of that podcast, I just stumbled upon an amazing clip of Shellenberger
talking to Brett Weinstein about your appearance on my podcast and your appearance on the Joe Rogan
podcast and how nefarious all that appeared. It was very, very strange that you had
appeared on my podcast next to Shellenberger. And it was even more deeply strange that you had
somehow gotten onto Joe Rogan's podcast. And all of this was quite amusing from my point of view,
because I know exactly how you got onto both of those podcasts, because one of the podcasts was
mine and I just happened to invite you and Shellenberger because I wanted the two of you to have a
conversation along with Bari.
And I also happen to know that you got on Rogan's podcast because I texted him and said
he should invite you on the podcast.
So I was the evil nexus here, but they were spinning up a conspiracy about you, that you were kind of a deep state CIA plant who had been weaponized to mislead the masses about many things, with government censorship and free speech, you know, perhaps first among them. And then they, along with a few other people, I guess Matt Taibbi is
prominent here, in unveiling the treasure trove of the Twitter files for the world to see,
really went after you by name. And this, you feel free to pick up the story here,
but this has really been extraordinarily disruptive to your
life. I mean, you and I haven't talked about this, but I mean, just viewing it from the outside,
it certainly seems to have been. So tell us what happened here.
Yeah, I think about it as, you know, it's opportunism and a fairly classic smear campaign,
right? One thing that's interesting about my job is I've seen this happen to so many people that when it happened to me, it was neither surprising nor novel. It was more, and we
can talk about this, the frustration of how to respond to it because I had opinions and Stanford
had opinions and they were not aligned. But now that you mentioned Stanford, I mentioned it earlier,
but perhaps you should just say what you were doing at Stanford, what the Internet Observatory
was. Yeah. So Stanford Internet Observatory,
I joined in 2019. I was the first research director. And we were a center for the study of adversarial abuse online. And that took several different types of research. That was
trust and safety research, where the adversarial harms were looking at things like spam and scams,
pro-anorexia content, brigading,
harassment, you know, bad experiences that people have on the internet, the sort of human nature side
of it. There's a phrase that trust and safety employees sometimes use: the problem with social
media is people. So it looks at, you know, how online conflict happens and things like that.
But another area of work that we did looked at propaganda and influence and state actor
disinformation campaigns and things like this. So I did a lot of work on Russia and China and Iran. Sometimes the U.S. Pentagon,
right, was the adversary running influence operations. And I did a bunch of work studying
those dynamics, including, and this is, I think, what put me on Taibbi's radar in the context of
Russia, right? And so I think the first time I came on your pod actually was before I even started
at Stanford. It was because I was one of the outside researchers who did the investigation
into the Russia data set that the social media platforms turned over to the Senate Intelligence
Committee related to the Internet Research Agency's propaganda operations from 2015 to 2018.
So that included their election interference and then also the GRU, Russian military intelligence,
and the sort of hack and leak operations that they put together. And the work I did there was
very mundane, honestly. It was, how do we describe the stuff in this data set? How do we
understand the strategy and tactics of a modern Russian propaganda campaign carried out on social
media? I never said it swung the election. I never said anything about collusion, none of that stuff.
But one of the things that happens is when people want to smear you, they find ways to tie you into
existing hated concepts or groups. And so all of a sudden, they tried to turn me into this
Russiagate hoaxer is the phrase that they use, alleging that I said that this somehow swung the
election, which I had never said. But again, one of the
things that you learn very quickly is that when the allegation is made, the audience is not going
to go and dig up everything I've said or done or written over the last eight years. They're just
going to take the word of their trusted influencer. So one of the things that SIO did, going back to
SIO, is that in 2020, we ran a very big project called the Election Integrity Partnership
in conjunction with the University of Washington, this group called Graphika, and the Digital
Forensics Research Lab at the Atlantic Council, who also periodically gets tagged as being one of
these, you know, imperialist Russia hoaxers. And so the work that we did in the 2020 election just
sought to study narratives about voting. So not Hunter Biden's laptop. That was completely
out of scope for the project. We were only looking at things related to allegations about
procedures having to do with voting. So, for example, tweets that said vote on Wednesday,
not on Tuesday. So procedural interference things or narratives that attempted to delegitimize the election preemptively. And we laid out this scope
very clearly, publicly. And we had a whole public website, a rapid response, you know,
tweets, blog posts, you name it. We worked in full public view. This was not funded by the
government. And what we did was we studied the 2020 election from August until November. And
we just kind of documented and chronicled the most wild,
viral narratives, the sort of election rumors and propaganda. At the time, we used the word
misinformation a lot, too, you know, that we're spinning up about dead people voting and voting
machine fraud. And, you know, and of course, unfortunately, we started the project thinking
it was going to be a lot of state actors, right, thinking it was going to be Russia and China and
Iran. And they were in there a bit. But the person trying to undermine the 2020 election was the president of the United
States. So this turns into, you know, we are, again, this is an academic enterprise. We have
about 120 students who are working on this project. And what they're doing is they're
creating JIRA tickets. So JIRA is just a kind of online, sorry, not online, JIRA is like a tech
kind of customer service ticket queue. If you've ever filed a bug report for a, you know, app,
it's gone into a JIRA somewhere. And people just kind of trace the tickets, they just follow it
through. So we were using that technology to trace election narratives. And periodically,
we would tag social media platforms, meaning we would let them know,
hey, you've got a viral thing here.
It appears to violate your policies.
You know, you should have a look at it.
Or we would periodically tag
state and local election officials.
So for example, hey,
this narrative about Sharpie markers in Arizona
isn't going anywhere.
It's getting bigger.
It's actually probably worth a response.
We're not going to tell you how to respond,
but you know, this is a thing that is worth paying attention to.
And they, in turn, could also reach out to us. And they had the ability to reach out to
platforms directly. They didn't need us to serve as some sort of switchboard conduit to, you know,
get platforms to take down tweets that offended election officials or the Department of Homeland Security or whomever. But that was how it was reclassified by the Twitter Files boys and the right-wing media
that discovered that we had done this work two years after we did it and got mad about it then.
And this was around the time that the election fraud narratives and court cases and the Dominion lawsuits and all of
the sort of big lie, right, the effort to actually delegitimize the election had kind of fallen away.
They'd lost all their court cases. There was no evidence that anything had been stolen.
And so they turned to looking at us and they reframed the project that we had done as being something that kind of cataloged and triaged and addressed election rumors and misinformation into something that had been a vast cabal-like effort to censor conservative speech.
And what you start to see is this individual named Mike Benz, who connects with Taibbi and Shellenberger ahead of their Twitter files testimony to Jim Jordan's committee, he begins to make these crazy allegations
that we had somehow, you know, used an AI censorship Death Star.
I'm not making that up.
It's a verbatim claim to mass censor preemptively entire classes and categories of narrative.
And again, this is one of these things where if you
stop and think for 30 seconds, the man is alleging that we somehow managed to censor
the discussion about Sharpiegate and, you know, Sharpie markers in Arizona,
or the discussion about Dominion voting machines. I mean, these are things that anybody who paid
even marginal attention to the narratives that went viral on social media about the 2020 election will remember.
And that's because they were not censored at all.
They were instead actually wildly viral.
And one of the things that we did was, after the election, we added up how many tweets had gone into the sort of top 10 most viral procedural election rumors we'd looked at, or these
delegitimization claims. And the number that we arrived at was 22 million. So again, after the
election, we do this analysis, we add up a bunch of numbers and we say, okay, there are about 22
million tweets related to these most viral election rumors. And then Matt Taibbi and Michael
Shellenberger, under oath to Jim Jordan's committee, say that we censored 22 million tweets. So the actual number, and again, this is all in our report. Anybody can
go and read it. It sat on the internet for two years prior to this all happening. But what they
do instead is they reframe this as some vast cabal that has censored 22 million tweets. No evidence
is ever offered of this, mind you. There is no Twitter files dump
in which they produce the list of the 22 million tweets. Elon Musk has never made the supposed 22
million tweets available to the public, nor has Shellenberger or Taibbi produced any actual emails
in which we are requesting that this happen. But it doesn't actually matter at this point,
right? Because they managed to sort of launder this through the right-wing media apparatus, and then more importantly, through a series of congressional
hearings. And all of a sudden, the actual work that we did, just sort of studying these election
rumors becomes, you know, oh, along with the Hunter Biden laptop, they censored all of these
things. All of a sudden, we're also lumped into, you know, Hunter Biden land, which is just a
shibboleth, you know, and that just kind of conveys this idea that we
had something to do with that also, even though we never worked on it and never weighed in on it.
So again, smearing by association is a very common tactic. It's just when it becomes,
when it's not only internet lunatics that are doing this, but when it is the sitting,
gavel-holding members of the United States Congress and attorneys general that are doing it, that's when you cross a line from, okay, this is an online smear campaign and that's, you know, kind of a pain in the ass, but, you know, cost of doing business, to now the American government has been weaponized against its own citizens to go on these crazy witch hunts based on nothing more than, you know, something some yokel put out in a tweet.
Yes, I want to echo a big picture point that you make in your book somewhere, I think explicitly,
which is that what's happening here is that you have an academic institution and a group
of people, most of whom are students, I think most of whom are undergraduates even, doing a project to study
the spread of misinformation online entirely within the scope of their own free speech rights.
And the product of that effort is of much greater concern to the people we're talking about,
you know, Matt Taibbi, Michael Shellenberger, this person Mike Benz, who seems like a proper lunatic, and everyone else who's trailing in
their wake, you know, in Trumpistan. This is a much greater concern to them than the actual
attempt in plain view by the sitting president to steal an election, right? I mean, like, that's
how upside down this whole thing is.
Well, the people who, the congressmen with the gavels who subpoena us and demand our documents
are congressmen who voted not to certify the election. The attorneys general who, you know,
begin to name check our undergraduate students in depositions, right? You can imagine what the internet does with that.
These are attorneys general who joined amicus briefs
to overturn the Pennsylvania vote, right?
To fight against the Pennsylvania vote
alongside Ken Paxton in Texas, right?
So you have, and then the people
who then subsequently sue us, by the way,
because that's the third part of this,
you know, which I can't really talk about
because I'm under pending litigation now
for a little over a year at this point.
Stephen Miller and America First Legal sue us based on, you know, this series of allegations.
Again, evidence-free, baseless allegations.
But again, the people suing us are people who also supported the big lie, right?
So there is something that these entities have in common.
It's not accidental.
This is political retaliation.
And I think that that is a piece that, you know, I found it a little bit frustrating that we did not emphasize
that in our communication about what was happening. That that is like left out. That is, you know,
the motivation to be as...
If you'd like to continue listening to this conversation, you'll need to subscribe at
SamHarris.org. Once you do, you'll get access to
all full-length episodes of the Making Sense Podcast. The podcast is available to everyone
through our scholarship program, so if you can't afford a subscription, please request a free
account on the website. The Making Sense Podcast is ad-free and relies entirely on listener support,
and you can subscribe now at SamHarris.org.