On with Kara Swisher - How the Right Launders Online Propaganda with Renée DiResta
Episode Date: August 15, 2024. Renée DiResta is one of the world's leading experts on online disinformation and propaganda and the author of the new book, Invisible Rulers: The People Who Turn Lies Into Reality. About two months ago, DiResta found out her contract as the technical research manager at the Stanford Internet Observatory would not be renewed. What's more, the SIO, one of the foremost academic programs studying abuse online, would be essentially hollowed out. The university blames funding challenges, and says it has "not shut down or dismantled SIO as a result of outside pressure." However, many journalists and fellow researchers suspect that political pressure from the right, including congressional hearings led by Rep. Jim Jordan and lawsuits from people like Stephen Miller, caused Stanford to cave. Kara and Renée discuss the drama at the SIO; Invisible Rulers; the coordinated effort by the right to target academic researchers who study online propaganda and disinformation; the larger strategy to push back against content moderation by social media platforms; and the role the platforms themselves and their CEOs (looking at you, Elon) play in this fight. Questions? Comments? Email us at on@voxmedia.com or find Kara on Instagram/Threads as @karaswisher Learn more about your ad choices. Visit podcastchoices.com/adchoices
Transcript
Support for this show is brought to you by Nissan Kicks.
It's never too late to try new things,
and it's never too late to reinvent yourself.
The all-new reimagined Nissan Kicks
is the city-sized crossover vehicle
that's been completely revamped for urban adventure.
From the design and styling to the performance,
all the way to features like the Bose Personal Plus sound system,
you can get closer to everything you love about city life
in the all-new, reimagined Nissan Kicks.
Learn more at www.nissanusa.com slash 2025 dash kicks.
Available feature.
Bose is a registered trademark of the Bose Corporation.
Do you feel like your leads never lead anywhere?
And you're making content that no one sees?
And it takes forever to build a campaign?
Well, that's why we built HubSpot.
It's an AI-powered customer platform that builds campaigns for you,
tells you which leads are worth knowing,
and makes writing blogs, creating videos, and posting on social a breeze.
So now, it's easier than ever to be a marketer.
Get started at HubSpot.com slash marketers.
Hi everyone, from New York Magazine and the Vox Media Podcast Network.
This is On with Kara Swisher, and I'm Kara Swisher.
My guest today is Rene DiResta, one of the world's leading experts on online disinformation and propaganda,
and the former technical research manager at Stanford's Internet Observatory, also known as SIO.
Stanford launched the SIO in order to investigate online abuse
and the many ways that people attempt to manipulate, harass, and target others online.
But earlier this summer, the university initiated its own quiet self-destruction
after a storm of coordinated congressional inquiries from that famous clown Jim Jordan
and class action lawsuits by conservative groups,
led by people like Stephen Miller, who claimed that the
SIO and its researchers colluded illegally with the federal government to censor speech. I'm sorry
to tell you, this is nonsense. They are nonsensical, aggressive people who are trying to make trouble
by telling lies almost persistently in order to stop people from researching important issues
around online abuse. All right, alleged lies. They tell
alleged lies. The university claims that the SIO has not been dismantled or shut down as a result
of outside pressure, but rather forced to refocus due to lack of grant money. Okay, Stanford,
profile in courage. Either way, Renee was caught in the middle of the storm, and I'll obviously
ask her about what happened there. She also just released a book that showcases some of
her research. It's called Invisible Rulers, The People Who Turn Lies Into Reality. And it's about
the interplay between influencers, algorithms, and the online crowds who are just as culpable.
I've interviewed Renee plenty of times, including on this podcast, and there is so much to get to
today. We'll talk about everything that happened at the Stanford Internet Observatory, her book,
of course, and the CEOs behind the platforms that spread propaganda. People like Elon Musk, who may be
on the hook for some of the riots happening in the UK right now. As I said, there's a lot to get to.
Our expert question today comes from Chris Krebs, the former director of the Cybersecurity and
Infrastructure Security Agency, who was fired by Donald Trump via, you guessed it, Twitter, for saying the
election was not stolen. By the way, the election was not stolen. Let's get to Renee now.
Hi, Renee. Thanks for being on On. It's great to see you.
Great to see you, too.
So we've talked a lot over the many years, and we've got a lot to get to today.
So let's dive right in.
Let's start with the Stanford Internet Observatory.
You joined the organization in 2019 as a technical research manager.
This past June, you learned that your contract with the university would not be renewed.
You weren't the only one.
Other staffers were told to look for work elsewhere, despite what the university
says. The group essentially is dismantled, and I've talked to a lot of people. There had been
a coordinated campaign by Republicans that painted you and the university as part of a conspiracy to
silence conservative voices. It's ongoing. Let's first talk about the work there, because you
worked with Alex Stamos and students and everyone else. So explain what you were doing there and what happened.
Yeah, so Stanford Internet Observatory
was an interdisciplinary research organization within Stanford Cyber Policy Center, and it looked
at abuse of online information technologies. I describe it as adversarial abuse, right? So I
was looking at everything from spam and scams to child safety, which was a huge part of our work, to information integrity.
So that was where the disinformation work came into play.
A lot of like state actor influence campaigns that we studied over the years. We looked at emerging technologies, how emerging technologies change the information environment,
and then how new, you know, how kind of bad actors, right, like people who were making
CSAM and other kinds of horrible content were using new technologies to increase their, you know, their scope and scale of operations.
Right. So typical, a lot of universities were doing this. Stanford, Harvard,
whole bunches. And then there were groups that were doing it too.
Yes. But one of the things that we did was we led a couple of inter-institutional projects
alongside Kate Starbird's team at UW,
DFR Lab over at the Atlantic Council, and Graphika.
And those looked at elections, and they looked at vaccines.
And in both cases, they were looking at understanding what were the best kind of rapid responses for emerging rumors on the internet.
So during the election, that might be a rumor that dead
people were voting or that Sharpie markers were rendering your ballot useless, things like that.
So very much focused on voting procedures. And then the context of the vaccines,
rumors that were going viral around the safety of the vaccines, efficacy, conspiracy theories.
And again, the thing with a rumor is you don't actually know the truth, right? It's impossible to know in that moment. So one of the goals of the project was to try to help
other people who were able to respond and did know what was actually happening on the ground
have some indication that they should be responding when and, you know.
So essentially just tracking these conspiracy theories in action.
Yep.
And determining whether they were accurate, correct?
Determining whether they were accurate really relied on looking to see if, like, fact checks came out.
So that was where we would actually,
we never made a determination about whether or not
something was true or false.
We would say like, hey, here's this information over here.
And we would put out a rapid response
that sometimes would link and say,
these fact checkers over here say this thing is true.
We would reach out to tech platforms occasionally.
Hey, you've got this thing going viral on your platform, right? They had relationships with fact checkers. They,
in turn, could go and get a label, you know, put on content. Exactly. And those platforms were more
open before, correct? They were not terrifically open, but somewhat, sometimes. They were trying
very hard to make sure that the election was free and fair and that
bad actors, particularly state actors, were not using their platforms to manipulate the public.
So that's when you got on the radar of conspiracy theorists. So talk about what happened because
you're studying conspiracy theories and then the conspiracy theorists don't like this too much.
It actually wound up happening nearly two years after the work was done, right? So in August of
2022, we're doing work on the 2022 election. Again, we were, funny enough, Elon loved a lot
of the work at the time. He amplified the work. He boosted the work we did talking about Russian,
Chinese, and Iranian operations. Yoel Roth still worked there at the time, in fact, right? There was still the sense that he had just bought the platform and
he didn't want it to be a disaster for the 2022 midterms. So that was the environment we were
operating in. But this blog kind of comes out of nowhere. It calls itself the Foundation for
Freedom Online. Turns out it's like one guy at the time, and he claims that he's a whistleblower,
which implies he has some inside knowledge. We didn't know who the hell he was.
Turned out he worked for the State Department for like two months.
But he claimed that during his time at state that he had been the, quote unquote, head of cyber at state.
This is not true either.
That he had seen evidence that there was a vast plot to silence conservatives and that we were behind it.
Which they had been thinking for years.
I had argued with a bunch of them about,
especially on Twitter, that they got pushed down.
What was the word they used?
The shadow ban, right?
That was their thing for a while.
That was their thing in 2017, 2018.
This took it a step further and alleged
that we were the source of election rigging.
We had, in fact, swung the election by silencing
and suppressing, using an AI censorship Death Star superweapon, all of the narratives alleging
voter fraud in the 2020 election. This is like stupid on its face to anybody with half a brain
who remembers the 2020 election, but that doesn't matter because this person was briefing Jim Jordan and Dan Bishop
and he was arguing in these posts that once they had subpoena power, they should investigate us.
Sure. So this is laid out in August of 2022. And my point is, it plays into this idea that they were being manipulated and that conservative voices, whether it was Senator Josh Hawley or others, were being suppressed, correct?
Exactly.
Yeah.
So this just feeds into the grievance cycle that, you know, very long running grievance
cycle.
But of course, because it feeds into that, right wing media picks it up, right?
And it becomes a whole thing.
Marjorie Taylor Greene is weighing in on how silenced and censored she is.
And so what happens is they manage to connect the dots between the Twitter files allegations of some vast censorship regime happening within Twitter 1.0 and then the work that we did studying elections and vaccines.
And they allege that we colluded with Twitter to suppress 22 million tweets. There were two separate congressional inquiries into the Stanford Internet Observatory and several class action suits, which I think were more—the Jim Jordan thing was a circus.
It was a spectacle, really.
And the Twitter files was a nothing burger of an investigation and found that people were doing their jobs, essentially.
And so they didn't find what they were looking for.
But these class action suits by conservative groups, which is a tactic that Elon uses now,
and Peter Thiel has used it, Trump uses it, a lot of them use these things. This one was
Trump's former advisor, Stephen Miller, and I believe it names you personally.
Yeah, yeah, the family separation policy guy who now runs a lawsuit mill called America First
Legal. Yeah, this is the problem, right? I mean, there's a couple different levels of annoyance. The first is like the mobs on Twitter that decide that you're some sort of
like... You're a CIA agent, correct? Yeah, I'm a CIA agent, evil like... Explain what you did. You
worked for... For the agency? I was an intern when I was an undergrad. So you're a CIA agent. That's
their allegation. Yeah. Not only that, the CIA got me my job at Stanford. It placed me there, is the theory that they seem to have.
Yeah, you're CIA Renee. I've read about you.
Right. Yep. And so that, I mean, that just generates a lot of like personal harassment and annoyance. It is what it is. You know, you get threats on the internet, you know this.
It's when Congress picks it up and uses subpoena power to compel you to turn over your emails in response to allegations like this, right?
When they then write reports alleging that they have found, you know, some evil nexus.
This is a very effective playbook.
It's, you know, been shown to work.
And then the other piece of it is the civil litigation and the lawfare.
And so we wind up getting sued by Stephen Miller, and he kind of like grabs these plaintiffs, one of whom we just never even heard of before. And, you know, I can't even comment on it because it's still pending, right? You're just sort of under pending litigation. And that is, ironically, the chilling effect. That is the impact on free speech.
Right. And so the university did what? They didn't want you to talk a lot. I know that from talking to a lot of people there, and then suddenly not. They put the remaining staffers in a larger cyber policy center. And essentially, you described it as capitulation. Explain your perspective. They didn't shut it down, but they did, right?
Well, nobody really is there anymore.
So, no, it's not technically shut down.
But, you know, that's unfortunately the reality.
There's, I think, just a couple of people who are working on very specific projects.
They reoriented it around child safety exclusively.
And that's very, very, very important work, to be clear.
I mean, my frustration was even as they were trying to decide what to do, the child safety work in particular really
needed to continue. But I would argue, so did the election work, right? And we have a very
extremely online election happening right now, right? We have extremely unpredictable platforms.
We have new platforms. We have so many that have emerged since 2022, gotten popular.
We have all sorts of incentivized actors that want to create chaos
and spread rumors and things like this. And one thing that's interesting about it is we also have
new and emerging technologies that are shaping the political discourse. And instead of doing that
work that we had done to study rumors and narratives as they emerged, I think just a
handful of academic
institutions are doing that now. Kate Starbird's team still is, right? But the thing that got
destroyed through the set of actions was the network, right? The network of researchers,
the network of communicators. We spoke with state and local election officials because state and
local election officials are the people on the ground, independent of party, who actually understand what is happening in their districts
and who need to be communicating with their constituents. And by cutting those ties,
by making it so that any conversation is collusion and any cooperation is a cabal,
what you've essentially done, you destroy it.
Right. They make normal conversation into collusion. That's correct.
Which is not illegal either, correct? I mean, what is the allegation of illegality here from
your perspective? Well, the theory is that we took tips from DHS, the CIA, you know, they really, they changed it depending on which day
of the week it is. They told us to tell platforms to censor content, that the government decided
what tweets were going to come down and used Stanford to launder requests for millions of
tweets to come down to suppress a particular viewpoint. And the argument
is that this is a deep state effort to do that. So we were not funded by the government. The
government did not tell us to do this. But because there's no actual evidence that can back up that
specific claim, they just use a whole lot of innuendo. But the perception stays,
and that's one of the things that is very challenging to overcome, right?
Right, right. And it also creates financial hardship and time and space hardship, correct?
Yes, exactly.
And you can't speak, right?
Right, you can't speak.
So they're suppressing speech by suing you.
It's funny how that works, right?
The free speech team.
Another irony: the researcher who works to understand and expose misinformation and conspiracy
theories becomes enmeshed in a conspiracy, although that seems like the correct way for it to happen, right?
In some weird ways.
It happens for a reason, right?
Because the easiest way to address the problematic findings is to smear the individual.
This has always happened, right?
Whenever you don't like a climate science finding, you attack the scientist.
It happened during McCarthy era many times.
Right.
McCarthyism is exactly the parallel here, actually.
That, I think, has been disappointing. The tactic, the equivalent of which in McCarthyism, of course, was communism, is to allege that the work is suppressive or censorious or doing a bad thing, to allege that you
have some sort of inherent ideological bias and you're trying to subversively do something to
some other group. And then the other piece of it is the effort to make the work toxic and make the people who do the work, you know, dangerous or bad or untouchable.
And that is the entire operation.
Right, right.
There were several people they did this to even before, trying to question their patriotism, whether they're working for things, making them communists. They have very similar tools for doing this, right? You insinuate and then have no proof, and then
media has to sort of report it out, et cetera, et cetera. But the fact of the matter is you're
out of a job. Your identity is being questioned. You and your former colleagues have been buried
in lawsuits. You've gotten death threats. What are you doing now since your contract wasn't
renewed, correct?
Well, I still have projects, I mean.
Right, right.
I still have work that I want to get done. I think I had an amazing last five years because I was getting paid to do work that I really wanted to do anyway.
By Stanford, which is, you know, as you know, it's a bit frustrating to see institutions not understand the power that they hold in this moment and the need to stand up, right, to learn the lessons of McCarthyism. The people who came out of that era as heroes,
people like Arthur Miller,
are people who spoke out, right?
Who did not let themselves be intimidated
and they were sort of vindicated later on.
But, you know, it feels like a bit of a wait.
We'll be back in a minute. This is advertiser content from Zelle. When you picture an online scammer, what do you see?
For the longest time, we have these images of somebody sitting,
crouched over their computer with a hoodie on,
just kind of typing away in the middle of the night.
And honestly, that's not what it is anymore.
That's Ian Mitchell, a banker turned fraud fighter.
These days, online scams look more like crime syndicates
than individual
con artists. And they're making bank. Last year, scammers made off with more than $10 billion.
It's mind-blowing to see the kind of infrastructure that's been built to facilitate
scamming at scale. There are hundreds, if not thousands, of scam centers all around the world.
These are very savvy business
people. These are organized criminal rings. And so once we understand the magnitude of this problem,
we can protect people better. One challenge that fraud fighters like Ian face is that scam victims
sometimes feel too ashamed to discuss what happened to them. But Ian says one of our best defenses is simple.
We need to talk to each other.
We need to have those awkward conversations around
what do you do if you have text messages you don't recognize?
What do you do if you start getting asked to send information that's more sensitive?
Even my own father fell victim to a, thank goodness,
a smaller dollar scam, but he fell victim.
And we have these conversations all the time.
So we are all at risk. And we all need to work together to protect each other.
Learn more about how to protect yourself at vox.com slash Zelle. And when using digital
payment platforms, remember to only send money to people you know and trust.
Support for this show comes from Constant Contact. You know what's not easy?
Marketing. And when you're starting your small business, while you're so focused on the day-to-day,
the personnel, and the finances, marketing is the last thing on your mind. But if customers
don't know about you, the rest of it doesn't really matter. Luckily, there's Constant Contact.
Constant Contact's award-winning marketing platform can help your businesses stand out,
stay top of mind, and see big results.
Sell more, raise more, and build more genuine relationships with your audience
through a suite of digital marketing tools made to fast-track your growth.
With Constant Contact, you can get email marketing that helps you create and send the perfect email to every customer,
and create, promote, and manage your events with ease, all in one place.
Get all the automation, integration, and reporting tools that get your marketing running seamlessly,
all backed by Constant Contact's expert live customer support. Ready, set, grow. Go to constantcontact.ca and start your free trial today.
Go to constantcontact.ca for your free trial. Constantcontact.ca.
So you have written a book. You published your first book after this. It's called Invisible Rulers. Is this different than before? Because I've studied propaganda
in history for a long time, and every single, there's not an autocracy who hasn't used it,
there's not a democracy that hasn't used it, right? Some kind of conspiracy theories or
propaganda, misinformation. How do you like to describe it, and how is this different?
I describe it a lot as in those environments, it was very, very top-down.
And that's because in the mass media era, control of the ability to propagandize was held by very few, right?
And so you had to have some kind of access in order to have that degree of influence.
And now we have a really fascinating information environment where we have this figure, the influencer, right,
who has the reach of mass media. It's the kind of figure that can only exist, that is like native to the
last 20 years. We've had celebrities, we've had influential people, but this idea of mass reach
is somewhat, you know, mass reach for individuals is very new. What I wrote the book about was
actually to describe the incentives, right? So there's always a system of incentives underlying propaganda. Chomsky writes about this
in the 80s, talking about advertising, talking about creating hatred of the other, right? The
sort of filters that go into why mass media propagandizes in some states in some ways.
And he doesn't do it to say, like, you should hate mass media, you should distrust mass media. He
does it to say, your eyes should be open to the system of incentives that you understand why you're
seeing what you're seeing. And that was what I wanted to do, basically saying we have this new
environment that is, you asked what's different, inherently participatory. There's so much focus
on the algorithm, right, and the sort of infrastructure of social, and that misses
the incentives of the individuals that
come into play. The influencer is the other, you know, sort of massive figure in the space.
But then there's the power of the crowd and the crowd's ability to create a reality, right,
to shape public opinion. And it's happening in niches, right, where mass media at least tried
to do something nominally unifying, if only because you had so few options to choose from. What you start to see with social media is that when you choose the niche that you're going to serve, that relationship between the influencer and the crowd is really interesting.
Right, right. The influencer needs the crowd. The crowd is what triggers the algorithm to share the influencer's stuff. And the crowd feels together as a group.
Yeah, exactly.
So all three of these things, like this is the system.
I have long described the first time I went over to AOL, there was a group of quilters who'd never met except online.
And they made a whole quilt online.
And I thought, this is lovely.
And then I thought, this could be bad.
This could have bad people.
These are quilters.
They were lovely.
They made a beautiful AOL quilt.
But they don't have to make that.
They can make anything. And I thought, oh, this isn't maybe good.
It's a difference in how power is distributed, right? It went from top-down propaganda being
a function of the top-down to now the potential to be from the bottom up.
Or anywhere.
Right. And very distributed. And it's actually that, it's that network that is what matters.
And that's why, just to connect the dots back to the last thing we were saying, when you destroy the networks between people who are trying to mount the counter-response, you essentially cripple one side of the equation, right?
If you maintain the network and you destroy their network, then they are at a disadvantage because networked communication is the entire ballgame.
Look at the stories we're talking about with regard to, you know, J.D. Vance and his couch.
He did not fuck a couch, everybody.
He did not.
He did not, just for, you know, to be clear.
It needs to be said.
It does.
It does.
You never know.
I think it is that concept, though, of like ordinary people can make that the story of the day for, what are we now, at three weeks?
Yeah. That never would have risen to the top of any old media environment. Ever.
Ever, right? It would have been talked about, maybe, haha, but never, never ever gotten anywhere, because it wouldn't have been created in the first place, right?
Yes, exactly. This was just a guy who just made it up, um, and said he made it up. And it didn't really matter. It's still, you know, you have a vice presidential candidate making a tasteless joke about it in his acceptance
speech. One of the things you've used with propaganda, which I prefer too over misinformation,
disinformation, explain the difference of why you think that propaganda works. I keep saying just,
it's propaganda. Just like, that's just like. Yeah, this is where I've been for years.
Misinformation implies that something is factually wrong, right?
Right.
That there is a truth that we know and this is the opposite of that, right?
And that is just not what is happening on the internet.
I was misinformed, like, yeah.
Right.
And that also assumes that you want to be informed and that if somebody just goes and
puts out the fact check,
the problem is solved, right? And that was, I think, kind of a prevailing belief in the early
quaint days when we called this like fake news in 2016, right? And that's just not what we saw.
When we would look at election rumors in 2020, we framed that project originally saying
election misinformation. That was still in, you know, four years ago, made sense to us. And then
as we watched that, those moments go viral, it was always that there was going to be a truth that
was going to come out, right? You would know if the ballots in the dumpster were deliberate or
accidental or, you know, the facts of the case. But in that moment, it wasn't misinformation. It was a rumor. It was something where people didn't know the truth. And then the other thing that had been added onto it was that you had these highly partisan figures who picked up these rumors, and then they amplified them as if they were true in service to a political agenda.
And so they would say things like, big if true, right? And it was this phenomenon of taking
the way that humans communicate rumors and uncertainty, how we make sense of the world,
but then taking that uncertainty and leveraging it using innuendo for the political moment. And
that was where it becomes information with an agenda at that point. And propaganda has always
been the word we've used for that.
So I felt like we were torturing ourselves into this model of true and false.
Yeah, because misinformation is I misspoke and disinformation is someone did it on purpose as a lie.
I just like propaganda and lie.
That seems to sort of cover the thing for me.
It's pretty good.
We've had these terms for a while.
One of the things you talk about is the virality.
And it's something I've talked about for years and years and years, especially I used to say Google is fast, quick,
you know, speedy, contextual, and accurate, right? When you search for Renée DiResta,
you get Renée DiResta, right? You know what I mean? You don't get anything else. It's a utility.
And when you do it on social media, you can get just about anything. You can get Michael
Schellenberger saying lies about you or whatever. So it's a very different thing. Virality is the problem, though, when you apply it to a lot of things, because even speed and virality together create an architecture of propaganda very beautifully, especially when engagement is involved, too.
And virality is, you know, it's a function of engagement. Very literally, right?
You have to click a button in order to make that retweet happen. One of the things I wanted to talk about was the idea of the crowd as passive is wrong. It is not, we are not sitting there on
the internet passively receiving things. J.D. Vance's couch did not go viral because like some,
you know, gnomes made it go viral. It went viral because individual people
decided it was interesting, entertaining, funny, and they wanted to be part of it. And that's like
people click the button. We are exercising agency in that moment. We are not automatons.
And we're saying, okay, we're going to, in this moment, shape what our friends see,
shape how other people process information, determine what trends. And it's not necessarily a conscious decision, but virality is active. Virality is not accidental.
No, absolutely. It's different than the passive, say, watching a television or looking at a
billboard. You have a chapter in a book called, If You Make a Trend, You Make It True. Like,
anybody can make a trend, which is what's powerful about it. And then the media does take it up,
because when media reports on it, even if it's not true, just like the couch thing, which I think is, you know, I'm not a fan of J.D. Vance, but he didn't fuck a couch. Stop it. That's enough. That's enough with that. It's funny, but it's not, right? He should not be subject to the same thing as he himself does to people.
I was thinking of Saul Alinsky and his book, Rules for Radicals, in the context of, like, 1960s radical politics guidebooks. He has this thing that he says. There's two things. He says,
you know, ridicule is man's most potent weapon, right? Which is why weird and all these other
things are working. You know why they're working. That's the main reason. It's very hard. There's
no defense. But the other thing that he says is like a good tactic is one that
your people enjoy. And I think about that a lot because the norms that we've created over the last
10 years of this now is that people think that this is what political activism is, right? And
I'm saying that descriptively. I'm not even making a normative judgment here myself. I'm just saying that the left finds this appealing right now because there was this sense that you had to take the
high road for so long and that the system would somehow restore itself if the adults in the room
didn't participate. People will behave. Right. And that's not, unfortunately, where we are,
because if only one side is doing that while the other side continues to realize that trolling works for capturing attention, for shaping opinion, for galvanizing a base.
Well, you can see the Harris campaign doing it.
Yeah.
When they go low, we go high.
They're going low now.
We're going to go lower.
It's depressing on one level.
On the other level, I'm like, okay, well, now we're going to see what happens when both sides fight this way.
Is there going to be a realization that this is not actually something that we should want?
Except it does work.
It's going to work.
And actually, the Harris campaign is very funny.
Yeah.
We'll be back in a minute.
Support for this podcast comes from Anthropic.
You already know that AI is transforming the world around us,
but lost in all the enthusiasm and excitement is a really important question.
How can AI actually work for you?
And where should you even start?
Claude from Anthropic may be the answer.
Claude is a next-generation AI assistant,
built to help you work more efficiently without sacrificing safety or reliability.
Anthropic's latest model,
Claude 3.5 Sonnet,
can help you organize thoughts,
solve tricky problems,
analyze data, and more.
Whether you're brainstorming alone
or working on a team with thousands of people.
All at a price that works for just about any use case.
If you're trying to crack a problem
involving advanced reasoning,
need to distill the essence of complex images or graphs, or more, Claude can help. The leadership team founded the company with a commitment to an ethical approach that puts humanity first.
To learn more, visit anthropic.com slash Claude.
That's anthropic.com slash Claude.
And it takes forever to build a campaign? Well, that's why we built HubSpot. It's an AI-powered customer platform that builds campaigns for you, tells you which leads are worth knowing
and makes writing blogs, creating videos and posting on social a breeze. So now it's easier
than ever to be a marketer. Get started at HubSpot.com slash marketers.
in their career trajectories? And how do they find their next great idea? Invest 30 minutes in an episode today.
Subscribe wherever you get your podcasts.
Published by Capital Client Group, Inc.
Let's dive into the role of incentives of the tech companies,
because this is great for them.
Enragement equals engagement, right?
And the incentives, which I've talked about a lot.
And it's also a mess.
That's why they want to get out of politics or they want to not police propaganda on their
platforms. Talk a little bit about the benefits they get. And then you brought up earlier,
these companies also made it more difficult for you to look at what's happening, for you to see
what's going on behind the curtain at Oz. Meta, just for people that don't know, no longer supports
CrowdTangle, which it bought, and Twitter is charging for access to its API.
And I wouldn't even trust it, frankly.
Given there's a large societal pushback against quote-unquote big tech, why do they feel they can make these platforms more opaque?
And what's the incentives to either hide it or get out of it altogether, fixing it in any way?
Well, for moderation, after 2020 and then 2022, you do see the shift begin to happen because,
just to be clear, we were not the only ones subpoenaed by Jim Jordan. It was not only,
you know, I think he sent out 91 subpoenas, right, the man who didn't honor his own.
But it wasn't just us. It was tech platforms that were getting, I don't know if they were subpoenaed or just received letters.
Yeah, they did.
They did get subpoenaed, right?
Some of them did, yeah.
So they, too, had to go through this process.
And it became a, you know, careful what you do here.
You know, we see this as censorship.
We're going to create an entire perception that moderation is censorship.
We're going to link these two terms in people's heads.
And the platforms did not
push back on that. And to be clear, neither did institutions, neither did academia, right? Nobody
really did actually. And there was a bit of a, like a freeze, right? Where people didn't say
like, actually a label is not censorship. I did. I got like railed in front of Congress as a result,
but you have to decide that it is worth it to have the fight.
The other thing that begins to happen, though, is that you have the Europeans, right?
So we're talking about this very much in the context of the American election, the American
culture war.
Then the Europeans do come in and they say, you are going to do the following things.
We expect compliance with the Digital Services Act.
And one of the reasons, I think, for the end of CrowdTangle is that they're trying to launch this thing that they then see as – it's the meta content library that they see as being a way for them to comply with what the European Union wants.
Unfortunately, it's not as good, candidly, as CrowdTangle was. And similarly with Twitter: when you have researchers saying, hey, we see this type of manipulation, in the olden days, pre-2022, you would have actually had, I know for us, you would have had engagement with the platform teams directly.
There was a sense that this was going to be, you know.
You were plotting with them, Renee.
You know that you were plotting.
So there's a... I know this is reframed as a plot, but it's really like, hey,
here are these accounts. We think they're this. What do you think that is? You know,
it was really quite boring, actually. I mean, they make us sound much more exciting than we are.
But that, again, when you, those conversations aren't happening anymore. And conversations between, you know, the platforms and government
aren't happening anymore. And some people are like, that's fantastic. That's great.
Well, but in June, the Supreme Court overturned a lower court decision that sought to impose
limits on the federal government's ability to communicate with social media companies,
which was great. In essence, they okayed the Biden administration's contact with platforms
during the pandemic. It was a good ruling in that way, but it does make platforms more leery
about doing anything, correct? Even if the Supreme Court has given the government the ability to do
so. Well, and one thing that I think is actually quite reasonable as far as responses is that
you should be disclosing, for example, government takedown requests.
Like, that is a thing that is good global policy, period.
We should absolutely want those lists public.
We should want the knowledge that the government is making takedown requests clear.
We should have an understanding of why.
That is a completely reasonable policy response to this.
Most of the investigations are not being conducted with the intention of actually getting to substantive, meaningful
policy responses or legislation. That's the problem. So content moderation is expensive.
They only get into trouble moderating content. When they deplatformed Trump after January 6th,
they only got into trouble. They get the attacks, they get the lawsuits, they get all this stuff.
Is there going to be more moderation? No, less is what you're going to see, correct? Much less.
Right. That's the expectation. Again, I think it should be different and better, right? And we were
doing a bunch of work on, you know, if you think about this from the standpoint of like system
design, which is what it is, then you can answer questions like, what should a curation algorithm upvote, right? Moderation
is what you do at the end state when something has kind of failed, right? Where something has
gone south and you have to respond and react to it. But you can create better incentives earlier
on by creating design and curation. Yes, absolutely. It's an architecture problem.
It's something I've talked about a lot.
It's an architecture problem.
If it's incented to go for crazy, it's going to be crazy.
And if it's incented to be nuts, it'll be nuts.
If it's incented to be happy, it'll be happy, right?
And I think that's the difficulty is that it runs smack into their very, I would say,
juvenile ideas around free speech of what it is, right? They
have these sort of like dummies guide to free speech whenever I talk to them. But let's, before
we get to Elon and who thinks he's Mr. Free Speech, which he is Mr. Suppressed Speech as far as my
experience with him has been, and speaking of which, he's suing a group of advertisers,
the World Federation of Advertisers, for allegedly organizing a systematic illegal boycott against X. Let me just say, this is just nonsense. You
can't make advertisers advertise on your Nazi porn bar, your Nazi porn bar that you made,
that you want to do that now. You can't call yourself a free speech absolutist and sue people
for exercising their right to free speech. It's just, you aren't one. It's just like, let's start
with that. He's pulled a similar move, threatening to sue the ADL.
As I said, it's his MO, his weird version of content moderation, which is he suppresses the content of people he doesn't like.
He's doing exactly what they had long accused Twitter of doing.
And so same little thing happened to you in SIO in many ways.
So what do you tell these advertisers?
How should they defend themselves?
Because he's judge shopping by putting it in a Texas court. Yeah, that's one. I mean, I would say fight. I
mean, that's really... People need to have the fight at this point because the entire intent
is to intimidate and drive capitulation, unfortunately. It's very frustrating. I was
never a big... I'm not a lawyer. I've never been a big follower of courts or procedures or things, but I have been very interested in this idea of like,
what can be done about the forum shopping problem? I mean, I'm being sued in Louisiana.
Why? Because, you know, some plaintiff who I'd never heard of lives there, right? Simple,
simple, there it is. That's the answer. And they want it there, right. And
I actually didn't realize the extent to which that was possible. I was very naive about that. As one lawyer on Twitter noted, it's a joke. But before that happens, it's going to be two years of
procedural fighting, and it's going to bleed whatever funding and budget they have.
So the problem for a lot of these groups is that what happens next reinforces how
effective the strategy is seen to be and potentially increases the likelihood that
it will be deployed against others. Right. No, it works. It works because it's expensive. I think these people
should get money from Reid Hoffman and just go to town. That's what I, you know what I mean? Just
like fight back. I think eventually that's what will happen. But Elon is at the forefront of
scaling back content moderation efforts there. But secretaries of state in five states recently
put pressure on him to fix Grok, X's AI chatbot, saying it had spread election disinformation.
But do you think it matters at all?
Because no one's really on X, although we'll talk about what happened in Britain.
But talk a little bit about this.
Like these secretaries of state are very concerned about the disinformation and he himself is pushing it.
He took down something about concentration camps because it was a fake headline from a newspaper. It's one
thing to have conspiracy theories running around rampant on Twitter, but what's happened here in
the UK and Ireland with these riots, talk about that. I think, so I was working the last couple
of days. I'm not intimately versed in every single post that went up or came down, but the-
He was suggesting the country's on the brink of civil war.
Oh, no, no, I know.
Conspiracy theories about the UK police system.
And he compared them to the Soviet Union.
Police officers have been injured, buildings, cars set afire, people in real danger.
Elon is a really terrible person for having done this.
But go ahead.
I think the challenge is people don't want the government regulation of content, right, particularly not in Western democracies.
And I'm very sympathetic to that.
One of the questions is, you know, where is the line with incitement?
He's not actively telling anyone to go do a thing, right?
So there's not going to be a whole lot of action you can take.
And so one of the problems that we have, again, when we get to like, how do you respond?
Elon owns the platform. So the curation and amplification and all of that is set by him, which is an interesting position to be in. I think one of the things that confronts us, though, is the question of what kind of, you know, what can you do about societal responses?
I don't know that there is an immediate solution to the problem.
I wish I had one.
No accountability, right.
Right, because there is no accountability.
Well, there might be in the United Kingdom, even if it's not here, correct?
Right, that may be true.
And again, they have different laws. Where accountability does land is on the people who go and do the action, right? When you see this in January 6th,
right, the actual people who went into the building are the people who bear the consequences
because they then took an action in the real world. So it may not be Facebook or Rumble or
Parler or whatever. Right. And that's consistently where we've seen, you know, the punishment goes to the people who actually do the thing, right?
Right, right.
Who take the violent action, which is reasonable.
But they could then point to this.
I mean, just for people to know, there was a killing of three young girls in Southport in England,
and it escalated violent demonstrations across the country after a false suspect was named online.
Demonstrations fueled by anti-migrant, anti-Muslim lies that were spread on X, and by people who just like to fight.
In under 24 hours, a false name had already received over 30,000 mentions from more than 18,000 unique accounts on the platform.
It was amplified by far-right leaders.
It was trending.
It was recommended to users through the algorithm.
And then its owner did that.
Anyway, speaking of, I want to talk about the upcoming election.
In every episode, we get a question from an outside expert. Chris Krebs called in
a question for you. Let's play it. Hi, Kara, and hello, Renee. What a timely interview.
So I guess my question is this. In the eight years since 2016, the Russian interference in
the 2016 election, is it really the Russians that we need to be
worried about? Or has that playbook that they've developed of finding seams in society,
elevating certain discourses and then pouring kerosene on both sides of the divide and stoking
that fire, has that playbook been picked up by other actors? And is it not just the actors we need to
be worried about, but is it also the platforms and the players and the people sitting at the
top of those platforms? Just you think about this all the time. You've written a book about it.
Would love to hear where your head's at right now in 2024, on the cusp of yet another
most important election of our lifetimes.
Yeah, it's a great question.
So the 2020 election work that we did that I described earlier, right,
the Election Integrity Partnership,
and Krebs was the head of DHS CISA during 2020, right?
It was the Trump appointees around the government,
just to be clear, when we did our election work.
Including Chris, who was fired.
Yeah, exactly.
And again, when we started that project, we assumed we would see the Internet Research Agency, the sort of Russian troll factory, with whatever gunpowder it had left. And we did, right? You know, their accounts have been decimated by two years of
platforms, particularly Facebook, constantly taking them down. And so they didn't have,
they were not able to reconstitute that and be influential. We saw the Iranians do some stuff.
We saw the Chinese do some stuff. It was mostly minor and around the edges. But the rumors that undermined and led to
actual real world violence came from authentic American voices, authentic American influencers,
including the president of the United States himself. And that's where you do see that the
recognition that appealing to a niche, appealing to divisiveness, driving people apart, and most
importantly, convincing your faction that they have to be constantly at war with the other
factions, that's so normalized now that the Russian and Chinese and Iranian trolls who exist
are such a small part of amplifying that. So it's real. Democracy doesn't die in darkness. It dies in the full light of day, right?
Yeah, it's not.
I mean, there are a lot of people
who really want it to be Russia.
And the problem is
when it's inauthentic state actors,
we have a really clear,
morally unambiguous response,
which is the accounts come down,
the pages come down,
the network is disrupted,
and that's it, right?
Which they've done.
Yeah, nobody cries for the, you know, nobody's like, oh, they're really censoring those poor
Russian trolls. But when the same thing is done by real people, then we have a speech problem,
right? Then we have a culture problem, then we have a societal problem, and that's where we
actually are. And I think the most important thing to be thinking about now is independent of actor,
right?
Whether it's a Russian troll, we can, you know, moderate them differently.
But ultimately, the question becomes, what do you do to restore that societal cohesion?
What do you do to diffuse that vitriol?
And that's not a technological problem.
It can be enhanced by technology. You can use bridging algorithms.
There's all sorts of different, again, design and structural things you can do.
But the question becomes, how do you diminish the impact of lies about voting manipulation and things like this when people are not going to trust the people doing the diffusing?
Well, you have Tim Walz come in and change the spark plug.
They all feel better.
But, you know, in all seriousness, we've got the presidential election coming up.
Same guy running who has been accused of sparking the insurrection, just like you talked about. Two things. And then I have one final question. How are platforms prepared to deal with the potential election misinformation or propaganda this time around? And what if he does it again, calls for violence after he loses the election? What happens then? Yeah, one of the things they need to do is lay that out unambiguously now,
right? That is, I think, the number one thing that every platform has to do, which is to say,
even if you are a candidate, even if you are an elected official, this is what we are going to do
in response to this kind of incitement. And they need to have that there, and then they need to
follow through, right? And that's the other piece. Will they? I don't know. But, I mean, that is the role.
I mean, these are not poor businesses, right?
They can, you know.
I get it.
But right now Mark Zuckerberg is, look at my pretty chain.
That's where he is.
I mean, there's other people within the – I understand that there is, you know, the visibility of the CEO and the CEO setting the tone. But I do think that to whatever
extent they have moderation and policy teams, they need to be coming out now with what's going
to happen. Who is critical? What platforms are critical to this right now? I think Twitter's a
lot smaller than people think it is. I think Twitter is... It attracts the media. Yeah,
it attracts the media, but it's also, I mean, you know, it can galvanize a group to act,
right? And that's what we were just talking about in the context of the UK, right? It can
provide a spark, create a sense of legitimacy. It also provides a platform where people who go out
and act then share their actions online, right? So it sort of gives them an opportunity to build
some clout. And meanwhile, platforms like Facebook have seriously deprecated political
content already, right? They've already moved away from it. And we just don't see as much in the way
of, you know— So they have to do this in advance before it happens and then say what we're going
to do after it, and then actually do it. None of which they will do. You don't think
so. All right, last question. Are you worried about a similar thing? Because it didn't happen last time. I think there are really challenging questions in front of us. Will it be scattered small-scale political violence?
You know, I think that you're seeing a lot of election officials who are very afraid of that.
You're seeing a lot of election officials who are afraid that it's going to come for them
personally. And that I think is one of the areas where, that's the other area that I think
platforms have to be particularly vigilant on, the need to protect election officials from threats
and doxing and harassment, because these are ordinary people doing their jobs.
And we already saw with the two election workers in Georgia who wound up winning a massive
defamation suit against Rudy Giuliani.
But again, they had to move out of their houses before that happened.
Like the costs are pretty profound.
So are you worried for this election?
Yeah, well, I'm worried about scattered political violence, right?
That I am actually very worried about.
Caused by online.
Yes.
Having seen it in Britain.
Election officials, Democrat or Republican, have made very, very strong statements. And just that sort of bipartisan show that, like, we are committed to a free and fair election. The actual election will be free and
fair. It's a question of what incentivized actors try to convince the public that it is not so if
it does not go their way. That's what I'm most concerned about. All right. Thank you, Renee.
Again, her book is Invisible Rulers, The People Who Turn Lies Into Reality. Thank you so much. Thank you. Cailin Lynch. Our engineers are Rick Kwan, Fernando Arruda, and Aaliyah Jackson. And our
theme music is by Trackademics. If you're already following the show, you are now part of the On
with Kara Swisher conspiracy to take over the world. If not, CIA Renee is on to you. That's so
ridiculous. Go wherever you listen to podcasts, search for On with Kara Swisher and hit follow.
Thanks for listening to On with Kara Swisher from New York Magazine,
the Vox Media Podcast Network, and us.
We'll be back on Monday with more.
Support for this show is brought to you by Nissan Kicks.
It's never too late to try new things,
and it's never too late to reinvent yourself.
The all-new reimagined Nissan Kicks
is the city-sized crossover vehicle
that's been completely revamped
for urban adventure.
From the design and styling
to the performance,
all the way to features
like the Bose Personal Plus sound system,
you can get closer to everything
you love about city life
in the all-new Reimagined Nissan Kicks. Learn more at www.nissanusa.com slash 2025 dash kicks.
Available feature, Bose is a registered trademark of the Bose Corporation.
Support for this podcast comes from Stripe. Stripe is a payments and billing platform
supporting millions of businesses around the world,
including companies like Uber, BMW, and DoorDash.
Stripe has helped countless startups and established companies alike reach their growth targets,
make progress on their missions, and reach more customers globally.
The platform offers a suite of specialized features and tools to fast-track growth,
like Stripe Billing, which makes it easy to handle subscription-based charges, invoicing, and all recurring revenue management needs.
You can learn how Stripe helps companies of all sizes make progress at Stripe.com.
That's Stripe.com to learn more. Stripe. Make progress.