The Problem With Jon Stewart - The Real Election Threat with Casey Newton and Renée DiResta
Episode Date: March 18, 2026

As the Senate debates the SAVE America Act amid unfounded claims of voter fraud, Jon is joined by Georgetown Research Professor Renée DiResta and Platformer editor Casey Newton to examine what actually threatens our elections. Together, they investigate how algorithms are engineered to push users toward platform owners' preferred ideologies, explore the incentives driving Silicon Valley's rightward shift, and discuss how Republicans have weaponized disinformation to undermine electoral trust and rewrite voting rules in their favor.

This episode is brought to you by:
GROUND NEWS - Go to https://groundnews.com/stewart to see how any news story is being framed by news outlets around the world and across the political spectrum. Use this link to get 40% off unlimited access with the Vantage Subscription.
MAGIC SPOON - Get $5 off your next order at https://magicspoon.com/tws

Follow The Weekly Show with Jon Stewart on social media for more:
> YouTube: https://www.youtube.com/@weeklyshowpodcast
> Instagram: https://www.instagram.com/weeklyshowpodcast
> TikTok: https://tiktok.com/@weeklyshowpodcast
> X: https://x.com/weeklyshowpod
> BlueSky: https://bsky.app/profile/theweeklyshowpodcast.com

Host/Executive Producer – Jon Stewart
Executive Producer – James Dixon
Executive Producer – Chris McShane
Executive Producer – Caity Gray
Lead Producer – Lauren Walker
Producer – Brittany Mehmedovic
Producer – Gillian Spear
Video Editor & Engineer – Rob Vitolo
Audio Editor & Engineer – Nicole Boyce
Music by Hansdle Hsu

Learn more about your ad choices. Visit podcastchoices.com/adchoices
Transcript
At Desjardins, we speak business.
We speak equipment modernization.
We're fluent in data digitization and expansion into foreign markets.
And we can talk all day about streamlining manufacturing processes.
Because at Desjardins business, we speak the same language you do.
Business.
So join the more than 400,000 Canadian entrepreneurs who already count on us.
And contact Desjardins today.
We'd love to talk, business.
When West Jet first took flight in 1996, the vibes were a bit different.
People thought denim on denim was peak fashion, inline skates were everywhere,
and two out of three women rocked the Rachel.
While those things stayed in the 90s, one thing that hasn't is that fuzzy feeling you get
when WestJet welcomes you on board.
Here's to WestJetting since 96.
Travel back in time with us and actually travel with us at westjet.com slash 30 years.
Hello, everybody.
Welcome to the weekly show podcast.
My name is Jon Stewart, and I will be your...
your audio guide to this week's podcast.
It is Tuesday.
It is March 17th.
It is St. Patrick's Day.
And I am...
Oh, that's...
It's all blue.
Blue and gray.
I'm in blue and gray.
Does that signify anything?
Honestly, today?
It probably doesn't.
We are still, I guess,
entering week three of our war with Iran.
But I want to talk about a different threat
to the country.
The Senate this week, I believe,
is going to be trying to figure out
bureaucratic loopholes
to try and get their Save America Act through,
even though they don't really have the votes for it
because they could never pass the thresholds of filibuster
and certainly not passage.
But they want to get it done
because they want to introduce the safeguards
to the American electorate
because so many undocumented people,
so many non-Americans skew our elections so brutally
even though they don't,
even though there's no evidence of it.
Other than that, this is an incredibly crucial
piece of legislation that must be passed. So today, what I would like this episode to be focused on
is the real threat ironically to American democracy in our election. And that is the algorithms
and social media platforms that push this bullshit and get it out into the electorate so that it
becomes canon, even though it lacks the evidence for that. And so to get into that topic,
we bring in experts in this field. So let's get to them now.
Very excited as we talk about the real threats to the American electoral system,
to the American Democratic system. We're delighted with our two guests today.
We've got Renee DiResta. She's the Associate Research Professor at Georgetown McCourt School of Public
Policy. You always know how good something is by how long its name is.
And that's the longest name there is.
Also the author of Invisible Rulers: The People Who Turn Lies into Reality.
And Casey Newton, who is the editor of Platformer.
And of course, you all know him as a co-host of Hard Fork, which is the New York Times podcast about technology and the future.
And hello.
Hello.
Hey, Jon.
What is happening with the both of you?
Listen, the reason why we're doing this now.
So I don't know if you're familiar with the Save Act.
It's the Save America Act.
And as you know, we all, that's where we live, America.
And they're going to save it by voting on it.
And what it's going to do is it's going to protect our electoral system and our democracy from the scourge.
Scourge?
Yeah.
Of non-citizen voting, which, as you know, is in the, I think, billions.
For either of you,
let's just start by sort of defining what this Save Act is seeking to accomplish.
Basically, the idea is it's an idea that most people can get behind.
Voting is for American citizens.
Yes.
Let me ask you both now.
Generally, are our elections decided by American citizens?
Absolutely.
Yeah.
So we have an agreement.
We have... you and I are all in agreement.
So why is it?
And this gets maybe to the heart of the issue.
70% of Americans support the SAVE Act because it's going to make sure that citizens vote.
But generally citizens vote.
But 50%, I think, somewhere along that line, believe that undocumented voting is an enormous problem.
And it's done by the thousands and millions.
Why is that?
And I'll start with Renee.
Yeah.
So there's been rumors about that for ages.
and you can look back to, I mean, we can go back to the 2016 election.
You can go prior to that.
You basically see these stories.
We call them tropes in election rumor research.
And I use the word rumor on purpose, right?
A rumor is something where it sounds like it could be true.
It resonates with people.
They think, oh, this might be something that is happening.
They heard it from a friend who heard it from a friend who saw it on the internet, right?
There's a sort of trace back to a claim that some guy said somewhere.
And with this rumor of non-citizens voting, what you hear is this theory that your vote is going to be stolen from you.
Your candidate might lose because somebody who is not supposed to be doing a thing is doing that thing.
So there's a sense that you might be wronged.
And that's why it lands so hard emotionally.
And if you look back, you see the same types of stories, the same types of tropes landing over and over and over again.
You have the bused-in voter.
You have the person voting with their maiden name.
You have the person who is here illegally voting.
These are stories that recur over and over and over again.
And the reason they recur is because they seem plausible, they seem believable.
And most people are not persuaded by statistics.
They don't go and say, oh, you know, Cato says that this is a small problem.
Heritage says this is a small problem.
Brennan Center.
Too conservative.
Exactly.
Exactly.
I mentioned those on purpose.
Exactly.
Yes.
So when you go and you look at even the libertarian and the right-wing studies of
voter fraud, you find repeatedly that when you are looking at honest statistics, when you're
looking at the actual studies of the problem, it is infinitesimally small. But when you go and you look
at social media and you hear people who are sharing these stories that they relate to, that they
feel true, that's why these, that's why these rumors continue to propagate. When we're talking
about social media and those kinds of things and they propagate along that way, is that happenstance?
is that because the rumor mill or how does it propagate?
Why does it propagate?
How does a video of an election worker in Georgia reaching under the table to pull out a bucket of votes that is not in any way nefarious become the centerpiece of these larger conspiracies, et cetera?
Sure.
Well, you know, as Renee just pointed out, there's something really emotional about seeing something like that on social media, right?
somebody's just pulled out a bucket of votes and seems like something nefarious is happening here.
And depending on, you know, what caption the sort of, you know, aggrieved user might put underneath
it, all of a sudden it's going to start getting that engagement, right?
The algorithm is going to say, hey, this seems like it's pretty interesting.
We're going to show this to a lot more people.
And over time, the elites of the sort of the Republican Party, whoever can sort of use this
to their advantage is going to say, aha, this is something that I can use to sort of make my
case.
And so that's that kind of, you know, the algorithms and the elites are kind of working hand
in hand to spread whatever kind of emotional rumor might serve their cause.
And these causes, I mean, ultimately the aim, genuine or disingenuous, of protecting the electoral
system, you would consider it to be, you know, an honorable one.
You don't want...
And there are times that it does happen.
I think there was a study that found, since 1982, there were almost 1,500 people.
Right.
Total. Now, you could say, well, in small elections even one vote, two votes, three votes can make a
difference, which is true. But it's very clear that the irony of this is that the larger
threat to our electoral system and our American democracy is the manner in which social media
can spread these tropes and these inaccuracies to a
much wider group of people and light these fires. And is that the type of thing? We're utterly
ignoring the actual threat to our democracy. Would that be accurate, Renee? Well, yes. The challenge is
that there's not a lot you can do about that, because of the way that, as Casey was describing... here, this is the problem.
I thought we were fixing this today. Son of a... So, you know, I worked on a project called the Election
Integrity Partnership, where we just traced rumor after rumor after rumor. And we wrote in real time
what was happening, how it started, where it started, what to the best of our knowledge, what the
truth was. And you're doing this. This was at Stanford University. This was when I was at Stanford.
This was in 2020. We did this in 2020, which was, of course, the year of Stop the Steal, right?
Remember. And so as we would trace these stories. And what Casey's describing is true, you have the
influencers and the algorithms. But the third piece of that is the crowd, right? The online community
that surrounds the influencer that believes it, that amplifies it, and that moves it from platform
to platform, right? People are the glue between the online platforms. Just because one platform
maybe has a policy that says, we're going to moderate this content doesn't mean that all
platforms have that, first of all. And second of all, again, like I said, you can't fact-check
your way out of this stuff. When you try to do that, people just feel that their voices are being
suppressed, if you try to silence the rumor, right? If you kind of nuke it and stop it from
trending, as happened occasionally in 2020, then they believe that there is, you know,
they are trying to prevent you from knowing the truth. And then that becomes kind of a second
order, you know, we call it the Streisand effect, right? This idea that you are actually
amplifying the theory by trying to suppress it. So one of the things that you have to try to do then
is counterspeak against it. But the problem is, oftentimes election officials,
They are not influencers, right?
They do not have very...
I mean, let's be honest.
They are sometimes just mail carriers and nurses and things.
Shame on them.
They have other jobs.
They have jobs.
They have elections to run.
They haven't been weaponized by dark money that goes into the system.
But they've got small followings.
And all seriousness, they have small followings, right?
They're out there.
Again, they're trying to put out facts.
Facts do not land against an emotional story.
And the way that that rumor mill works, one influencer says it, another one boosts it,
big if true, have you heard? You know, it's viral by the time the guy with 200 followers is like,
actually, let me tell you about how those ballots actually work. Let me tell you why this rumor isn't
true. That guy's not going to get amplification. So unless the platform is actively trying to
uprank and surface good information, which is something they were trying to do in 2020 and no
longer are. We can talk about why that is. Unless they are trying to actively uprank good
information, the good information is not making it out there. And then the other piece of that is
that the deeply distrustful crowds that have been taught that the election is going to be stolen,
right? They have heard this over and over and over again, are not inclined to believe the fact
check or the information that the election worker is going to put out. Right. Well, we see that,
Casey. And, you know, look, 2020 was stopped the steel. And there was all this,
and the Democrats had rigged the election, and we're going to have to get the
cyber ninjas in there to figure out exactly what went wrong, and it routed through Venezuela, and
China interfered in elections. And then suddenly in 2024... uh, no, that one, that one actually was pretty good.
That one, everybody forgot to do that. That one worked pretty well. Yeah. And so it really does seem
to be an argument of convenience. Yes, absolutely. It's an argument
of convenience. I also think that another thing that's important to highlight here is just the demand
for these narratives, right? In a moment in 2020, when Donald Trump had clearly lost, there was a huge
demand on the right for that not to be true. Once he won in 2024, that demand sort of went away,
right? And so that energy was able to go elsewhere. But I think it's really important to talk about
the demand side, because the algorithms, they're super important. But as Renee said, the people are
what's gluing that together. And they just sort of want certain things to be true. And in the media
environment that we have now, they can just kind of go out and pick their own reality based on what
they want. Talk about that for a moment. So if we want to think about this in sort of economic theory,
you're saying that there is supply side, I'll say misinformation. People get it wrong. That just
happens sometimes out of good faith. But the more nefarious one that's been weaponized is
disinformation. So what are the elements of supply side disinformation? And then we'll talk about
demand side, but supply side disinformation, what would be considered the elements of that?
That's where you start to see people who, again, are incentivized to seed content, to try to put out
plausible theories, to keep hope alive, but really more importantly, to cast doubt on the integrity
of the election. And that's what you saw a lot in 2020. The idea that you could just offer yet one more
justification, one more reason, one more variant on a theory. The reason we use rumor actually is
because you don't even have to know what the intent is. It's just a story that is passed from
person to person that resonates emotionally. With disinformation, we talk a lot about foreign actors
who are in the mix too, right? You have these agitators, these people who are in there because
they see an opportunity to advance their own cause. Donald Trump is talking about Iran.
That blew me away. Because, you know, of course, disinformation was a word that we couldn't say
for a period of about, you know, three, four years there. But now, now we're talking about it again
because it was a thing that did in fact happen. It was not particularly major or significant in 2020 or in
24, far less than what we saw in 2016. But, you know, you do have foreign actors in the mix.
And these are bots? And they're seeding the narrative with paid bots? Or is this the kind of
thing where they talked about, you know, there's a 16-year-old in Uzbekistan and he's being given money to
invent stories that sound plausible and then seeding the turf. You know, Maria Ressa talks about this
often the idea that a lie spreads seven times faster than the truth. So these are people absolutely
with intention and purpose, sowing the seeds of confusion and misinformation. That would be the supply
side actor, in league with the paid influencers whose profiles are boosted by the algorithm.
Would that be accurate?
That's pretty accurate.
So again, you have a different, the accounts that are content creators and the accounts that
are amplifiers are not the same.
The amplifiers are, this is where you see bots usually, right?
And an amplifier would be the sorts of accounts that just click the like button or click the
retweet button on Twitter, click the share button.
And the reason is you need to have engagement in order to trigger the algorithm to share it out
to more real people.
So the reason that you have automated accounts, the reason that you can
use fakes or rent networks of accounts that are used in commercial spam, the reason you see those
accounts come in to the, on the supply side, is that you need to have engagement. Something has to
get those like counts up, and that's where you see those automated accounts. And then the,
as you're describing, accounts that are actually writing the content or saying the thing,
you do want to have some sort of legitimacy or trust there. And that's where, again, you'll see
sometimes they'll be paid and sometimes they'll be an account. That's where the Catturds of the
world will step in and help. I mean, I think that people sometimes underestimate with some of those folks
just how ideologically motivated they actually are. Oh, I don't know. I don't underestimate that at all.
I think they are absolutely purely ideological warriors, but are sometimes shaped by the financial
incentives that go in there. They've become that. It becomes their identity. They start to earn money on it,
which brings us to the point. And Casey, you know, this will be.
I think be kind of in your wheelhouse.
Let's talk about the artist formerly known as Twitter.
So, and this gets us to the crux of the irony.
Elon Musk for a long time and really incredibly consistently and vehemently,
has pushed this idea that undocumented non-citizen voting is rampant.
It is sowing the seeds of our destruction.
and we cannot do it.
He's tweeted about it.
I think 1,300 times or interacted with stories about it.
The irony of it all is that this guy's platform, this guy's algorithm, which he is in charge
of, I see his shit on my feed all the time.
I don't ever interact with it.
There's no reason for it to be there.
That he is a far more relevant actor
in the warping of our democracy through his money and his algorithm,
than any measure of undocumented non-citizen voting will ever be.
Absolutely.
I mean, there was a paper published in Nature in February,
and they did a study where they had two groups,
and they showed one the sort of ranked algorithmic feed,
and then they showed one group just sort of a chronological feed,
and they found that people that saw the algorithmic feed on X moved further
to the right than the control group by like a very significant measure, right? So if you're actively
using X, you are probably subconsciously moving a little bit to the right over time. And as you
point out, Jon, that is just a far greater effect than what are essentially these mythical cases
of an undocumented immigrant voting in one election somewhere. Let's break that down because it's
very easy to cast aspersions. What his argument, and I think his people's argument would be,
goes, well, now that we're getting uncensored material, now that the First Amendment has primacy,
people move to the right because they learn the truth. But the truth is that algorithm
incentivizes the misinformation from the right, and he designs it. Absolutely. And at Platformer,
my newsletter, we broke the story a couple years ago that after one of his tweets did not get
as much engagement as Joe Biden's during the Super Bowl, he went back to his
engineers and he said, you need to re-engineer this so that my tweets are getting more prominence.
And so they did. And so that is the reason why, even if you don't follow the guy and you're
using X all the time, you're going to see his views, which contain, you know, so many just
various, like right-wing ideas and conspiracies. Like, he built the algorithm this way. And
like, if you're like me and Renee, and you've been covering this stuff for a decade,
in 2020, conservatives were holding hearings saying platforms must be ideologically neutral.
You must never suppress any sort of speech. Why aren't, you know, the right and the left should be
equal on these platforms. Fast forward to today,
X is a right-wing political project, full stop.
Look, the algorithm is killing us. The algorithm, the way that it incentivizes the
hostility and weaponizes ideology and all the... it just, it's, it's not right.
But the antidote, the antidote is information. And that's where ground news comes in.
Ground News, it's this website and app. It's designed to give readers a better way, an easier way,
to navigate the news.
It pulls together every article
about the same news story
from all outlets all over the world
and puts them in one place
and not, not incentivized
for like the worst, most hostile,
most partisan take.
It tells you where it's coming from.
You can see starkly in black and white
how these different organizations
and algorithms are manipulating the information
that we get.
They show you how reliable the source is and who's funding it.
Who's funding it?
Follow the money.
Know who's behind the headline.
Oh, who is this, Rupert Murdoch fella?
He seems delightful.
He seems to have a somewhat pointed view of the world.
I'm telling you, man.
The Nobel Peace Center has even mentioned that Ground News is an excellent way to stay informed.
Nobel Peace Center.
That's, I think, the one that Trump started.
I think it 3D prints Nobel Peace Prizes.
It just hands them out.
The platform's independently operated
supported by its subscribers
So they stay independent
And they stay mission driven
They don't get sucked into the slop
If you want to see the full picture
Go to Ground News
They can help you through the noise
And get to the heart of the news
Go to groundnews.com slash stewart
Subscribe for 40% off
The Unlimited Access Vantage Subscription
discount available only for a limited time.
This brings the price down to like $5 a month.
That's groundnews.com slash stewart or scan the QR code on the screen.
Now, Renee, they actually came after your group pretty hard.
Tell the story of that.
So your group is studying how these things spread.
And by the way, later on, we'll get into the balance between...
because I do think there are First Amendment concerns with a lot of these different things.
And that's maybe why it makes it more difficult to do that.
But Renee, what happened with the Stanford Research Group that you were a part of?
Yeah.
So we ran that project in 2020.
And what we did in 2020 was we were tracking these election rumors.
And we worked, we had a, we set up a tip line.
Right.
And we sent an email to the RNC.
We sent an email to the DNC.
We sent it to the NAACP, AARP, a bunch of these civil society groups saying, hey, we want to help.
because one of the things we can do is we can trace where rumors come from where they're going and we can try to get fact checks out.
And per the point about the election officials, they have an election to run.
Their job is not to be sitting on social media trying to triage and figure out if rumors are disinformation.
This was the first major election since Russia in 2016.
We thought we were going to see a lot of state actor stuff.
That was one of the reasons why we did the project.
It turned out that most of the rumors about election theft, most of the rumors about delegitimization, most of the stuff trying to suppress the vote,
came from the sitting president of the United States, which we wrote about.
That's reality. I'm not going to sugarcoat it, right? So we write about that. And as we go
through this project, there's four different research centers that are participating in this,
120 undergraduate and graduate students that are the main analysts on this project. And we have a
Jira ticketing system. If you've ever called in to a customer service hotline, somebody
like makes a ticket for you and that ticket kind of goes around the, you know, the organization,
the building, and, like, you know, different people work on your ticket. That's how we traced
these things. So what is the attempt that you're trying to map? What are you mapping?
Yeah, so we're tracking rumors as they go viral and then we're trying to get them to people who can
respond to them. So that might be the platform. Sometimes we would tag a platform in and say,
hey, Facebook, hey, Twitter, you have this thing that's going viral on your platform. It violates
your policy. Go have a look at it, right? And then the platform, you'd see this in the little
ticketing tags. They would say, thank you very much. We're looking.
And about 60% of the time, actually, they would do nothing.
30% of the time, they would slap a label on it saying this content is disputed.
You know, Donald Trump would say something about mail-in ballots being fraudulent.
They would say this content is disputed, get the facts about mail-in ballots, and they would link you out to an information site.
About 10% of the time, they would take something down.
They would decide that it rose to the threshold of actually actioning it with a takedown.
So in the course of the full period of the election, we sent about 3,000 URLs in total, right?
3,000, that's actually very important that number. So we also communicated with state and local
election officials. They had access to our tip line. So a local election official in Kentucky,
for example, sent in a tip saying there is an account pretending to be an election worker.
I don't know who this person is. They're claiming that they're destroying ballots. That was the
kind of thing that we could then go and look at, see, hey, does this look like it's foreign?
Does this look domestic? Is this something a platform should be tagged in on? Just like a triage
center. You're like an election observer, but rather than existing kind of in the practical world,
you're doing this in the online world and virtually and trying to point out inconsistencies
and things that may be troublesome to be investigated. Correct. Seems above board. And so fast forward to,
so we did this project, by the way, there was a 200-page report that sat on the internet after it was
all done. We wrote about it. And we made a table where after the election was over,
We did a data pull on Twitter and we pulled in the total number of tweets of the most viral election rumors, things that everybody has heard of.
Dominion, right, that Dominion machines were flipping votes, that there were Italian, you know, these Italian space laser theories, right?
That the Sharpie markers in Arizona had changed ballots.
So the top 10 most viral rumors that everybody saw, we added up the number of tweets.
It was 22 million tweets.
Jim Jordan
22 million
viral tweets
and this was the number
that we put out there
just showing the scope
and the scale
of how much stuff
had been making the rounds
on these very,
very viral stories.
Right.
Jim Jordan.
Jim Jordan is a congressman
from Ohio, very respected.
Extremely honest man.
Extremely honest, legendary.
There's not a piece of legislation
that has passed in America
in the last 20 years
that does not bear that man's name
as a co-sponsor.
Couldn't have a grander reputation.
Sports coach. A sports coach who really, really cares about the youth.
There's some issues about his time as a wrestling coach that may be slightly untoward,
but has completely turned it around and is now a paragon of American sensibility and legislation.
Continue.
Also, happened to be an election denier.
Wait, what?
Son of a...
So.
All right.
So then he goes.
So the election deniers, you know, the House flips.
And Jim Jordan gets his gavel and starts this
committee called the, I forget the proper name, which just shorthand is the weaponization
committee, but it's the subcommittee of the House Judiciary Committee to investigate the
weaponization of the federal government. He decides there has been a Biden censorship regime.
And even though the agencies that we engaged with during the 2020 election were run by
Trump appointees, again, run by Trump appointees, that despite the fact that we were talking
to state and local election officials and occasionally when we did speak to federal government
agencies. Like when the Iranians ran an influence operation pretending to be the proud boys,
we did talk to the FBI about that because our team saw that early on. We did speak to the FBI.
Trump appointees, you know. But these are real. These are real. These are real things that are
happening. The Iranians literally tried to pretend that they actually did that. Yes. And yes.
And Trump was very upset about this like a week ago. That was one of the justifications
apparently for why we just, you know, bombed Iran. But as we're doing all this work,
as we're doing all this work, we're talking to DHS CISA occasionally also.
As this is all happening, Jim Jordan gets his gavel two years later and accuses us of censoring 22 million tweets, that we were part of a vast plot by the Biden regime to steal the election by censoring 22 million tweets.
So again, that number that we added up after the fact, of the things that everybody saw, they claim was really the stuff that we censored.
They're saying that, and forgive me if this is wrong, but I'm just going to try and think of their theory of the case.
Is their theory of the case, and Casey, weigh in on this as well, that by pointing that out,
that you are in league, that you are intimidating the social media platforms to go through
and call things you consider misinformation or disinformation.
And by doing so, you are unleveling the playing field.
Would that be their theory?
And where does the 3000 come in?
So the 3000 would have been the thing to actually have that conversation about.
The 22 million number was published in March of 2021.
So long after January 6th even, right?
So long after that would have had an impact.
The way that platforms engage with researchers, which I think is worth the public understanding,
is that platforms will reach out and they will periodically say,
hey, we're considering doing a policy about this. What do you think? And then you can weigh in on that
policy as an academic researcher who works in a particular field. This is not a secret, right? Twitter had
a council of 60 different civil society organizations. We were not on that council, by the way,
but Twitter had its civil society organizations council. And so whenever they were writing a policy
about hate speech or about harassment or whatever, you know, the issue
that those councils dealt with, they would reach out to those entities and they would say, hey,
we're going to launch a new policy about this. You guys have an opportunity to provide some
feedback. And the reason for this is because back in 2016, 2015 timeframe, nobody on the
outside was engaging with them at all, right? All of their policies were developed entirely
internally. And that didn't make people happy either because then it was just one guy, basically,
the CEO of the company, Zuckerberg, Dorsey, whoever it was at the time, making that determination. And so
the idea behind coming up with councils or reaching out to academics is that you have an opportunity
to say your piece. And again, as I mentioned, just because you say something doesn't mean that
they listen to you, which is why when we published that report and we said we sent in 3,000
tweets, we were absolutely transparent about this. It's, again, it's sitting up there on the internet
for two years. And we also said they ignored, they did not act on 60% of those 3,000 tweets.
Right. So what you see from that, again, is they did not feel pressured to do anything in response to what we were saying or what we were suggesting. They took things under advisement and they occasionally acted. But more often than not, they did nothing. Only when it rose to a certain standard.
Right. And most of the time, they put a label on things. And that, I think, is also important to understand. So we were essentially scapegoated because Jim Jordan and the election deniers needed to come up with some justification.
after the fact for how the 2020 election was stolen.
Yes.
The irony of investigating you for weaponization was weaponization.
So that goes.
But Casey.
Yeah.
Well, I mean, it absolutely was.
You know, they wound up shutting down the center at Stanford where Renee worked or at least,
you know, prevented them from doing the kind of research that they were doing.
You know, they're filing like lawsuits against undergraduates who like dared to study what
was happening during the 2020 election.
So the weaponization was truly coming from inside the house.
But also, like, I really do not want to give these guys too much credit and say, like, you know, there was some, like, principle that they had to defend. Like, if you've watched these hearings, it truly is just about creating a spectacle and manufacturing this sense of grievance that will then enable Republicans to take further steps to disenfranchise American voters. Like, it really is that simple.
Well, let's see, though, if we can play devil's advocate and try and figure out in the interest of fairness, what is the...
glimmer of truth within whatever it is that they're using to do the weaponization.
So let's go back to, I don't think you would say that the culture of the social media platforms
had a liberal slant to it. I think we all probably agree it did. If you think about Facebook and
Twitter and those companies, they are steeped in probably at least an aesthetic amongst the
workers that leans maybe liberal. Would that be fair?
I think so. Or did. I mean, I think like the most liberal that they got was like if you looked at the content policies they had, they were like steeped in the tradition of human rights. You know, like they believed that hate speech was bad and that you should try to stop people from seeing that if they were part of a protected group. But also in part the culture, when many more people were on the ramparts about the usage of certain words or various things. I'm just trying to get at like the psychology of where this
is. So Elon, they recognize that these are powerful tools. So we're going to walk back a little bit
just to get to kind of the genesis of this. Mark Zuckerberg does his Zuckerbucks, spends $400 million
ostensibly to beef up resources. This is during COVID. So maybe they're putting up
plexiglass on things. They're getting people more access. But he has the misfortune
of spending $400 million on an election Donald Trump lost.
Right.
Right.
So that also becomes part of the narrative.
Yeah.
So I'm just trying to walk through so that the culture is,
maybe you consider it liberal.
Zuckerberg spends all this money.
He doesn't do it ideologically.
Combine that with then, Musk, who is having during COVID
an ideological rebirth, getting in touch with his South
African roots, if you will. And we get into this idea of the Twitter files. He buys it because he's so
disgusted by their censorship. And to be fair, during COVID, there was information that the government
put pressure on these social media groups to remove, and that information they asked to be removed,
did not necessarily turn out to be wrong. Yeah. Would that be fair? Yeah. There was definitely pressure
and just sort of like, you know, things the government said related to, like, you know, masking and how is the virus transmitted?
There were things that the government said that turned out not to be true.
I think it's also one thing I'll note on that front.
There's the reality.
And then again, there's the exaggeration.
Yes.
I'm trying to get to the reality as well.
But go ahead.
Yeah.
Yeah.
So there is, there is, so there's a court case that you're possibly familiar with, right?
The Murthy v. Missouri case, Missouri v. Biden, where you see this litigated and it goes all the way up to the Supreme Court.
Explain very briefly just the genesis of the case in the background.
So two election deniers, the attorneys general of Missouri and Louisiana filed a, no, I think
it's really important again to get the underlying motivation, right?
It's just so wild to like two election deniers, the attorney generals of Missouri and Louisiana,
one of whom is now the sitting senator from Missouri, Eric Schmitt, right? So, yeah, no, let's
be, look, I think it's really, again, I think getting at the motivation is something that I feel like
mainstream media dropped the ball on candidly. And I'm going to be angry about that for a long time.
Understood, as you should be. But so they file this lawsuit alleging that, again, there is a Biden censorship regime, which somehow started in the Trump administration, but holding that aside, that they eventually stopped focusing on the election because they have to deal with the inconvenient reality that at the time, these appointees were Trump appointees. So what they do to get around that inconvenient reality is they allege the deep state.
right, the unfalsifiable claim of the deep state.
If you worked there at the time and something went inconveniently for Trump, it was the deep state.
But then holding that aside, we're just going to jump ahead into the future.
And now it's Biden during COVID.
And so you do see, again, you do see the government reaching out and communicating with the platforms.
Now, the government has First Amendment rights and the government communicates with the platform as well.
Now, very clearly, did the government under Trump also reach out to the, in other words, is that something that happens?
100% across the board.
Exactly.
So that's an important thing to remember.
Right.
And they're still doing it, right?
They're still reaching out today complaining about platform moderation of ICE-related content, right?
So again, platforms and governments have had this back-and-forth tension since platforms
have existed, right?
And not to put too fine a point on it, but in all the complaints about Zuckerberg being intimidated by Biden,
Donald Trump threatened to jail Mark Zuckerberg if he ever did anything like that again.
So if we're ever going to be talking about government intervention and intimidation to a social platform, let's be fair that it's one thing for the government to reach out.
It's another thing for the president of the United States to say, and I will put you in jail.
Yeah. Imagine if Joe Biden had said, I'm going to jail Elon Musk if I lose the election, right?
Like, you know, conservatives would have lost their minds.
So in 2018, you also saw threats by Trump, and an executive
order even, to try to revoke platform liability protection, right? So platforms have liability protection.
That's that rule 230? CDA 230, yes. Yes. So what happens in this lawsuit is that the
attorneys general of Missouri and Louisiana sue with a judge. I'm currently being sued in front of
that judge. What? Why? Well, Stephen Miller sued me in front of that judge. We can talk about that after.
No. Yes. Wait. Because this is a machine, you understand.
Ghoul Stephen Miller? That Stephen Miller? Yeah, yeah, yeah. Yes. Dead-eye Stephen Miller.
Yes. Stephen Miller that walks by plants and they die. That's Stephen Miller.
Yes. It's an honor, but yes. Wow. Wow. All right. We'll get into that later.
Yes. So he- Don't let me forget that.
They pick this judge, right? They pick this judge. And they file this lawsuit. And they allege that the Biden administration did what's called jawboning.
So again, because the government has First Amendment rights, the government can communicate,
with the platforms. The question is, does it rise to the level of coercion? Does it rise to the level
of the government saying, for example, nice platform you've got there, shame if anything happened to
it, if you don't do this? Or if I'd have to jail you. Hypothetically. Hypothetically.
And so they do all these depositions, right? And they're deposing Fauci. They're deposing, you know,
FBI agents. They're deposing Department of Homeland Security agents. Because what they're
trying to find is evidence that the government was secretly demanding that platforms take down content.
And what they do encounter are these emails from
Rob Flaherty, the White House Digital Director, where he is sending emails saying, like,
what the hell happened here?
You need to explain yourself.
If you actually dig in, a lot of those emails that become very notorious are Rob Flaherty
asking about the White House's own Instagram account.
So again, you have the grain of truth.
where the White House is occasionally communicating with the platforms using strong language.
But the stuff that they really blow up, the stuff that they really make, you know, these huge media moments.
This is the Twitter files in the whole thing.
Right. If you actually kind of delve down into it, what you find is that it's like the Twitter files, literally they were taking emails and cutting them in half and pretending the top half of the email said something it didn't.
So it's just the most incredibly dishonest misrepresentation of the actual evidence as they walk through these.
these cases. And this is reflected then in the Supreme Court finding, which is that the judge that
they kind of cherry pick in Louisiana says, this is the biggest censorship effort the world has ever
seen, issues an injunction, actually issues an injunction, you know, this very broad spectrum
injunction saying the government can't possibly talk to platforms. This becomes a problem.
The Fifth Circuit Court of Appeals, which is very conservative, actually walks back that injunction,
which is a remarkable thing to see. It eventually makes it up to SCOTUS. Amy
Coney Barrett writes the opinion and she says there are clearly erroneous findings by the lower court.
The evidence just doesn't stand up here. And she tosses it for standing because what she says is that none of the plaintiffs in the case, for example, Jay Bhattacharya, right, who is now the, what is he?
Jesus. CDC head at the moment.
Yes. CDC head, yes. It's like musical chairs with these guys and the health officials these days.
But so the, so NIH head, CDC head, whatever his current role is, both. He is that. He is that.
And he is, you know, he accuses the administration of censoring him, but there's not a single email in which the White House so much as mentions him.
And so Amy Coney Barrett tosses this back, tosses it for standing, and says the lower courts have these erroneous findings.
And so it's basically kind of, you know, kicked back down.
But that's the kind of thing you get from these soy-latte-drinking justices.
This Amy Coney Barrett, if I see her any more with the little kitten-ear
hat and the resist signs, I'll lose my mind. Casey, this points to a really interesting dichotomy.
Yeah. The difference between the court of social media and the court. Yeah. And you find often that
a lot of these weaponized complaints and all these things don't bear. Let's go back to the
2020 election. None of their complaints withstood the scrutiny.
of courts, withstood the scrutiny of bodies that have evidentiary standards.
But in many respects, that's not really what matters here, is it?
No, I mean, when you are just browsing your social media feed, your evidentiary standard is this, does this feel true to me?
Does this justify whatever I thought before I opened the feed?
If so, I'm going to share it.
And I think what's really scary about that is that the particular kinds of things that we're talking about today are being used as pretext to disenfranchise American voters.
Like that's the ballgame. Do we get to pick our leaders or not?
Well, talk about that for a moment because the idea is, so this is not benign.
No.
The idea of just saying like, oh, people need to present ID on its surface sounds wildly reasonable.
But underneath it are a lot of issues, like women who changed their maiden name to their married name
would have to then somehow find their birth certificate and go get a passport.
Like there's a lot of hoops to this.
Yeah, I mean, absolutely.
I saw one study that said that there may be as many as 69 million women who took their spouse's name
and don't have a birth certificate matching their current legal name.
you know, women are maybe more likely on balance to vote for Democrats than Republicans. And so if you're
a Republican and you're pushing this, like, you probably don't care that married women may be less
likely to vote in the next election, right? We also know this is probably going to affect a lot of
trans voters. If you're Republican, you'd be happy probably if no trans voters voted in the next
election, right? So when you look at the groups that are affected here, it is just generally people that
Republicans could stand to live without ever voting again. And let's just to put a fine point on it,
The reason why we're talking about this today is the Republicans are considering blowing up the filibuster,
which I really don't give that much of a shit about to begin with, and doing all kinds of things to pass this act that's going to raise identification standards so that only American citizens vote, which does not appear to be a problem of any substance, while protecting the actual mechanism that seems to be distorting,
American democracy.
And I want to get into, and Renee, I'll ask you this because when you could say, like, well,
what's the game for them?
The game is, let's look at Elon Musk's net worth by creating this algorithm on this platform,
by donating $350 million to Donald Trump and Republicans.
His net worth has skyrocketed.
And the AI tech guys, they've all benefited in a wildly
disproportionate manner through their coziness to this administration. Would that be a fair statement?
That is a fair statement. It's about maintaining power, right? One of the things with social
media is that they're tools of reach, they're tools of persuasion, they're tools for
organizing and gathering and activating. And when you have the capacity, particularly with something
like Twitter, which is very, very good at activation, you are, you know, controlling an incredibly
potent infrastructure. One of the things that happened after Elon bought Twitter is that you saw
influential accounts on the left leave. And so there's been this fragmentation to a bunch of
different platforms. You've got Bluesky, you've got Threads, but there hasn't really been any kind
of cohesion. Right. There's no real competitor, even as of now. Yeah. There is no competitor in that
regard, particularly for things like breaking news or shaping information in the moment. There's Instagram, which is
great. You can grow very large accounts, very large reach. There's a lot of
political influencers on Threads who are reaching left-leaning audiences, but it is not
the same type of algorithm.
It is just not the same structural function in the political discourse.
And that is the thing that is significantly different.
And in terms of, I mean, and we're talking about every one of these guys.
And that's the other thing.
It's not just the algorithm that skews, you know, our democracy.
It's the money.
And since Citizens United, I've just got a little list here.
Elon Musk donated $250-plus million, right, and has gained $234 billion.
Bezos paid $40 million for a Melania documentary and another $40 million probably in, you know, advertising and everything else.
Cut down on the Washington Post. He's up, you know, $15 billion.
Zuckerberg, whose Zuckerbucks were, you know, so crucial to stealing the election, did an investment pledge, you know, and even said, you saw the media,
when he said to Donald Trump, how much should I say I'm giving?
All these guys are, are they mercenaries?
Are they just cozying up to an administration?
Or are they ideological brethren now with them?
So this is really important to talk about because I think the true ideology is capitalism, right?
Like you go back into the 2010s, most of the people that you mentioned, with the exception of Musk,
they were essentially good liberals.
Although, of course, you know, Musk sort of had his flirtations with liberal causes as well.
And shouldn't we have complained about them then, though?
Shouldn't we have come?
Just because we thought their aesthetic and their mentality was that, shouldn't we have been complaining about their algorithms and their influence and their money then?
I think like going back, there were a lot of complaints about algorithms, that these places were increasingly becoming these centralized, like, centers of speech that did not have a lot of democratic
oversight or control. Like Mark Zuckerberg has total control over Meta. Even his own board doesn't get a say,
right? And so all these guys give a lot of money to Democratic presidents and Democratic causes.
And in the end, they just don't get that much for it, right? Joe Biden tries to break up meta.
He tries to break up Amazon. He creates various regulatory problems for Elon Musk. And at the end of the day,
these guys are transactional. Right. They hire Lina Khan. Yes, the audacity of that, right? But then Trump comes
along and you can just agree to build part of his ballroom and he gives you whatever you want.
That's a very recognizable character to a business person, right? It's like we can just kind of
strike a deal. So that is what you're seeing across American politics now. It's just a bunch of
oligarchs who've grown impossibly rich and powerful who are just able to buy what they want.
And protecting the turf. Where do you put Sam Altman in all this? He's another one that I,
he just seems to be kind of this weird character that shape-shifts for whatever the moment calls for.
He'll stand up and say, Anthropic is doing the right thing, and then vacuum up their contracts
when DOD cuts them loose.
He is a shapeshifter.
Like when you talk to people who have worked with him, they will tell you that one of their
biggest issues with him is that he is always telling you what you want to hear.
It's why he's actually quite charming in person.
Politically, he has probably been a little bit more like liberally aligned.
Like me, he's a gay guy.
And I think that's where his natural sympathies are.
But if you ask him about Trump today, he's incredibly careful.
I asked him on stage about Trump last year, and he said, well, you know, I think he's really, really thoughtful about AI.
It was news to me, John, but that's what he told me.
I'm a cereal guy.
But I got to tell you, when you're a little older, and not so easy to find, you know, it's not as cute when you're going through the, whatever they call them there, the stars, clovers and mushrooms and being like, oh, right, but my cholesterol is 187.
It's just saying, cereal's not necessarily the best thing for you anymore. Except now, Magic Spoon.
Magic spoon.
It gives you that feeling, Saturday morning cereal, while you get their 13 grams of protein,
zero sugar, five grams of net carbs per serving, which is how I always chose my cereals when I was younger.
I used to say to my mother growing up, how many?
What's my net carbs here?
Five grams, seven grams.
What are we dealing with?
But this stuff, Magic Spoon, keeps you fueled, whether it's breakfast, late night snack,
post-workout, whatever it is.
They got flavors, too.
It's not just one thing.
You got fruity, roasted, cocoa, cinnamon crunch, marshmallow, s'mores, all the stuff that you love.
Magic Spoon.
Look for Magic Spoon on Amazon or at your nearest grocery store.
There are plant-based versions of the cereal as well.
Even vegans.
Could you feel like they had a challenge?
You'll find vegan options at Whole Foods.
Or get $5 off your next
order at magicspoon.com slash TWS. That's magicspoon.com slash TWS for $5 off.
As you speak to these folks, their sense of Donald Trump, you know, look, Elon Musk said,
and I asked him about this once, he said, I'm a free speech absolutist. So I said to him,
so how do you support Donald Trump, who clearly has said he wants to censor content,
he disagrees with. He threatens to throw Mark Zuckerberg in jail. And he said to me flat out,
oh, that's just bluster. But now you see they're weaponizing that censorship for FCC approval
and all kinds of other things. Is he just utterly full of shit? I mean, he himself has said, he
himself has said, oh, those people are treasonous and should be thrown in jail for saying things
he disagrees with. So he's just utterly full of shit. Yeah, this is a man.
who when he took over Twitter, he started banning journalists because they put their Instagram bio in their, in their, like, Twitter bio, you know, he rewrote an algorithm to privilege his own speech over that of others.
He banned people from Twitter for publishing the whereabouts of his private jet.
Like, the list goes on and on.
The guy has never cared about free speech, except insofar as that benefits him.
His own speech.
Yeah.
René, you were going to say something.
I was going to say, I think it's really important to understand the word censorship. Free speech activists
on the web have cared about this for a very, very long time, right? I mean, the freedom of speech,
not freedom of reach thing that sits on top of his content moderation policy was something
Aza Raskin and I came up with in 2018, right, that argument that you should be able to maximize
content, you should want as much to stay up as possible. And then at that point, you think about,
like, how do you decide what to curate? How do you decide what to preferentially amplify?
When you have a crisis like COVID, it is not bad for a platform to decide, hey,
maybe we should in response to search queries have a little knowledge panel up at the top that
returns something from the CDC, right? Because people are looking for accurate information.
They made perfectly reasonable decisions to uprank stuff. Did they take too much down?
Yes, you can make the argument that they absolutely did.
No, I think that's fair. I think they did. I mean, and I think they would even maybe cop to that now.
I think that they would say that, too, again, when you get to certain types of content moderation
policies like the lab leak hypothesis that became, you know, such a thing, I thought,
I thought that was a stupid policy.
Right?
I thought that was a very dumb policy.
But I also want to say, I also want to say, because I think people don't realize it, that
was a meta-only policy.
Twitter didn't do that.
YouTube didn't do that.
Only meta had that policy, and they had it for three months, right?
So it sounds like a thing that was, you know, in place for two years, and it actually
wasn't.
So when you actually look at, and I encourage anybody to do this, because the one thing is
the policy documents are there.
You can go look at them, right?
That aspect of the transparency thing is there.
What you see from Elon, though, is that he borrows the moral weight of the word censorship while emptying it of moral content.
And that's why I think...
Bars!
Bars!
Oh, man!
Renee!
Lay it down.
That's exactly right.
No, but what killed me...
Casey and I were on Kara's pod together.
When Elon started arguing that his AI had the right to nudify children, right?
And that if you said that his AI didn't have the right to nudify children, I just want to say that
again, that you were censoring him, that that was an act of censorship.
And at the same time, he in Turkey just took down the opposition completely under the guise of like,
hey, that's the law that they have. So I just have to follow that.
What are you going to do?
Yeah. It's all nonsense.
So it becomes a shield, right? It becomes a mental stop word, where the minute you say that
word, people hear it and they stop thinking about what it is he's actually justifying with that word.
He used it to justify the nudification of children, the non-consensual
nudification of women. Now that's based on his AI or something, right? Yes. An AI model that
did that. Yes. And so if that kind of, you know, if we say that moderation of that kind of content is
censorship, then that concept has just lost all meaning, right? And that's where you can't, you can't cry free speech,
absolutism and and make that claim, in my opinion. I also want to get into, there's a distinction here,
and I think it's a really important one. There's a difference between free speech and algorithmic speech.
Algorithmic speech is ultra processed. And I generally do the distinction of, you know, Twitter speech is
free speech in the way that Doritos are food. Like, it's not really. It's processed. You know,
that free speech isn't, we let everybody know when you tweet, and we
incentivize them to hostility and outrage, and we monetize their ability to argue.
And the algorithm is not free speech.
It's just not.
It puts it through an opaque process that elevates certain speech,
speech that you deem more important, or speech that your business model deems more
monetarily beneficial.
And so how do we draw the distinction between this idea of,
free speech and the algorithmic speech, which is a perversion of speech.
Well, I think it's also important for people to understand that on a social media platform,
the First Amendment right is the platform's.
It is not the users'.
It is the platform's First Amendment right to decide what it editorially curates.
That's what algorithmic curation is.
This has been reinforced legally over and over and over again.
This is why, ironically, the conservatives lose their must-carry law cases, right?
That is why platforms are allowed to take things down.
And on the flip side, it is why, you know, again, it allows platforms to leave everything up or take everything down.
It allows them to pendulum swing in accordance with new leadership coming in, right?
Because the First Amendment right on a private platform belongs to the platform that is to the company that is making the editorial curation decision.
It doesn't belong to the user.
And this is a thing that frustrates a lot of people.
This is where you, you know, you hear the complaints that the platform is censoring me.
In reality, the platform is deciding what to uprank, what to downrank and how to set the policies.
And by the way, if Elon Musk, if, you know, if somebody came after him for, you know, whatever it is, downvoting something or constricting their thing or making those decisions, the Republicans would be the first ones to say, hey, that's his, that's his platform.
That's his First Amendment right.
Yeah.
This was why they started, to avoid regulation, right, to have these
self-regulatory mechanisms. That's what all those councils and, you know, the periodic outreach to
academics, the periodic outreach to government, that's what all of those things were. It was the,
hey, you guys, you know, you can weigh in, you can give us some feedback so that it doesn't look like
we're making these decisions unilaterally. And then in a way, you could argue that for, you know,
for a time, I think that they were trying to be good citizens, maybe. Maybe I'm giving them too much
credit. No, they've been unleashed with what they consider animal spirits now, I would assume.
Casey, what were you going to say?
You know, the thing that I just want people to remember whenever they're looking at a feed,
whether it is X, Instagram, anything that is ranked in this way, you are staring at something
that has been engineered to hypnotize you.
And what, you know what I mean?
And what hypnotizes you?
Conflict, outrage, weird stuff, sexy stuff, stuff that's going to produce a really strong
emotional response.
So what I try to train myself to do, and I struggle with this too, is like, when I see something,
online that makes me feel a very strong emotion, that is the moment that I'm trying to be the
most skeptical. That is what I'm saying, wait, who is posting this? Why are they posting it?
What are they trying to get me to feel? And that is the kind of core tension that you're just
always going to experience when you're looking at an app like this. So as we as we sort of break it down
there, you know, the Republican focus is on this so-called Save America Act, which is going to
safeguard our elections. But, you know, our premise is that the real threat comes from this
algorithmic manipulation of our speech combined with the unceasing amount of money that can be thrown
into the pot by these new gilded age, whatever they are, you know, robber barons. How do we find our
way? Is it, you know, if we take the analogy of algorithms to processed food, is there an ingredients
list for speech? How do we label that? You know, community notes, I think, does a very nice job. I actually
agree with that. Yeah. But I also think community notes is still weaponized politically. I would like to
see a community note for good faith and bad faith. I'd like to see some kind of good faith,
bad faith, like the way they do in restaurants in New York City. If you see a C in the window,
you are not eating their soup.
What are some of the, are there tools that can be in this arena for us?
I mean, to me, I think the sort of fast food analogy is a really good one.
But like the solution to McDonald's is not like be really careful inside the McDonald's
and always try to order the salad.
It's don't go to McDonald's too much.
And I think we need a slow food movement for the media.
The good news is I think we already have one.
I think podcasts are actually a pillar
of this slow food movement.
When you hear three people talking about something for an hour,
you're probably going to get a richer and more nuanced picture.
You really haven't listened to this podcast much, have you?
That is not my milieu.
I think you're doing a lot of good here, John.
But newsletters are also part of this, right?
Not every newsletter, but I think there's a lot of really smart people
that are just kind of like sharing their thoughts in this like very long form way.
So I just think we need to find other strategies like that.
The strategy is not what will make Instagram better because I just think that's like probably a losing game.
Oh, that's interesting. Renée, do you agree that that's, you know, the idea of kind of urging them to become better citizens is not going to bear any fruit?
I think people have been trying to do it for a decade now.
Sure.
I feel a little bit discouraged on that front too.
You know, I have kids.
My oldest is 12 and YouTube is a...
Yeah, yeah, yeah, exactly.
No, no, we really are.
It's not, I feel like the cusp is like fourth grade now.
It's like nine.
Oh, yeah, you're probably right.
He's not on social media, but, you know, you see these kids, they realize that, like,
they can turn Google Docs into a chat app because everybody's on a Chromebook.
And then they can take a YouTube link, throw it into Google Docs, and it'll play embedded in Google Docs.
And you're, like, fighting with your kid to not watch.
They're so far ahead of us.
You're like, trying to make your kid, like, maybe you should pay attention in math instead of watching some degenerate streamer, right?
You know, with kids, you always feel like you're the Iranian regime and they're a VPN.
And they're just getting around everything.
Oh, they really do.
The reason I was thinking about this is Casey was talking, though, is I just, you know,
you try to emphasize, like, make good decisions, right?
The, you know, look, I'm not going to, I can't, I can't keep him off it, right?
I can say, like, I'm not going to let you have a social media account.
I understand you're going to watch YouTube.
Let me explain to you where some of this stuff comes from.
Because a lot of, a lot of what's really interesting with middle schoolers is like,
meme culture for us in 2016 is so normalized for them.
A lot of the stuff, even like manosphere content, it's just so in the water at this point.
It's just there.
Does that help it lose its effect in some respects?
Like when it first comes out, it's novel.
Like, look, this is a relatively new form of communication and television and radio created disruption.
Hell, the printing press created disruption.
Everybody thinks, oh, the printing press happened.
And that ushered in the Enlightenment.
It really didn't.
There were 200 years of, like, killing people and, you know, burning witches.
Are we in a period of adjustment where your kids won't be affected in the same way because it's native to them?
I think they, I think that they know that it's not good. I think that, you know, for a lot of them, they don't want social media. They're not looking for it. I used to hear this. We would have high school students over to Stanford actually a fair bit. And there would actually be a lot of high school students who are like, I just don't want to be on it.
I just don't see the point.
I don't feel good about myself when I spend hours scrolling a feed, and so I don't do it anymore.
And so you do hear a little bit of that.
And nothing is more humbling than having your five-year-old say, is that a no, Mom, or is that a "you're distracted on your phone" no?
Right?
How is it that they always see right through us?
How is it?
I hate that about them.
Right.
Yeah, my kids are 12, 9, and 5.
And the 5-year-old recognizes that sometimes she's getting a distraction response as opposed to an actual response. And they will absolutely say it to you because, you know, they recognize what's happening. And it's a very humbling moment, you know, realizing that you were just as addicted, probably more so than they are. So I think that is, is it normalized?
I think I struggle with how do I, knowing what I know, I mean, this is, you know, this is my job
is to look at the worst stuff on the internet. But how do you keep your kids away from it?
Or at least keep it from penetrating.
But I think, and I want to ask you, you know, Casey, you talk about sort of that relating it to the slow food movement.
And I've seen things like that or farm to table.
And they always become New York Times style section columns, but they never actually become ubiquitous within the culture.
And I wonder, are we thinking about it the wrong way?
Because in some respects, this is a battle.
And what I see a lot on the left is we got that guy's given misinformation.
So we've got to stop that or we've got to take that guy out and I always view it very differently, which is no, you fight information with information and you have to fight it as tenaciously.
You know, I've been locked in battles with that.
You know, I remember when we were trying to do the PACT Act. It was a burn pit bill for veterans.
You would think, you know, who's going to be against that?
But there was a very strong group of weaponized right-wing influencers who spread misinformation about that bill, and we could have gone the route of "you've got to take this down, you've got to put a community note on it." But what we did is we went right at them as tenaciously as we could. Is there a model in that? You know, it's been shocking to me, it's malpractice in terms of social media companies, that nobody has created a viable competitor yet for Twitter. Like, that blows my mind. And people are trying, right? Like, Bluesky is trying, Threads is trying.
threads is trying. I think these networks get really entrenched and they're difficult to disrupt.
You know, what you just said, it made me think of like what Gavin Newsom is doing. I thought that's been
very effective. Yeah. I mean, I think people have like a lot of different feelings about it,
but you cannot deny that the guy is like down in the trenches and he's fighting
the fight on the terms that Trump has created. And I do think it is worth, you know, someone doing that to
kind of see what happens. I wrote this article. I think it was, I wrote it actually, I mean,
candidly, it was after Stanford shut down the internet observatory and I was pissed. But it was a,
it was an article basically saying, like, you have to fight. I mean, that was the point, right?
It was, there was such a capitulation on that front. What happened when Stephen Miller sued us and all
the subpoenas came down? Oh, yeah, right. I forgot about that. And all the hubbub, I forgot that the
undead has filed a lawsuit against Renée. Talk to us a little bit about what? Why?
Well, it's pending litigation, so I can't go into the details. But it's, I mean, the basics I can cover,
which basically it's the same lawsuit. It alleges that we, so they found a plaintiff,
somebody we'd never heard of, never, you know, never talked about, but she lives in that district.
So that's how that's how to get their judge, right? But we, they alleged that in our communication with
the platforms, um, when we said, for example, Gateway Pundit wrote a false story, or if we tagged it. Gateway Pundit wrote a lot of false stories in the 2020 election. I'm just going to say that again. Kind of Gateway Pundit's thing. That's absolutely, yeah. But they said that that was us acting as, like, you know, basically
de facto agents of the government, right? That we were, they alleged that DHS was secretly
puppet mastering us to do it. And so we were violating the civil rights of the people that we talked
about or that we wrote about or that we flagged or that we talked to platforms about. I mean,
this is going to get dismissed eventually. It is such a stupid theory. It makes no sense at all. But the point is to tie you up in legal bills for
three years and to shut you up because, again, like I said, I can't really go into the details of it.
Just out of curiosity, though, I guarantee you Stephen Miller and DHS are in touch with those platforms
directly. 100%. Yes, absolutely. They are. They don't even hide it. So wouldn't that be somewhat of a
defense? Well, Jon, you're assuming that, like, hypocrisy matters. I mean, again, like you said before, right, eventually the courts come down in the realm of reality. But in the
meantime, the way that the, the way that the institutions, this is the point I'm trying to make,
the way that the institutions decide whether to persist is in the court of public opinion.
Did they feel pressure? Do they feel like you're going to constantly get, you know,
are they going to be constantly sued? Are they going to be constantly harassed? Are people going
to be constantly complaining about the university? Is there going to be reputational harm? Is there going to be reputational damage? And that is what ultimately makes a lot of these institutions decide
that the best thing for them to do is to shut up and say nothing. It is the wrong approach. It has been
the wrong approach for several years now. But making them understand that you have to fight in the
arena is a foundationally different shift because they're hearing from their PR people who were
trained in the 1990s era of crisis comms or comms that you shut up and you let the media cycle
move past. And they don't understand that there's no such thing as like the media cycle
in the age of social media, when they can just, you know, kick it back up again once they've made you a
character in a cinematic universe. And there's no such thing as a frictionless existence. This idea that
they think if they just make themselves small enough, they can live in a frictionless existence is
ridiculous because what you end up making yourself is a tasteless pablum so inoffensive that it serves no one
and does nothing. Casey, maybe you have a sense of this. How is it that, and I don't think there's any question about this, that a bioengineered product, designed to escape whatever it is that is self-protective within the human brain, that is monetizing your attention and your life, that cares about nothing, that is a parasite, that while it can be viable and good, it is nefarious in its intent and monetizing in its intent, how the hell does a product that powerful, that dangerous, go utterly unregulated for any of its effects?
And how do these companies escape any liability?
It's a great question.
And I think there has been a lot of progress lately, actually, in trying to shift that discussion.
I would say that prior to 2020, we didn't really think of these products in the same way that you would think about other regulated goods like alcohol or tobacco.
And people tried to pressure them to make different sort of content moderation decisions.
And well, maybe if the algorithm worked a little bit differently, we would all be happy again.
But you fast forward to today and what people are saying is like, actually your product is just broken.
Actually, it's not safe for anyone to use if they're under 16.
It's probably not safe for us either, but let's at least start with the children.
And so you look around the world and country after country is now saying, we're actually just going to ban this stuff until you turn 16.
because again, while it's probably like very bad for adults, we are increasingly confident that
it's bad for children. And so I just think that's like kind of the start. And, you know, maybe like,
maybe this is just cope. But part of me wants to think that in the same way that like banning children
from smoking eventually led a lot of adults to stop smoking too, I wonder if we're not going to
see something similar for social media. Right. No, I think that's that's quite possible. Is there
a warning label, Renée, that we could possibly come up with? This is your brain on Facebook.
The surgeon general advocated for that. People were trying that. This was actually an idea that was making the rounds for a while, because, again, I feel like the Center for Humane Tech had some ideas around, like, switching your phone to grayscale so it was less appealing. I mean, I think there are actually studies showing that, I think there's actually some scientific basis to that. Yeah. Like, it's just less appealing. Well, Renée, I switched my face to grayscale.
And it's clearly less appealing.
Well, guys, I very much appreciate you taking a time.
This has been absolutely fascinating and really helpful in understanding it.
Renée DiResta, Research Professor at Georgetown's McCourt School of Public Policy,
and also the author of Invisible Rulers:
The People Who Turn Lies into Reality.
And Casey Newton, he's got Platformer and, of course, co-hosts Hard Fork.
So, guys, thank you so much for being here.
Thank you for having us.
Thank you, Jon.
It's great to be here.
Man, I enjoy a nice expert panel.
Those guys know their shit.
I must apologize to all of you, our wonderful panel, Lauren Walker,
Brittany Mehmedovic, Gillian Spear, will not be joining me today.
We had what can only be described as a technical malfunction,
which, as many of you know, is also my nickname.
So we weren't quite able to work it all through, I'm going to say,
the cloud, even though I know I'm pulling that out of my ass.
But really appreciate it.
And they will obviously be back next week.
And I want to shout them out again because of their work, giving me the information to allow me to have a cogent, coherent conversation with people who are expert in their field.
So, lead producer Lauren Walker, producer Brittany Mehmedovic, producer Gillian Spear, video editor and engineer Rob Vitolo, who did yeoman's work even getting this thing done, audio editor and engineer Nicole Boyce. There, I don't know, I think they're pulling all-nighters just to get this thing out by Wednesday. And our executive producers Chris McShane and Caity Gray. We will see you guys next week.
Bye-bye.
The Weekly Show with Jon Stewart is a Comedy Central podcast. It's produced by Paramount Audio and Busboy Productions.
