Tangle - Section 230 hits the Supreme Court.
Episode Date: February 23, 2023

On Tuesday, the Supreme Court heard arguments in Gonzalez v. Google, a case that could change the liability social media companies have for the content published on their platforms.

You can read today's podcast here, today's "Under the Radar" story here, and today's "Have a nice day" story here.

Today's clickables: Quick Hits (1:41), Today's Story (3:16), Those against changing (6:52), Those in favor of changing (11:26), Isaac's Take (16:46), Your Questions Answered (19:43), Under the Radar (21:53), Numbers (22:38), Have A Nice Day (23:09)

You can subscribe to Tangle by clicking here or drop something in our tip jar by clicking here.

Our podcast is written by Isaac Saul and edited by Zosha Warpeha. Music for the podcast was produced by Diet 75. Our newsletter is edited by Bailey Saul, Sean Brady, Ari Weitzman, and produced in conjunction with Tangle's social media manager Magdalena Bokowa, who also created our logo.

--- Send in a voice message: https://podcasters.spotify.com/pod/show/tanglenews/message

Hosted on Acast. See acast.com/privacy for more information.
Transcript
From executive producer Isaac Saul, this is Tangle.
Good morning, good afternoon, and good evening, and welcome to the Tangle Podcast,
the place we get views from across the political spectrum. Some independent thinking without all
that hysterical nonsense you find everywhere else. I'm your host, Isaac Saul, and on today's episode,
we are going to be talking about Section 230 and the Supreme Court case dealing with it,
where oral arguments took place earlier this week.
Before we jump in, though, one quick note. Tomorrow, I'm going to be covering the Fox News
Dominion voting systems lawsuit and some of the revelations from that lawsuit, including
some pretty interesting text messages that were exchanged between Fox News hosts during the January 6th craziness around
the riots at the Capitol. It's going to be a pretty interesting story, and it's obviously
one that's important to me because I think it's about media bias. It's also about election fraud
and allegations of election fraud and all that stuff tied up into one. So if you want to get
that edition, you need to subscribe. Please go to readtangle.com and become a member. Memberships are cheap. They are just $50 a year, which is $4.16
a month. Or you can sign up for our monthly rate, which is just $5 a month.
All right, with that out of the way, we'll jump in with our quick hits today.
First up, China's Xi Jinping will visit Russia, according to Vladimir Putin,
raising concerns that Beijing may begin providing military support for the invasion of Ukraine.
Number two, Senator Jon Tester, the Democrat from Montana,
said that he will seek re-election for a fourth term in Congress.
Number three, Democrat Jennifer McClellan won a special election in Virginia to fill Representative Donald McEachin's seat after he died last year.
She is the first Black congresswoman from the state.
Number four, Israeli forces carried out a raid in the West Bank yesterday,
which set off clashes that killed 10 Palestinians, including six militants.
Israel was seeking to arrest three
Palestinian militants who were allegedly planning attacks. Number five, the FDA proposed a new rule
that would allow plant-based milk alternatives to continue using the word milk in their labeling.
News alert now on the Supreme Court hearing on Section 230, which has potentially revolutionary implications for the internet as we know it. The Supreme Court is hearing arguments today in a case that could completely reshape the internet. Gonzalez versus Google addresses whether or not tech companies should be held legally liable for harmful content promoted by their algorithms. The Supreme Court hearing arguments in a landmark case that defines how content platforms handle user content. At stake: how much platforms can moderate users' posts and whether they should be held accountable.
On Tuesday, the Supreme Court heard oral arguments
in Gonzalez v. Google, a case that could change the liability rules for social media companies
relative to the content that is published on their platforms. Section 230 is a part of the
Communications Decency Act, passed in 1996, that helps govern liability on the internet.
Often referred to as "the 26 words that created the internet," Section 230 says this:
no provider or user of an interactive computer service shall be treated as the publisher or
speaker of any information provided by another information content provider. Practically
speaking, this means platforms like YouTube, Facebook, and Twitter are not liable for what
someone publishes on their platforms in the same way that the New York Times would be liable for what a journalist writes in its paper. There are some exceptions to Section 230.
Illegal sex trafficking, violations of federal criminal law, or copyright violations all confer
upon the platforms a higher degree of liability to monitor and remove offending content.
So what's happening right now? Well, the family of an American who was killed in a 2015 terrorist attack in Paris is suing Google,
saying that, through its subsidiary, YouTube,
Google violated the anti-terrorism law by promoting ISIS's videos through its recommendations algorithm,
which helped increase recruitment and led to the death of their family member, Nohemi Gonzalez.
Lower courts have ruled in favor of Google,
saying Section 230 protects them from being liable for the third-party content posted on its service. For different reasons, during
oral arguments, nearly all the justices seemed skeptical of changing the law. Justice Clarence
Thomas suggested recommendations were protected by Section 230, so long as the algorithm was
treating content similarly. Justices Elena Kagan and Brett Kavanaugh suggested that if Section 230
was not the best way to govern the internet, Congress should make a change, not the Supreme Court.
These are not, like, the nine greatest experts on the internet, Kagan said in a comment that
drew headlines.
Ketanji Brown Jackson was the only justice who appeared interested in tearing down the
law, but even her line of questioning suggested she was not convinced by the arguments the
Gonzalez family's lawyer was making. Once again, the argument shot life into a debate about what should happen to
Section 230. On Wednesday, the court is hearing a second case, Twitter v. Taamneh, which touches
on related issues but does not directly concern Section 230. Both conservative politicians and some
progressive activists oppose Section 230 for different reasons, while many people revere it
as the law that helped create the internet as we know it. Since this issue does not fall down traditional
partisan lines today, we're going to break down our arguments based on those who support Section
230 as it is and those who are against changing Section 230.
Many argue that the Supreme Court striking down parts of Section 230 would destroy the
internet as we know it, worsening the functionality of the most popular platforms. Some praise the majority of justices who seem to understand this is a problem
for Congress. Others argue that removing Section 230 would lead to more online censorship, not less,
which everyone should be wary of. In Slate, Mark Joseph Stern said Brett Kavanaugh just made the
best argument for saving the internet. The plaintiffs have zero evidence that any of the
Paris terrorists saw these suggestions. They simply speculated that users may have been radicalized
into joining ISIS because of YouTube's algorithmic recommendations. At this point, you might ask,
if there's no proof the algorithm played any role in radicalizing or inciting the terrorists,
why did the plaintiffs mention it at all? Why not sue over the mere existence of ISIS recruitment
videos on YouTube, which is the true gravamen of the complaint anyway?
That's where Section 230 comes in, Stern said.
The law, passed in 1996, generally bars lawsuits against a website for hosting other people's
expression, even if that expression is harmful and illegal.
Section 230 expressly protects any, quote, interactive computer service, end quote,
that chooses to, quote, filter, screen, or organize content, end quote. Filtering and organizing content, of course, is precisely what algorithms do.
It makes no sense to claim that a website simultaneously gains and loses immunity by
organizing speech, Stern said. As Justice Brett Kavanaugh explained, it would mean that the very
thing that makes the websites an interactive computer service also means that it loses the
very protection of Section 230.
And just as a textual and structural matter,
we don't usually read a statute to, in essence, defeat itself.
It was also Kavanaugh who delivered a remarkable defense of the law as it's read today.
Why not let Congress take a look at this
and try to fashion something along the lines of what you're saying?
Or, as he put it later,
isn't it better to keep it the way it is and put the burden on Congress to change that? In National Review, Bobby Miller said reforming the law would
have dire consequences for the right. Conservatives have long claimed, often rightfully so, that big
tech is silencing their voices, Miller said. As a remedy, they have sought an overhaul of Section 230
of the Communications Decency Act of 1996, the foundation of the modern internet. But experts are warning
that attempts to persuade the Supreme Court to roll back the liability protections in Section 230,
enjoyed by the internet platforms, are ill-advised. The petitioners in Gonzalez and Taamneh argue that
websites can be held liable for the algorithms they use to curate and present content. Shane
Tews, a senior fellow at AEI, disagrees. Narrowing Section 230 will create an appetite for greater censorship, resulting in more, not less, content moderation.
More moderation means more curtailing access to information and freedom of expression.
Tews is correct.
If Section 230 is circumscribed in either Gonzalez or Taamneh, then social media companies will censor conservative speech more, not less, Miller said. If Silicon Valley knows
it can be held liable for algorithms that promote speech with even the slightest intonation of
incitement or defamation, the tech firms already predisposed to fear conservative views will
invariably clamp down. Those pushing for Section 230 reform ought to tread lightly. In Vox, Ian
Millhiser praised the Supreme Court for understanding it could break the internet.
Gonzalez v. Google, the case heard today, could subject social media websites and even search engines to ruinous liability, potentially forcing these companies to abandon their
business models or even shut down, he said. That said, most of the justices appeared
sufficiently spooked by the possibility they could destroy how the modern-day internet operates that
they are likely to find a way to prevent that outcome. As Justice Elena Kagan warned at one point during the Gonzalez argument,
the justices are not the nine greatest experts on the internet, so it makes sense for them to
approach a case that could fundamentally change how foundational websites operate with a high
degree of humility. The potential consequences of this legal theory are breathtaking, Millhiser
wrote. If Twitter, YouTube, or Facebook
may be held liable for any content that is served to users by one of their algorithms,
then these websites may need to dismantle the algorithms that make it possible for users to
sort through the billions of videos, tweets, and other content published on these websites.
The Gonzalez case itself, for example, claims that Google should be liable because YouTube's
algorithm, which Google owns, sometimes served up ISIS recruitment videos to some users. And thus, Google is legally
responsible for the ISIS-led attacks that killed American citizens and their relatives.
This same theory could hamstring search engines, too.
All right, that is it for the voices who are against changing it.
And this brings us to the voices who are for changing it.
Many argue that the Internet's most destructive elements are proliferating because of Section 230.
Some make the case that big tech companies have almost total immunity for their actions as long as Section 230 exists as it does. Others argue that Section 230 should be reformed, but that reform would be best done not by the courts but by Congress. In the New York Times, Julia Angwin said it's time to
tear up big tech's get-out-of-jail-free card. The law, created when the number of websites could be
counted in the thousands, was designed to protect early internet companies from libel lawsuits when
their users inevitably slandered one another on online bulletin boards and chat rooms, Angwin wrote.
But since then, as the technology evolved to billions of websites and services that are
essential to our daily lives, courts and corporations have expanded it into an all-purpose
legal shield that has acted similarly to the qualified immunity doctrine that often protects
police officers from liability, even for violence and killing.
As a journalist who has been covering the harms inflicted by
technology for decades, I've watched how tech companies wield Section 230 to protect themselves
against the wide array of allegations, including facilitating
deadly drug sales, sexual harassment, illegal arms sales, and human trafficking, behavior that
they would have likely been held liable for in an offline context. There is a way to keep internet
content freewheeling while revoking tech's get-out-of-jail-free card, drawing a distinction
between speech and conduct. In this scenario, companies could continue to have immunity for the defamation cases that Congress intended, but they would be liable for illegal
conduct that their technology enables, Angwin said. Courts have already been heading in this
direction by rejecting the use of Section 230 in a case where Snapchat was held liable for its
design of a speed filter that encouraged three teenage boys to drive incredibly fast in the
hopes of receiving a virtual reward.
They crashed into a tree and died.
Drawing a distinction between speech and conduct
seems like a reasonable step toward forcing big tech to do something
when algorithms can be proved to be illegally violating civil rights,
product safety, anti-terrorism, and other important laws.
In Newsweek, Theo Wold made the case that big tech's immunity,
thanks to Section 230, is way too far-reaching. Big social media has been wielding Section 230 in court this way for
years, and often successfully. When parents have sued Meta after their teenage daughters developed
eating disorders promoted by Instagram's algorithm, or when parents have sued TikTok
after their children died attempting dangerous viral challenges the app's videos promoted,
Big social media has asserted Section 230 as a purported affirmative liability shield, Wold wrote.
Indeed, Big Social Media invokes Section 230 not merely to rebuff these plaintiffs' claims,
but to prevent courts from hearing the cases at all. Because Section 230 conveys immunity,
a plaintiff cannot even try to hold Big Social Media accountable for the harm its algorithms
cause to consumers. This expansive view of Section 230 should alarm all Americans because
it represents an industry asking for the unthinkable, complete immunity from civil
liability for harms caused by the very core of its profit-generating business, he said.
It is comparable to Exxon seeking legal immunity for all of its drilling-related activities,
or Southwest Airlines
seeking immunity for all activities related to air travel, or Ford seeking immunity from claims
related to the manufacture of any wheeled vehicle. In each of these hypothetical scenarios, such a
sweeping immunity would perversely incentivize the underlying company to seek profits no matter
the human cost. What big social media seeks is no different. In the Washington Post, Henry Olson said,
regardless of the court's ruling, Congress needs to take action to limit Section 230's reach.
Section 230 is clearly right in one sense. No one would reasonably suggest that the U.S.
Postal Service or a phone company should be held liable for defamatory letters or phone calls.
Internet companies should have similar protections when they act in similarly passive manners. The problem, however, is that many tech companies do not act passively, Olson said.
Instead, they push content to users with algorithms, arguably acting more like a
traditional publisher, which is liable for defamatory content it chooses to print,
than merely a system operator. They have also exposed children to pornography and online bullies,
driving countless teenagers to depression and suicide. Add in the known use of social media by malign state actors
and terrorists to spread disinformation and radicalizing content, and it becomes clear that
the internet is no bed of roses. The case against Google will not definitively solve this problem.
Even if the court rules against the company, it would merely begin to weaken the legal protections
that shield big tech from paying for the damage it facilitates.
The real solution must ultimately come from politics, Olson wrote. Congress and the president
will have to create regulatory frameworks that dramatically reduce the harm created by online
companies, much as the Clean Air Act significantly cut down air pollutants. That new framework can
take many paths, depending on what harms are considered most unacceptable and which are most expensive to enact. Limiting children's
unsupervised internet access, as Senator Josh Hawley, the Republican from Missouri, has proposed,
seems to be a case of high-impact, low-cost regulation. Efforts to minimize the ability
of terrorists to share provocative or inflammatory material will be much costlier to implement.
All right, so that is it for the people who are for and against Section 230 reform,
which brings us to my take. So in the Gonzalez case, I think my position is rather straightforward. The court should stay out of the way. Section 230 is quite clear as it's written, and it seems pretty obvious
to me that the algorithms big tech platforms use to serve up content are protected in part by
Section 230. Most of the justices seem to view it this way, and I don't think we're going to see
this case upend the law or how Section 230 functions.
That doesn't mean all is well, though. The devil's bargain of this arrangement is that I can use YouTube to look up how to fix a headlight on my 2006 Honda CR-V, and I can also use it to look
up how to make a homemade bomb. Or, more precisely, something like YouTube's algorithm could send me
from rather innocuous searches about combustion engines to more dangerous videos about bomb making, which feels like something slightly different than the question of what
people are posting on the platform and that platform's responsibility for it. It will probably
surprise nobody that I broadly support Section 230 and the rules it creates. Platforms like YouTube,
Facebook, and Twitter allow for free-flowing information sharing and make it easy for
non-institutional voices to have an
impact on the world, especially in the political space. If platforms were held liable for the
things their users publish, the platforms would grind this free flow of information to a halt
in order to focus more on moderation, and this would fundamentally harm the robust information
ecosystem the globe currently enjoys. But the line is incredibly difficult to draw.
Ian Millhiser offered a good analogy.
If a bookstore organizes its tables by topics and stacks all the sports books together,
this would essentially function the same way algorithms do for users who clearly like sports.
The question is when that organization goes from benign to dangerous. This complexity is why the
court, with zero experts on technology or the core issues at hand, should have no role in drawing that line.
Which brings us to Congress.
The fact that a Supreme Court ruling here could be disastrous does not mean there is no room
to do something about the liability these companies should have.
If companies are elevating illegal content, terrorism recruitment videos, for example,
it's only reasonable to expect some kind of liability.
Congress could carefully craft legislation that opens the doors to sue companies like Google if a plaintiff can prove a connection
between a criminal act and the promotion of criminal behavior by the platform. In Gonzalez,
no such explicit connection exists, but if it did, we'd benefit from there being a means for
the family to hold Google accountable. Again, though, that language is not something the nine
justices on the Supreme Court
should try to suss out.
Kudos to them for a rare bit of humility on this issue.
I'm hopeful that if or when they rule in favor of Google,
members of Congress will continue working
to draft legislation that limits the broad immunity
these companies have and offers some kind of recourse
when they make cataclysmic mistakes.
All right, that is it for my take,
which brings us to your questions answered. Richard from Shelton, Washington said,
what is your feeling on Speaker Kevin McCarthy releasing the videotapes to only Fox News?
So for anyone who missed the story on Fox News this week, host Tucker Carlson told viewers that House Speaker
Kevin McCarthy had given him thousands of hours of security footage from January 6th. Presumably,
Carlson is going to do a segment about what the footage actually shows. Democrats have criticized
McCarthy for the move, saying the footage contains closely held secrets about how members of Congress
protect themselves during an attack. There's one specific reason I'm intrigued by this move,
the counter-narrative. Because this footage was so closely held by the January 6th committee,
we only saw clips that were given to us through the filter of the committee's effort to prosecute
rioters. I'm sure there are some interesting things in there that never got any news coverage,
and knowing Carlson's team, they're sure to find them. But for basically every other reason,
I have a bad feeling about it. For one, as we'll
discuss in tomorrow's Friday edition, recent revelations about Carlson and Fox News show
demonstrably that they are not being honest with their viewers, especially about issues like
January 6th, which Carlson has already claimed was a false flag operation (it wasn't), and the 2020
election. So I certainly won't take whatever he releases at face value, nor should anyone take
anything the nightly networks do at face value anymore. Second, I think the security concerns
are real. Again, we don't know a lot about what footage was turned over, but it's not hard to
imagine why giving 40,000 hours of footage to a bunch of staffers at Fox News might be a bad idea.
If any of it leaks online or if it is published in full somewhere, it would give anyone
who wants a detailed look at how members of Congress are evacuated and protected, which
seems pretty dangerous. Third, I'll be keeping an eye on what McCarthy does next. He promised
the footage to Carlson and is fulfilling that promise, but he also said he would hand it over
to news organizations more widely once Carlson gets his exclusive. If he does that and the footage
becomes widely
available to more news organizations, I'd feel slightly better about it,
though still a little bit uneasy about so much of it existing out there.
All right, that is it for our reader question of the day, which brings us to our under the
radar section. The United States is hoping to create two semiconductor chip manufacturing clusters by 2030 in an effort to bring more chip manufacturing back to the United
States. Commerce Secretary Gina Raimondo said the United States will target the $53 billion
CHIPS Act to bring together research labs, fabrication plants, and packaging facilities
for the assembly of chips. It's unclear where the clusters would be located, but the Wall Street
Journal reports that Arizona, Ohio, and Texas are at the top of the list. U.S. and foreign
manufacturers have already unveiled more than 40 projects for a total investment of close to $200
billion. The Wall Street Journal has the story, and there's a link to it in today's episode
description. All right, next up is our numbers section. The number of active users on YouTube as of 2023
is 2.6 billion. The number of hours of content uploaded to YouTube every day is 720,000.
The hours of video watched by YouTube users every single day is 1 billion. The number of active
users on Facebook as of 2023 is 2.96 billion. The number
of photos uploaded to Facebook every day is 350 million. The number of Facebook messages sent each
day is 10 billion. All right, and last but not least, our have a nice day section. A solo Atlantic
rower has set a new world record. East Yorkshire resident Miriam Payne,
23 years old, rowed from the Canary Islands to Antigua faster than any woman in history.
Her time of 59 days, 16 hours, and 36 minutes is a new world record. Payne said she was absolutely
knackered by the experience, and she was so tired that she just wanted to get to the end so she
could stop rowing. In order to qualify for the record, Payne had to complete the entire trip totally by herself,
making all her own repairs to her boat, Seas the Day,
and she helped raise money for mental health charities in East Yorkshire along the way.
The 3,000-mile race, about the width of the United States,
is considered one of the hardest rows in the world.
BBC News has the story, and there is a link to it in today's episode description.
All right, everybody, that is it for today's podcast. Like I said at the top,
if you want to hear from us tomorrow, please go become a subscriber and you'll get our
edition on Fox News and Dominion. If not, we'll be right back here on Monday. Have a great weekend. Peace. Our podcast is written by me, Isaac Saul, and edited by Zosha Warpeha. Our script
is edited by Sean Brady, Ari Weitzman, and Bailey Saul. Shout out to our interns, Audrey Moorhead
and Watkins Kelly, and our social media manager, Magdalena Bokowa, who created our podcast logo.
Music for the podcast was produced by Diet 75. For more from Tangle, check out our website at readtangle.com.