Pivot - Decoder with Nilay Patel: What's Next for the Controversial 'Child Safety' Internet Bill
Episode Date: September 3, 2024

There's a major internet speech regulation currently making its way through Congress, and it has a really good chance of becoming law. It's called KOSPA: the Kids Online Safety and Privacy Act, which passed in the Senate with overwhelming bipartisan support late last month. At a high level, KOSPA could radically change how tech platforms handle speech in an effort to try and make the internet safer for minors. Nilay Patel talks with Verge senior policy reporter Lauren Feiner, who's been covering these bills for months now, to explain what's happening, what these bills actually do, and what the path forward for this legislation looks like. Listen to more from Decoder from Nilay Patel here. Learn more about your ad choices. Visit podcastchoices.com/adchoices
Transcript
Support for Pivot comes from Virgin Atlantic.
Too many of us are so focused on getting to our destination that we forget to embrace the journey.
Well, when you fly Virgin Atlantic, that memorable trip begins right from the moment you check in.
On board, you'll find everything you need to relax, recharge, or carry on working.
Lie-flat private suites, fast Wi-Fi, hours of entertainment, delicious dining, and warm, welcoming service that's designed around you.
Check out virginatlantic.com for your next trip to London and beyond and see for yourself how traveling for business can always be a pleasure.
As a Fizz member, you can look forward to free data,
big savings on plans,
and having your unused data roll over to the following month.
Every month.
At Fizz, you always get more for your money.
Terms and conditions for our different programs and policies apply.
Details at Fizz.ca.
Hi, it's Kara.
We're off for the holiday today,
but we have an episode for you from Decoder with Nilay Patel.
In this episode, Nilay talks to Verge senior policy reporter Lauren Feiner
to break down the Kids Online Safety and Privacy Act, which is controversial and which recently passed in the Senate.
Enjoy, and we'll see you Friday.
Hello and welcome to Decoder. I'm Nilay Patel, editor-in-chief of The Verge, and Decoder is my show about big ideas and other problems.
We've talked a lot on this show about the various attempts to regulate the Internet in the United States,
and how almost all of them run into the very simple fact that the internet is mostly made up of speech.
The First Amendment prohibits most speech regulations in this country.
Literally, it says, Congress shall make no law abridging the freedom of speech.
That's why we don't have a lot of laws about the internet.
But there's a major internet speech regulation currently making its way through Congress right now,
and it has a really good chance of becoming law. It's called KOSPA, the Kids Online Safety and Privacy Act,
which passed in the Senate late last month with overwhelming bipartisan support.
You've probably heard of KOSPA's predecessor, KOSA, the Kids Online Safety Act. We've been
talking about KOSA for a while now, and in fact we discussed it here on Decoder earlier this year
with Senator Brian Schatz, the Democratic senator from Hawaii who was one of KOSA's co-sponsors.
That got bundled up with another bill called COPPA 2.0,
an update to the Children's Online Privacy Protection Act, and that's how you get KOSPA.
At a broad level, KOSPA is supposed to accomplish two big goals:
better protecting the privacy of minors online,
and making tech platforms more responsible for what those minors see and do.
The first part, COPPA 2.0, is basically a spec bump.
The first COPPA, passed in 1998, made it so websites and now social media apps can't
knowingly have users under the age of 13 on their platforms without their parents' consent.
Of course, that hasn't stopped any kids from using any of these things, and there's
been a host of research and experiences with kids on the internet since. So COPPA 2.0 bumps the age limit to 17 and bans things like showing targeted
ads to minors. This feels relatively straightforward. It's the second part, the KOSA part,
that's been controversial for some time. And it remains controversial even as the bill gathers
momentum. KOSA creates what's called a duty of care for platforms like Meta, Google,
TikTok, and others, effectively making them liable for showing harmful content to kids.
That's a speech regulation, through and through. It dictates what the platforms can show and what
their users can see. And like every speech regulation, that means KOSPA has to get over
First Amendment objections. KOSPA certainly has opponents who are making First Amendment arguments.
But there's a strong argument on the other side that the government's interest in protecting
children is enough to overcome those problems, and that the political power of parents being
worried about the effects of the internet on their children will push KOSPA through.
But KOSPA is far from a done deal. It hasn't passed the House of Representatives, which is
now in recess until September, and House leadership has indicated they may not even consider the bill in its current form.
On top of that, once the bill passes the Senate and the House, it ends up on the President's desk,
and we're in the middle of a very contentious presidential election.
So there's a lot to talk about, and a lot that might change. To help break it all down,
I'm talking to Verge senior policy reporter Lauren Feiner, who's been covering these bills
for months now. She's going to help explain what's going on, what these bills actually
do, and what the path forward for KOSPA looks like. Okay, KOSPA and the child safety debate
on the internet. Here we go. Lauren Feiner, welcome to Decoder.
Thanks for having me.
I would say I'm excited to talk to you, but it's about the First Amendment and regulating speech on the Internet.
So, always dicey, but I think we're going to have fun.
My plan is to have fun.
Let's start with the bills in Congress right now.
There's one called KOSA and there's one called KOSPA. And they are meant,
I think somewhat sincerely, to improve the lives of children on the internet. What are KOSA and KOSPA? KOSA is the Kids Online Safety Act. And that's a bill that was introduced by
Senators Richard Blumenthal and Marsha Blackburn. It's a bipartisan bill that's received an
overwhelming amount of support in the Senate.
It's basically a bill that seeks to create a duty of care for online platforms that kids use so that basically the platforms have to be responsible for making their products safe for kids.
It's meant to safeguard them from things like eating disorder content or other mental health harms that they could
come across on the internet.
So KOSPA is basically a mashup of KOSA and this other kid safety bill called the Children's
Online Privacy Protection Act, COPPA 2.0.
They actually added teens into the title this time.
The original bill was meant to protect kids under 13 years of age. This one is under 17.
So it includes a greater group of kids in this bill. And it also does some additional things,
adding on to the existing protections for kids online, like banning targeted advertising to
that group. There have been a lot of attempts to save the children online. What about these two
bills made them the ones that got mashed up and advanced? KOSA was really taking a different
approach than we've seen before and was trying to do something really overarching. And I think
that was something that was really attractive to lawmakers: they see something like protecting
kids on the internet, and that seems like a really politically good issue.
So I think something interesting KOSA did was just create this duty of care for these
platforms.
So that's something that even if the platforms change the way that they do business in the
next few years, or they change something fundamental about their platform, there's still this underlying responsibility, which I think is really attractive to lawmakers who want some sort
of lasting way to protect kids on the internet. And COPPA 2.0, I think, is, you know, a little
bit more of an update bill, but something that is maybe easy to get through because everyone wants
to protect kids on the internet,
wants to protect their privacy. And I think there's been kind of this updated understanding
among lawmakers that maybe we do need to protect a greater swath of kids online than just under 13.
There have been a lot of organizations advocating for this bill. We should talk about them
for a second. There are a lot of organizations advocating against it. Who are they and what are their specific concerns?
The people advocating against this bill are particularly concerned about KOSA and the way
that it might impact free expression on the internet more broadly. And this is not just for
kids, it's for adults as well. But there's also concerns about how kids may or may not have access
to potentially
really helpful information on the internet. These are organizations like the Electronic Frontier
Foundation, the ACLU, Fight for the Future. And these groups are concerned on the one hand that
kids maybe won't be able to access important resources, especially if they're from marginalized
groups, you know, trans youth or kids who are in
difficult home situations, being able to access potentially life-saving or affirming information
on the internet if platforms decide to take a really broad, censorious approach to taking
content off of their platforms to avoid liability. There's also kind of this greater fear
that, you know, the bill doesn't necessarily mandate age verification, but if platforms feel
like they have to in some way have a better understanding of how old the people using
their platforms are, they might have to implement some verification methods that could be more privacy-invasive,
or maybe just kind of cleanse the kinds of content that they have on their platforms in general,
so that it can't be seen as unsafe for kids. It feels like that argument boils down to if you
force the platforms to make the internet safe for kids, they will just make it safe for kids,
and all the things that adults do will go away. That's definitely a big fear here that, you know, the internet is just going to become
kind of this place that is free of these more controversial discussions or
more controversial topics that maybe do need to be aired in some kind of capacity
to be hashed out. But is that forum going to be the internet moving forward?
If it is not the internet, we would just go back to what, like the 1980s?
It feels like you could still make platforms on the internet that don't have kids on them to talk
about these things. You would just have to do age verification, right? You have to guarantee
that kids couldn't access them. I think it's unclear. We'd have to see how it plays out in the courts, whether you have to really guarantee that kids aren't accessing them or you have to be pretty certain that it's just adults using them.
I think that kind of remains to be seen exactly how that might work.
But I think that is part of the fear here that it's, you know, we're creating an age gated Internet.
And then that fear is basically a First Amendment fear, right? We're
going to prevent everyone from speaking because we don't want them to speak to children. Right,
because if you take something like age verification, it's not just that you have
kids doing age verification to make sure they're not getting onto sites they shouldn't be. You're
having everyone do age verification because how could a website know you're not a kid until you do that?
Are there any mechanisms to do age verification that work?
From what I've seen, there aren't really any completely foolproof ones.
I think there's some that are better than others.
And the ones that are better tend to be a little bit more invasive than maybe people are used to at this point in time.
And, you know, I think maybe it's something where we see what the standard is that people are okay with shifting in the future. You know, are you okay
with having your face scanned to guess your age, if that's something that happens on device and
is deleted? Or are you okay with a digital ID that just is issued by the government or some
central entity and only gives away up or down information about
how old you are based on the birth date on that ID? There are solutions out there, but they could
be considered more privacy invasive than people are used to. And so I think we're going to see
what are people okay with in the kind of reality where these laws exist.
And then the other argument is there are a group of
children who might be exploring their gender or their sexuality in some way or some other topic
that society says is only for adults, and they won't have any access, and they will be blocked
off from their communities. That's a much squishier one, right, than the straightforward,
if you make the internet for children, you will be chilling the speech of adults as well.
It seems like there's a lot of conversation about what kinds of access children should
have to these communities. And there's at least some people who say that access
should be protected because it helps those children grow into themselves.
I think a lot of people would say there probably is some content that
we wouldn't really want kids to access at a certain age. Think about maybe at the most extreme,
like pornography. I think most people would say they don't think kids should be accessing porn
on the internet. But, you know, I think we're talking about something different here when it
comes to resources that could be actively helpful for some kids. And when we look at that, I think
the fear here isn't that the bill says that this information needs to be
taken down or it needs to be kept away from kids, because it actually says that if a kid is
searching out certain information, the platform is allowed to serve them that. And it says that
information that could actually mitigate some of the mental health harms that the bill contemplates,
that's okay. But I think the concern here is that the platforms might go a step further than the bill
actually prescribes in order to avoid liability, because it could just be such a risk to have
something that might be politically charged in this moment be shown to a kid, and then have, you know,
a conservative AG or a conservative FTC come after them and say, this is something that
we don't think is appropriate for kids.
We need to take a quick break.
We'll be right back. Vox Creative.
This is advertiser content from Zelle.
When you picture an online scammer, what do you see?
For the longest time, we have these images of somebody sitting crouched over their computer with a hoodie on, just kind of typing away in the middle of the night.
And honestly, that's not what it is anymore.
That's Ian Mitchell, a banker turned fraud fighter.
These days, online scams look more like crime syndicates than individual con artists.
And they're making bank.
Last year, scammers made off with more than $10 billion.
It's mind-blowing to see the kind of infrastructure that's been built to facilitate scamming at scale. There are hundreds, if not thousands, of scam centers all around the world.
These are very savvy business people. These are organized criminal rings. And so once we
understand the magnitude of this problem, we can protect people better.
One challenge that fraud fighters like Ian face is that scam victims sometimes feel too ashamed to discuss what happened to them.
But Ian says one of our best defenses is simple.
We need to talk to each other.
We need to have those awkward conversations around what do you do if you have text messages you don't recognize?
What do you do if you start getting asked to send information that's more sensitive?
Even my own father fell victim to a, thank goodness, a smaller dollar scam, but he fell victim.
And we have these conversations all the time.
So we are all at risk and we all need to work together to protect each other.
Learn more about how to protect yourself at vox.com/zelle.
And when using digital payment platforms, remember to only send money to people you know and trust.
We're back with Verge senior policy reporter Lauren Feiner.
Before the break, we discussed why some critics have expressed concern about KOSPA.
But there's major bipartisan support for the package, and that momentum is what helped
secure passage in the Senate. So those are the big arguments against KOSPA. And it seems like
those arguments haven't been really persuasive yet. This bill is moving along. What are the
arguments in support of KOSPA? And who has really been making those arguments? The arguments in
support of KOSPA are really that this is a bill that can really fundamentally shift
the burden of protecting kids on the internet onto the platforms where kids are spending their time.
I think there's been this sense from a lot of parents that this is just too much to handle.
There's so many different platforms. There's so much to keep track of,
so many different threats on the internet that maybe kids in the past faced in a certain way,
but the scale of this and the speed at which it moves is just so vast. So I think parents have
really been looking for ways to protect their kids, something that's more all-encompassing.
And I think that's what a lot of parents feel like they found in KOSPA. And I think that's why you're seeing parents really leading the charge here.
You know, we've seen kind of this persistent group of parents who have, in many cases,
lost their children to suicide after cyberbullying or other kinds of harms that they experienced on
the internet, really coming out and being the lead
voices for this bill and saying, you know, this is something that would have protected my child
and that I want it to protect someone else's child. There are some organizations that had
strong opinions against KOSA that seem to have died down over time. Who are they and
what has been done to ameliorate their concerns?
There was a prominent coalition of LGBTQ+ groups, groups like GLAAD and the Human Rights Campaign. These groups used to oppose KOSA, but they've come out and said, we think after the most
recent revision that was earlier this year, that this bill doesn't really put the groups that we are meant to
protect in such grave harm. And they didn't go as far as to endorse the bill, but they said,
you know, the risks have been mitigated enough that we won't stand in the way of it passing.
And I think that was really significant. These concerns, you know, while maybe they do exist
on a certain level, maybe they're somewhat speculative or maybe
they wouldn't come to be in the way that some of these groups that are still opposing the bill
would fear. That said, I think no one really knows what will happen with these bills until
they take effect and until they're challenged in the courts because I think no platform is
going to say, here's exactly how we're going to comply until they see what the bounds are.
And was there something specific in the bill that was changed to reduce those concerns,
or did they just get over it?
There were a few different things that were changed. Things like limiting the ability of
state attorneys general to enforce certain aspects of the bill, like the duty of care,
which kind of ameliorated some fears that maybe
a particularly conservative or politicized AG might use this against content for trans youth,
for example. And then there were things like specifying that this is really talking about
like design features and not content. I think those were things that just like
gave these groups a little bit more of a
sense of ease. The Senate voted 91-3 in favor of KOSPA. We're obviously waiting on what will happen
in the House. And it's an interesting vote because it doesn't feel partisan, right? 91, that's a lot.
And then the three who voted against it are all over the place. It's Ron Wyden and Rand Paul and
Mike Lee. And what's going on there? I think the reasons that these three voted against the bill
are somewhat varied, somewhat overlap, I guess you could say. You know, Ron Wyden was concerned
with, you know, the potential impact on privacy enhancing technologies like encryption, he said,
and anonymity. And he also said he took seriously the concerns from the LGBTQ+ groups that are still concerned that the bill could be used to censor content that could help trans or gay youth.
Mike Lee, meanwhile, sees the bill as something of an overreach. He worries about how the bill uses a definition of mental health or mental disorders that relies on a psychologist group that he says is politicized. And then
Rand Paul is, I think, making a free speech argument that, you know, who's to say what
constitutes mental health harms and, you know, neither side should be happy with how this plays out
because, you know, it could be politicized on either end.
So you've got the classic, I have some concerns about the law, all the way to we shouldn't have
laws from Rand Paul. Is there mostly just consensus from everyone else then?
I think so. I mean, this bill had like 70 co-sponsors. So that's a huge amount.
And that was even before it got this vote on the floor of the Senate.
So I think there's just this huge appetite to do something to protect kids on the internet.
And there's just been this consensus around this bill that really built consistently over
maybe the past year.
It's a weird time in American political history. I would say that feels like an understatement, where in a furious sprint of a sort of reshuffled presidential campaign,
it doesn't seem like anyone knows what's going on. Is this going to make it to the House? Is it
going to get a vote in the House? What happens next? Now that it's been voted on in the Senate,
it's really on the House to figure out what to do next. And I think Speaker Mike Johnson has been open to it. In public statements, he has been kind of vague. But the House is a different beast than the Senate. You have so many more
voices. You have, you know, a lot of maybe different coalitions that will make up the House
that might have more varied feelings on this bill than we saw in the Senate. But at the same time,
when you see something get out of the Senate so overwhelmingly, I think you really have to
take it seriously in the House. If the House does vote to pass the bill, is President Biden going to sign it inside of this last little bit of his term?
Yeah, Biden is encouraging the House to take up this bill.
And he said that he wants the House to send it to his desk and that he would sign it.
Do we know how a potential President Trump or a potential President Harris would respond to this bill?
I don't know that Trump has spoken specifically on this bill,
but we did see Vice President Harris come out in support of KOSPA.
So that's the sort of procedural situation.
Here's the bill. Here's where it's at.
There's a bigger question going on with all of this,
which is it feels like everyone thinks it's time to regulate the Internet.
We've had several generations of people grow up in the free-floating chaos of the Internet we have now.
The issue that has always come up is the Internet is mostly speech.
The First Amendment is very protective of speech in this country, I think, for good reason.
And there's not a lot of ways around it.
The two ways historically have been copyright law, which works to whatever extent. And then it seems like now the answer is, we're going to say we have to protect the kids.
And that will overcome the strict scrutiny test that courts apply when it comes to First Amendment
issues. Is there a compelling government interest? And does this narrowly achieve that interest? I'm assuming everyone in Congress thinks KOSPA
narrowly achieves a compelling government interest here. I think that's correct. The Senate seems
convinced the interest of America's children is compelling enough to overcome any potential
First Amendment issues that this bill might face. And, you know, obviously there's plenty of tech groups
and, you know, these other groups like the ACLU that seem to disagree with that. But I think the
Senate seems pretty convinced that it can overcome those challenges. The other mechanisms for
regulating speech on the internet don't really come up and face those challenges, right? Copyright
law, we've just decided that Disney's interest is enough to overcome the speech interest of a lot of people
on the internet. When it comes to other approaches, there's Section 230, which protects the platforms
from the actions of their users. And one way you can affect speech regulation on the internet
is by making companies like Facebook and Google and TikTok responsible for what their users do.
And that's what a Section 230 carve-out is, right? You're saying Facebook is responsible for a user posting X, and that gets them to moderate X. Is there a way to
achieve the goals of KOSPA using those other mechanisms? I'm assuming not copyright law, but
using something like a Section 230 carve-out? There would be interest from plenty of factions
of Congress to use Section 230 in that way. I think the problem
we've seen with Section 230 reform is that it's really hard to get even the most strident advocates
in favor of Section 230 reform to agree with each other on what that should look like. So I think
KOSPA kind of got around that issue by creating a new bill that could approach this in a different way that I
think was able to rally the kind of political support needed to do that. That said, yeah,
I think in theory it's possible to add carve-outs to Section 230 for certain kinds of harms. I just
think politically that seems like a more difficult path. Are there other approaches to regulating the internet that seem like they might survive
First Amendment scrutiny? Will someone think of the children? That seems good. Like they
got that one. This is actually the argument that Hawaii Senator Brian Schatz made when we had him
on the show back in January. Compelling government interest and all the rest of it. But it's also
the public policy argument, which is like, can we please argue about everything else, but agree that an 11-year-old
shouldn't have their brainstem melted? Is there any other hook that might survive the courts?
Yeah, I think policies that really narrowly target certain design features or, you know, things that are not about the ordering of content or
things that could be argued as editorial discretion of the platforms. I think it might
be possible to regulate those kinds of things. I think, you know, with the Supreme Court weighing
in on Texas and Florida social media laws recently in the NetChoice cases, you know,
that made clear that content moderation and curation is First Amendment-protected speech. That's at least what it
seems like the majority of the justices believe. So I think it's going to be hard to uphold laws
that do seem to touch on those things. But when we're talking about design features or, you know,
maybe there's other ways to get at how platforms go about their business that doesn't touch on their ordering of content or their surfacing of content, that might be still ripe for policymaking.
We need to take another quick break. We'll be right back. We're back with Verge senior policy reporter
Lauren Feiner to discuss how attempts to rein in some forms of harmful speech have run into
similar issues when facing the First Amendment. There are other big problems on the platforms
right now, particularly as they relate to AI.
There's a lot of conversation and heat
around the notion that we should regulate deepfakes in some way,
particularly intimate deepfakes.
You have Google and Microsoft asking for the regulation.
You have some bills, like the No Fakes Act,
introduced in Congress.
There are other bills. How do those
add up? Are they using the same kind of mechanism? The problem there is that a deepfake of Donald
Trump, say, or Joe Biden or Kamala Harris is probably protected free speech. So you have to
just go regulate speech. But then you definitely don't want intimate deepfakes. You definitely
don't want teenagers being bullied by deepfakes at their school. What are the approaches to regulating that kind of speech
here? Is it the same as KOSPA? We're going to find a First Amendment hook? Are we trying a
different approach? Well, I think an interesting bill that we saw recently on the intimate deepfakes
issue is the Intimate Privacy Protection Act, which basically would create a Section 230 carve out for intimate AI deepfakes.
So essentially, that would mean platforms wouldn't be immune for hosting that sort of content.
And it's interesting that that bill actually does include the term duty of care, which is the same
kind of term we saw in KOSA. And, you know, maybe that signals that this is something that's popular in Congress
right now. Congress is really looking to take kind of a sweeping initiative on kids' safety.
They don't want something that's going to be knocked down the next time that a business decides
to change how their platform operates. So I think, you know, creating some kind of immunity carve-out
seems to be another potential way that lawmakers might look at attacking some of these issues.
And then it seems like the other approach is creating a larger intellectual property framework for people's likenesses.
That's mostly in state law right now.
And the idea that we would have a federal likeness law so that if somebody clones my face, I could sue them.
This has already happened,
by the way. That's why I'm laughing. But the idea that we should have a federal law
and a federal framework, that seems to be the other approach to deepfakes in particular.
Yeah, I think so. We're still in really early phases of any kinds of AI legislation. And
so I think we haven't yet seen all of the ways that that's going to play out. But I think that is definitely likeness is something that lawmakers seem interested in.
The reason I'm asking is it just seems like there's two ways of getting around the First
Amendment.
One is creating these intellectual property regimes.
And the other one is saying you will be liable if you do a bad thing and will let the users
sue you for doing this bad thing or let people sue you for what your users do. And KOSPA is right down the center of that.
And it's saying, actually, because the welfare of kids is so important,
we can just do this directly. And that seems unusual to me.
I think this gets to a fundamental question of the internet, one that comes up especially in conversations around Section 230: who is responsible for the harmful content on the internet, whether that's people selling drugs online or AI deepfakes being shared? Does it come down to the user who created and distributed that content, or is it on the platforms themselves? I think another question here is what the role of parents is, and of the technologies that could aid parents in protecting their kids from the content that's already on the internet. And that's a difficult thing to approach, because no one wants to tell parents you have to do more, and it is really hard to stay on top of all of these platforms. But I think that is something that comes up in these conversations: whose responsibility is it to keep kids safe on the internet?
There are some famous carve-outs like FOSTA and SESTA, which said you can't have sex workers on your platform, basically. Did that accomplish anything? That was a really controversial carve-out, and it also was a really popular one in Congress.
But there's been some evidence that that carve-out hasn't even been used that much in the courts.
And there's also been a lot of pushback from the sex worker community that believes that
that carve-out actually made their work less safe because when they were using online
platforms, it was easier to vet clients or know who might be more or less safe or, you know,
communicate with other sex workers. There is this idea that a piece of legislation meant to protect one group could cause real harm to another group. So I think it's really tricky with all of these issues. I
think everyone's heart is in the right place here, but it's a matter of, you know, what really will
work in reality. FOSTA is an example of an unintended consequence: we had this great intention, we passed a law, and it turns out it didn't work and actually might have made the problem significantly worse. There are some similar potential consequences of this bill, and maybe they're actually intended consequences. Marsha Blackburn, who's one of the lead sponsors of KOSPA, is out there saying, we just don't want our kids exposed to trans material, and this bill will help do that. Is that being taken into account by other people who are supporting the bill and pushing it
forward? That was a comment she made fairly early on during the KOSA advocacy, and it was before a lot of the more recent changes to the bill, so I think it's important to add that context. But certainly, people who oppose this bill and think it'll be used by politicized enforcers to go after trans youth content, or just to scare platforms into not hosting that kind of content, point to comments like that to say this is really the intent of the bill. Now, I think the authors of the bill would deny that that is the objective. But it is something you have to consider when we're in a politicized environment. So that's the whole bill. Let's talk about what happens now. So if
the House doesn't take up the bill before the end of the year, will it come up again in the
next Congress? It's hard to say. You know, I think in Congress, like, you have to kind of work off
of momentum. And this bill has a ton of momentum right now. And, you know, we saw that that helped
kind of propel it out of the Senate in this huge way. If it doesn't pass out of the House this
year, could it come back? Yeah, I think it could. It has a ton of supporters and it has, you know,
a really passionate base of parent advocates who I don't think will just put down the fight
if it doesn't go through this year.
But that said, you know, Congress has a lot of different priorities. And will this rise to the top once again, if it failed to go through the first time? It's hard to say, but I'd say the
possibility still remains. Both sides seem interested in regulating what happens on the
internet. I would say the conservatives are much more interested in directly regulating the platforms in various ways. If Trump wins,
or there are more conservatives in the House or Senate, do you think it's more likely or less
likely? I think this is an issue that really falls outside of the left-right binary. I don't know if
it's something that we would see more of or less of under a conservative or liberal administration.
I think maybe there would be different approaches, maybe the kinds of policies that surface change a bit.
But something like KOSPA has received such a huge array of support that I don't know that it's necessarily going to change under one administration or the other. But maybe we'll see tweaks in different ways. Honestly, that makes it seem like the most unusual legislation
we've had in years. Definitely. I mean, it's not normal to have a bill receive 91 votes in favor in the Senate, especially something that does come with some controversy, some
groups that are really strongly standing against it. But, you know, at the same time, I think you're
seeing a ton of support outside of Congress for it as well.
KOSPA has transcended the chaos of Congress. Do we think it can transcend the chaos of our
judicial system? Someone's going to sue, right? It seems inevitable. How do you think the courts
are going to handle it? Yeah, I would say it seems quite likely that someone would sue to block this law, and I think it's not yet entirely clear how the courts will consider it. NetChoice, which has challenged bills throughout the country that deal with kids' online safety or age verification, has successfully received preliminary injunctions in many of those cases on the basis of the First Amendment, which is basically the courts saying: we think that on the merits, this case will be decided in favor of NetChoice, because we think the law will harm the First Amendment in an unacceptable way.
Will that be the case with something like KOSA? I think it takes kind of a different approach than some of the bills that we've seen. And protecting kids' safety on the internet is a compelling interest. It's just: is it something that the courts will say is compelling enough to justify any potential diminishing of free speech on the internet?
And what have the big platform companies said?
Obviously, NetChoice has an opinion, but have Google or Meta or any other ones said anything?
We haven't really seen the big tech companies come out with very clear statements on this. NetChoice, which is funded by many of those big tech companies, is opposed. We've seen a handful of smaller tech companies come out in support of KOSA, like Pinterest, for example.
But I think, you know, the platforms that people probably will really be concerned about
or want to see how they handle this are, you know, Google with YouTube and Facebook.
I'm assuming TikTok as well, if TikTok is still around.
Yes, definitely.
If it's still around is the key question.
All right. Well, Lauren, I imagine we'll be tracking this for the next year, if not more.
We'll have to have you back soon. Awesome. Yes, I'd be happy to be back.
Thanks again to Lauren for joining us on the show. I hope you enjoyed it. If you have thoughts about
this episode or anything you'd like to hear more of, you can email us at decoder@theverge.com. We really do read all the emails. You can also hit me up directly on Threads. I'm @reckless1280. And we have a TikTok for as long as there's a TikTok. Check it out. It's @decoderpod. It's a lot of fun. If you like Decoder,
please share it with your friends, subscribe wherever you get your podcasts. And if you
really like the show, hit us with that five-star review. Decoder is a production of The Verge and
part of the Vox Media Podcast Network. Our producers are Kate Cox and Nick Statt. Our
editor is Callie Wright. Our supervising producer is Liam James.
The Decoder music is by Breakmaster Cylinder.
We'll see you next time.