The Highwire with Del Bigtree - FLORIDA BANS MINORS FROM SOCIAL MEDIA
Episode Date: April 6, 2024
With a newly signed law following in the footsteps of Utah, Florida is set to become the most restrictive state barring minors from accessing social media. Now, other states are jumping on board in an attempt to limit online harms to children's mental health and safety. Is this the role of the government or parents?
Become a supporter of this podcast: https://www.spreaker.com/podcast/the-highwire-with-del-bigtree--3620606/support.
Transcript
Over the last couple of years especially, it seems like a lot of the public has really noticed that governments are taking inventory of their online activity and conversations and using legislation to try to solve some of the issues there.
Florida is the latest state to try to address what a lot of people have concerns about: minors accessing social media, kids under 14 years old.
And this is what it looked like in the news. Check it out. Okay.
Social media showdown in Florida.
It is being called one of the most restrictive social media bans in the country.
Governor Ron DeSantis has signed House Bill 3 into law,
banning children under 14 from having social media accounts on platforms considered to have addictive qualities.
It also mandates that social media platforms search for and remove the profiles of kids who don't meet the age requirement.
Being buried in those devices all day is not the best way to grow up. And you know there's dangers out there. Unfortunately, we've got predators who prey on young
kids. They know how to get and manipulate these different platforms. It's created huge problems.
This law does not ban specific websites. Instead, it zeroes in on features that are considered
addictive, like infinite scrolling and autoplay videos. What is still unclear, though, is just where
this law stands. The trade group, Net Choice, which represents several major social media giants,
slammed the move as unconstitutional, saying it violates the First Amendment,
the Equal Protection Clause of the 14th Amendment, and federal law.
It's completely stripping away parental choice for anybody who has a child under 14.
Proponents have argued that access to social media is harmful to children's mental health.
There's no bill powerful enough to keep these kids from social media.
It's not possible.
This is another one of those, Jeffrey.
We were just talking about TikTok last week, right?
Like the sort of Chinese influence, a law being written to, you know, take TikTok away.
But this is different.
This isn't... you know, Ron DeSantis in Florida doesn't seem to care who's behind making the app.
This is just straight up about the danger to youth.
And we have, sort of, you know, shall I say, the mommy state?
I mean, I find this interesting, because so often conservatives call it the nanny or mommy state when the government steps in to take care of, like, household issues.
But this one, it bowls it right down the middle, because it's about the kids' health.
So tell me what this is about.
Yeah, this is certainly an interesting conversation.
So Utah was the first state to really step into this space in 2023.
This was the headline here when they did that.
Utah governor signs laws curbing social media use for minors.
You go into that.
And basically the laws required all users to submit age verification before opening an account.
And for those laws, minors under age 18 need to seek parental permission for this.
So right now we have Arkansas, Ohio, Utah, and now Florida that have banned minors from
accessing these accounts on social media. But you saw there NetChoice, the trade organization
in that clip, which represents organizations like Meta. They're part of that. Google, Yahoo.
They have sued and won injunctions in Arkansas and Ohio. So that stayed those laws.
Okay.
Florida is expecting a legal fight really fast on this one. But it doesn't go into
effect until January 2025 in Florida. That's HB 3. So it's got a ways to go yet. But as it said in the
clip there, the news reporting, for all kids under 14, these social media companies have to immediately
eliminate their accounts. So they don't even get a choice there. Fourteen- and fifteen-year-olds need parental
consent for that. And if anybody asks for these accounts to be shut down, the parents or the kids, and
the companies don't act, they can actually be sued personally
by these kids, and we're talking about some pretty hefty fines.
So I think what's interesting in this conversation,
because Del you and I,
we cover a lot of medical choice conversations.
One of them is the minor consent to vaccination,
often without parental choice.
We've seen a rush of legislators over the last several years,
gleefully trying to push these bills,
saying this would help public health.
But when you start reading into this conversation
about minors in social media,
we see something like this.
This is an Associated Press post
reporting on DeSantis's law here. And you go into the post and it says, quote,
this bill goes too far in taking away parents' rights, Democratic Rep. Anna Eskamani said in a
news release. So you have a bit of a dichotomy there. It's almost hypocrisy, if you want to call
it that. Yeah. Like a shot with no side effects, no liability, all the things we've covered on the
show before, that's fine. But, you know, a TikTok video streaming account? Whoa, hold on, pump the brakes.
we really need to give these kids and the parents the power here.
Yeah, I mean, you know, I think you're making the argument... I suppose, you know,
I'm trying to think where I actually land on this, but you're right.
Does a child have the ability to decide for themselves what's good for them, you know,
and I think about, you know, sure, we want strong parenting in homes, parents should be making
decisions, but how many families where both parents or maybe single family homes are, you know,
out at work or coming home late and the child, you know, is with a babysitter or a child is maybe
even at home or, you know, how many times? I mean, I guess here's the question, right? If I want,
if I'm with them and I'm having a conversation with the adult, which we all do, you know, at a
restaurant, I just want to hand the iPad over and say, check out some social media. Do I not,
I guess in this case, in some of these cases, I don't even have that right to hand it off to
my kid and say, here, go ahead and do this. So this is where, like, again, it's these
slippery slopes when we look at our rights, right? And, you know, as they say, the camel's nose under the
tent. Do we really want government involved in these conversations? It's an open question at this
point. It seems like it's going forward, so we're reporting on it. But this conversation really started
to unravel during COVID, and a lot of people missed it in the headlines. But there were some whistleblowers,
there were some internal documents from Meta, formerly known as Facebook, that were released to the
Wall Street Journal, and they actually did an exposé, several articles on this. Here's
one of them, in 2021: Facebook Knows Instagram Is Toxic for Teen Girls, Company Documents Show.
And you go in here and it talks about this. Meta actually commissioned several studies and
presented these results internally to the company. It says, quote, 32% of teen girls said that when they
felt bad about their bodies, Instagram made them feel worse, the researchers said in a March 2020 slide
presentation posted to Facebook's internal message board. We make body image issues worse for one
in three teen girls, said one slide in 2019, summarizing research about teen girls
who experience the issues.
It goes on to say this, teens blame Instagram for increases in the rate of anxiety and depression,
said another slide.
This reaction was unprompted and consistent across all groups.
That's a big problem for them, but here's an even bigger problem.
Among teens who reported suicidal thoughts, 13% of British users and 6% of American users traced
the desire to kill themselves to Instagram,
one presentation showed.
And so, in summary, this is the issue they're dealing with.
The reason they're doing this is because they're losing a lot of people signing up for Facebook.
They're calling it aging out.
So they're going after the younger crowd, and they're asking, well, can we really do this?
And it says, quote, social comparison is worse on Instagram.
This is the problem they had. That's from Facebook's deep dive into teen girl body image issues in 2020,
which notes that TikTok, a short-video app, is grounded in performance, while users on Snapchat,
a rival photo and video-sharing app, are sheltered by jokey filters that keep the focus
on the face. In contrast, Instagram focuses heavily on the body and lifestyle. So that's where you're
getting what they're saying is a lot of these mental health issues and these comparison issues.
So these were internal memos where they all sat around and said, I don't know, what do you think?
We're driving one in three girls into anxiety and depression, and roughly 6% of Americans
and 13% of girls in England are blaming their suicidal thoughts on our program. So do we go forward
with it? Should we continue to push forward and promote it to these children? All in favor?
Aye, aye, aye, aye. Let's do it. And it goes out. That's what we're to understand.
Yeah, yeah. These are all internal studies that were shown to members of Facebook. And this was
not given to the public until now, obviously, until 2021. So what that kicked off was the
lawsuits, big ones now, because the attorneys general had all of this information. So you started
seeing headlines like this just recently: Meta sued by 42 attorneys general alleging Facebook,
Instagram features are addictive and target kids. And this is where you get obviously all the
governors involved as well. But it says according to the federal complaint, meta did this via
the design of its algorithms, copious alerts, notifications, and so-called infinite scrolling
through platform feeds. That's that dopamine hit that a lot of people talk about. You're getting
this dopamine hit with this continuous scrolling, these videos that automatically play. It says the company
also includes features that the attorneys general alleged negatively impact teens' mental health
through social comparison or promoting body dysmorphia, such as likes or photo filters.
One part of that suit as well is the data collection, the personal data collection
on children under 13, which violates a whole other law, the Children's Online Privacy Protection Act.
So that's in part there.
It'd be interesting to get discovery on that, to see how they collect people's data, how far they go.
But that suit was sealed.
So we saw the headline and we knew, okay, they're suing them.
We have some quotes here.
But that suit has since become unsealed.
And now we start getting really granular information about what was in that suit.
We go to the Guardian here, and it talks about that.
It says in this article, the complaint said that in 2021, Meta received over 402,000 reports of under-13 users on Instagram,
but that only 164,000, far fewer than half of the reported accounts, were, quote, disabled for potentially being under the age of
13 that year. The complaint noted that at times Meta had a backlog of up to 2.5 million accounts of
younger children awaiting action. So, you know, they can use that as an excuse, saying, well,
you know, we have all these users and we have a little bit of a backlog, so we're just
going to keep them on for now. So it obviously works in their favor, according to the lawsuit.
But the challenge they have, and this goes on in this article to kind of explain it, just to give
voice to the other side. It says, with respect to barring younger users from the service,
Meta argued age verification is a, quote, complex industry challenge. Instead, Meta said it
favored shifting the burden of policing underage usage to app stores, like Google and Apple, and to parents,
specifically by supporting federal legislation that would require app stores to obtain
parental approval whenever users under 16 download an app. So they're saying age verification is a complex industry
challenge. However, you know, this conversation has a lot of tails
on it. So just to put another angle on this, there's another conversation going on here.
It's represented in this headline: Kansas moves to join Texas and other states in requiring
porn sites to verify people's ages. So age verification is going on, right? And
Louisiana was the prime mover in this space, with its law taking effect at the start of this year.
But we have about eight other states that are going this direction as well for age verification.
And of those states, I mean, most of them, seven
of the eight, are Republican states, and there's 20 other states that are looking into introducing
legislation here. But a lot of the big companies are just cutting access off completely in these
states because they don't even want to try to comply with the law, because of the penalties here
around age verification. Because if any kid is signed on to that and is found out,
they're talking fines of tens of thousands of dollars a day. So a lot of these companies
are just going, we're done in Texas, we're done in Kansas.
You know, this is such an interesting topic. And as we were discussing it earlier before the show,
it's one of those where I'd be really curious where people, especially in our audience, land.
So we put up a Twitter poll just a couple of hours ago this morning. And we asked this:
in your opinion, does the First Amendment protect a minor's right to all content on the internet?
Yes or no? Right now, I guess we have almost 1,400 votes,
and 85% are saying that the First Amendment does not protect a minor's rights.
I would love it if everyone in the audience would weigh in right now on Twitter at Highwire Talk.
And please share it.
I mean, I'm curious.
You know, what does it look like when just our audience is weighing in over the next hour or so?
But then what does it look like if you share it with all your friends?
Does it change?
Does, you know, are we all in this together?
Do we believe that a parent really is in a position to be deciding what is right for their children?
And if so, does that mean that that child doesn't have their own First Amendment rights?
Certainly a very important question as we move forward in this modern world.
So if you want to get involved, go to Twitter right now at Highwire Talk.
I'd love to see your perspective on that.
We're going to do a lot more of this.
I want to start interacting with you out there in the audience to see what does our audience actually think about some of these conversations that we're having.
Super interesting, Jeffrey.
Another one that, you know, sort of bowls down the middle.
It fights both spaces, right?
Do we want the government getting involved in our lives and protecting us inside of our houses or not?
Is it black and white?
It seems more and more I'm finding myself in a gray area, which isn't very comfortable for me.
