Tangle - The social media hearings at Congress.
Episode Date: February 5, 2024

The social media hearings in Congress. On Wednesday, a group of executives from Meta, TikTok, Snap, Discord, X (formerly known as Twitter), and other social media companies testified in a Senate Judiciary Committee hearing about the threats of social media and child exploitation. The hearing began with testimony from victims who said they or their children had been exploited on these platforms. Parents who had lost children to suicide held up photos of their children throughout the hearings.

You can read today's podcast here, our “Under the Radar” story here and today’s “Have a nice day” story here. You can also check out our latest YouTube video about misinformation and fake news that has spread like wildfire in the three months since Hamas’s attack on Israel and the subsequent fighting in Gaza here.

Today’s clickables: A few quick notes (0:52), Quick hits (2:29), Today’s story (5:01), Right’s take (7:56), Left’s take (11:59), Isaac’s take (15:39), Listener question (20:18), Under the Radar (23:10), Numbers (24:03), Have a nice day (24:59)

You can subscribe to Tangle by clicking here or drop something in our tip jar by clicking here.

Take the poll: Who would you say is the most responsible for mental health issues among minors using social media apps? Let us know!

Our podcast is written by Isaac Saul and edited and engineered by Jon Lall. Music for the podcast was produced by Diet 75. Our newsletter is edited by Managing Editor Ari Weitzman, Will Kaback, Bailey Saul, Sean Brady, and produced in conjunction with Tangle’s social media manager Magdalena Bokowa, who also created our logo.

--- Send in a voice message: https://podcasters.spotify.com/pod/show/tanglenews/message Hosted on Acast. See acast.com/privacy for more information.
Transcript
From executive producer Isaac Saul, this is Tangle.
Good morning, good afternoon, and good evening, and welcome to the Tangle Podcast,
the place we get views from across the political spectrum, some independent thinking,
and a little bit of my take. I'm your host, Isaac Saul. It is Monday, February 5th.
Today, we are going to be talking about the social media hearings. I'm not really sure what else to call these, but basically a bunch of executives from some of the biggest social media platforms
in the world testified before Congress last week. We're going to talk about why and what happened.
Before we jump in, a quick heads up. We did not get to our little weekly roundup podcast with Ari and me this week.
We both had crazy travel schedules and lots of stuff going on, but we are sure to get
into some of the stuff that we covered last week as soon as he and I can sit down together.
I want to talk a little bit about this border policy proposal and some of the responses we got to our Friday edition.
And of course, now we have an actual border policy proposal.
We also have a lot of big stuff coming up this week.
We are bringing Bill O'Reilly, the former Fox News host, the Tucker Carlson before Tucker Carlson,
onto the Tangle podcast. I'm very interested to sit down and talk with him a bit about Fox News and the state of the media, what happened with
him, what it's been like going independent in his career. So we're going to get into some of that
with him. That should be a really interesting interview. A lot of people coming up on deck
to come on the pod and lots of good stuff coming your way.
We're also getting our 2024 podcast series on undecided voters off the ground.
So a lot more coming in that department.
So just wanted to give all you faithful podcast listeners a little update on where things are.
So with that out of the way, let's jump in today with some quick hits.
This is a jam-packed quick hits today. So much news going on.
First up, President Biden won the South Carolina Democratic primary with 96% of the vote,
outperforming some polls in the state by as
much as 20%. Number two, Senate negotiators released the text of their Ukraine and Israel
funding bill that also functions as border security legislation. The $118 billion bill
includes $20 billion for the border and increased security, limits how asylum can be used to enter
the U.S., and provides funding for more judges and asylum officers.
Number three, the federal trial brought by special counsel Jack Smith
against former President Donald Trump for alleged election interference,
initially scheduled to begin March 4th,
has been delayed indefinitely while Trump argues in federal appeals courts
that he is immune from prosecution.
Separately,
more than $50 million of former President Trump's political fundraising was reportedly spent on legal fees last year. Number four, the U.S. economy added 353,000 non-farm jobs in January,
far exceeding expectations. Separately, the Federal Reserve opted not to cut interest rates, leaving them between 5.25% and 5.5%. And number five, U.S. forces struck over 85 targets affiliated with
Iran-backed militants in Iraq and Syria. Separately, U.S. and United Kingdom forces
continued to strike Houthi rebel outposts in Yemen.
News on Capitol Hill today, CEOs of the largest social media companies faced a bipartisan grilling during a Senate hearing on protecting children online. And one of the most dramatic moments came when
Meta's Mark Zuckerberg was pressured to face parents in attendance and apologize.
Fallout is growing from that contentious hearing on Capitol Hill addressing child safety on social
media. CEOs from five of the biggest platforms were on the hot seat yesterday. And at one point,
Meta's Mark Zuckerberg issued a rare apology to parents in the hearing room who lost loved ones.
The chief executives of Discord, Snap, TikTok, X and Meta face tough questions from senators on both sides of the aisle.
And a frosty reception from the families gathered in that hearing room.
Many wondering why the tech companies and Congress have failed to act while children still face danger on social media sites.
On Wednesday, a group of executives from Meta, TikTok, Snap, Discord, X, formerly known as
Twitter, and other social media companies testified in a Senate Judiciary Committee
hearing about the threats of social media and child exploitation. The hearing began with testimony
from victims who said they or their children had been exploited on these platforms. Parents who
had lost children to suicide held up photos of their children throughout the hearings.
They are responsible for many of the dangers our children face online, Senate Majority Whip Dick
Durbin said in opening remarks about the executives. Their design choices,
their failures to adequately invest in trust and safety, their constant pursuit of engagement and
profit over basic safety have all put our kids and grandkids at risk. Perhaps the most notable
moment came during testimony from Meta CEO Mark Zuckerberg, who was being questioned by Senator
Josh Hawley, the Republican from Missouri. At one point,
Hawley asked Zuckerberg if he had personally compensated any victims or their families for
what they have been through. I don't think so, Zuckerberg said. There's families of victims here,
Hawley responded. Would you like to apologize to them? Zuckerberg then stood and turned away
from the senators and directly addressed the parents in the gallery and apologized.
I'm sorry for everything you have all been through, he said. No one should go through the
things that your families have suffered, and this is why we invest so much, and we are going to
continue doing industry-wide efforts to make sure no one has to go through the things your families
have had to suffer. Despite bipartisan consensus about the dangers of these platforms, members of Congress
have yet to rally behind any specific legislative solutions. In 2022, the Kids Online Safety Act
was proposed on a bipartisan basis but is yet to gain traction. The bill would require social media
platforms, video game websites, and messaging apps to take reasonable measures to prevent harm like
bullying, harassment, sexual exploitation,
self-harm, predatory marketing, and eating disorders. It would require every service
to automatically have the highest safety and privacy settings for any users under the age of
18. TikTok CEO Shou Zi Chew also faced questions about the app's ties to China. Chew, who is Singaporean,
repeatedly reminded senators that
he was not Chinese and that TikTok is not available in mainland China. A very similar
app called Douyin is. Chew also repeatedly denied any link between the Chinese Communist Party and
TikTok and instead insisted that content critical of China was permitted and regularly circulated
on the platform. Today, we're going
to take a look at some arguments from the right and left about the hearings and then my take.
We'll be right back after this quick commercial break.
First up, let's start with what the right is saying. Many on the right think the time has come for Congress to intervene and regulate the business
practices of social media companies. Some criticize senators as being more interested in creating a
spectacle than addressing the issue at hand. Others call for strict new regulations like
banning social media use for anyone under the age of 18.
The New York Post editorial board says social media CEOs won't stop their products from harming kids until they're forced to.
TikTok's parent ByteDance pulled in $24.5 billion in revenue in just the first quarter of 2023, while Meta made more than $28 billion.
So the child safety spending is barely a drop in the bucket, even just compared to what they spend on tweaking their algorithms to boost
the addictive factors, the board said. Most sites have a minimum age of 13 to join, but verification
is practically non-existent, so younger children easily gain access all the time by simply
certifying that they're old enough. That's the only safety: for the company, from lawsuits. As for the impact, social media sites
are bottomless cesspools of content that drive anxiety, depression, and body image issues in
America's youth. If any other product was causing such harm to children, it'd get yanked off the
market or at least regulated heavily. Big tobacco played dumb over the impact
of its product for decades, even while maximizing addictive properties too. How are TikTok and Meta
any better? Actually, they're worse. They deliver their dopamine rushes for free. We're not often
fans of government regulation, but social media giants have shown again and again they can't be
trusted. They'll choose the path of most profit, no matter the impact on mental health or society. A hearing is not enough. Congress and the administration
need to get serious about actual penalties for bad behavior. In the American Spectator,
Aubrey Gulick wrote, yelling at CEOs does not protect kids. The hearing turned into a perfect
opportunity for the politicians involved to denounce tech CEOs,
to hold them accountable, without actually doing anything about the problem at hand, Gulick said.
The heart of the issue is that parents are the ones handing their children smartphones,
not big tech CEOs. The first line of defense when it comes to protecting kids from online
exploitation by predators and from harmful social media posts is parents. Social media platforms
should be the second line of defense when it comes to protecting kids. Meta, X, TikTok, Snapchat,
and Discord should absolutely take responsibility for and work to prevent sexual predators on their
platforms. Regulation encouraging them to do so could even be good. Unfortunately, protecting kids
on social media is not as easy as yelling at Zuckerberg and his ilk, as fun as that might be. In National Review, Rich Lowry argued that kids
shouldn't be on social media at all. Congress should press the brakes on the revolution that
has given Mark Zuckerberg and other tech titans an outsized role in raising our kids and require
that users of social media be age 18 or older. Surely it's not too much
to ask that Zuckerberg and co. make their fortunes exclusively off adults, Lowry said.
Congress has already imposed an age limit, just in the wrong place. The Children's Online Privacy
Protection Act prevents the companies from collecting personal information from children
under age 13, effectively prohibiting them from social media. But 13 draws the line
much too young. Let's say the research and everyone's intuition are wrong and social media
aren't driving worse outcomes for kids. What's the harm in staying off social media until they're
older? That kids will miss out on the latest absurd and perhaps dangerous TikTok trend?
That they won't get to envy people posting photos on Instagram to make
themselves look more interesting and beautiful than they really are, that they will talk to
their families and friends more and engage in more activities in the real world, Lowry asks.
Once every teen isn't on social media, it becomes easier to stop teens from using social media.
All right, that is it for what the right is saying, which brings us to what the left is saying.
The left is troubled by the effects of social media on kids and calls for regulation to address the issue. Some argue the reforms these companies are enacting on their own fall well short. Others
doubt that any meaningful change will come out of
the hearing. The St. Louis Post-Dispatch editorial board said protecting kids online should be both
technologically and politically possible. The hearing showed how much bipartisan agreement
there is on the particular urgency of combating online child sexual exploitation, revenge porn,
social media harassment, and other scourges that have
made childhood a more treacherous landscape than it was before the digital age, the board said.
Unlike much of the overblown hysteria about, for example, the supposed censorship of conservative
opinion on the internet, child sexual exploitation and related threats are real. And both political
parties are increasingly insisting that the tech platforms have an obligation to deal with them. To the argument that fully filtering out even just dangers to
kids would be an impossibly huge order for the platforms, we would counter by noting the amazing
things social media companies can do today, the board said. Advanced algorithms, artificial
intelligence, and other mind-blowing developments indicate there are virtually no technological goals
these tech titans can't achieve when properly motivated. And now, as before there even was an internet,
nothing motivates entrepreneurs like a threat to their bottom line.
In CNN, Kara Alaimo wrote that Zuckerberg's extraordinary apology should only be the
beginning. In the lead-up to the testimony, tech companies announced new initiatives to protect
kids, but it's not enough. Lawmakers and tech companies need to do much more to protect our
kids, Alaimo said. While they may claim they're showing kids less harmful posts, it's still
unclear how they're identifying what is harmful. Tech companies should
work with experts in adolescent health to develop and share standards for content that can be shown
to kids so that potentially harmful posts about things like body image and mental health don't
appear on their feeds at all, and so content creators know what the rules are and can try to
follow them. Tech executives promised to protect kids in their testimony to senators, but they
didn't
promise to do what's actually needed to safeguard kids' physical and mental well-being. To protect
kids on their apps, they need to create and enforce better standards for content shown to kids,
along with more human moderators, mental health resources, lessons for kids, and disclosures when
content has been manipulated. And lawmakers need to pass legislation to crack down on online sexual exploitation. These kinds of solutions would give parents something to
actually like. In the Washington Post, Adam Lashinsky described the result of the hearing,
and of others before it, as nothing. It all stacked up as another reminder that government
cannot ride herd on industry anymore. At least cigarette smokers
have long been warned by the U.S. Surgeon General that smoking is dangerous. Last year, the Surgeon
General issued an advisory linking social media with mental health concerns. Do the hardware and
software that have all sorts of repercussions in our lives and that of our children come with any
warnings or alerts, Lashinsky asked. Not to date, and that's usually the only takeaway
whenever Silicon Valley comes to Washington. Nothing. Congress and President Biden, too,
have been vocal about the problems with social media while doing approximately nothing about
them. Repeated efforts to weaken liability shields that benefit big tech companies,
as well as measures aimed specifically at protecting children,
have failed to pass Congress despite bipartisan support.
All right, that is it for what the right and the left are saying, which brings us to my take.
So first of all, I don't think there's any real doubt about the harmful effects of social media
on teenagers. Meta's own internal investigations have shown the harm its platforms have on teen
girls. Instagram in particular made girls feel bad about their bodies, increased their anxiety
and depression, and often led to suicidal thoughts. These kinds of experiences come from just using
the apps. They are separate from
other threats around these websites, like predatory users or bullying or sexual exploitation.
Their studies have also concluded that Instagram specifically heightens unhealthy social comparison,
unlike TikTok, which is centered around performance, and Snapchat, which promotes
talking and inside jokes but also makes teens vulnerable to bullying.
These kinds of differences are instructive in that they should remind legislators that no two
apps are the same, and the risks from their use can be very different. TikTok, for instance,
can be a bastion of misinformation, while Instagram is a place where a teenager might
develop an eating disorder. Of course, neither app has a monopoly on those issues,
but the research has fleshed out real differences between them. When I've written about mass
shootings, I've often talked about the blame pyramid starting at the top, then working down,
the shooter who makes the decision to inflict mass harm, the people around them who see the
warnings but do nothing, the failures of law enforcement to address a threat when it is
reported, and the laws in place that fail to give proper tools for preventing mass violence.
I think a similar blame pyramid can be constructed here. In this case, most people agree that minors
who end up the victims of exploitation or develop anxiety and depression are the least culpable,
as they are often just kids trying to find their way.
Minors who use these apps as tools for bullying certainly exist somewhere on this blame pyramid.
Instead, though, more blame can go to the incredibly influential social media personalities who create dangerous content, the parents who fundamentally are responsible for the things
their kids consume, the platforms who need to make algorithmic changes and implement policy to keep minor users safe, and the legislators who
have to think about ways to address this with a soft touch. As for the hearings, I'll be direct.
I don't think Congress should pursue a heavy-handed governmental approach here.
I've written before about my support for Section 230 and the necessity of an open and free
internet. Some of Congress's solutions, like the Kids Online Safety Act, might be getting bipartisan
support, but I think they constitute major overreaches and put dangerously broad limits
on internet companies. That's why digital rights groups like the Electronic Frontier Foundation
have come out strongly against them. Instead, there are two major changes I think
Congress can and should try to push. First is a disclosure on targeted advertising, which forces
platforms to inform a minor why they're being shown a particular advertisement. And frankly,
this should also happen for adults. This is a change supported by EFF and also has a powerful
effect of pulling the curtain back on how these platforms
work. Second is to enact small changes to Section 230 that actually give plaintiffs a pathway to
sue major companies if they're able to prove a direct connection between a criminal act
and the promotion of criminal behavior by the platform. Congress does have a role here,
but it should not be a major one. If the government is going
to invest in anything, it should spend money on continuing to raise awareness for parents and
teens about the threats these platforms pose. Teenagers already believe these platforms have
a negative impact on their peers, but about 9 in 10 feel they are immune from those harms.
That's in part because these apps often make teens feel more connected to
their friends and families, which is why they're popular in the first place. Educating teenagers
about their risks while emphasizing the ways in which these platforms can be used for good
isn't a bad use of resources, and already we are seeing cultural movements around less screen time
take off. The truth is, the best way to wrangle these companies is to force more
transparency and make it harder for them to profit off the teenagers in exchange for making their
platforms much more dangerous. However, the moment the federal government starts to enforce its view
on what is or isn't harmful is the moment we start to lose the necessary freedoms of the internet.
We'll be right back after this quick break.
All right, that is it for my take, which brings us to your questions answered. This one is from Sandra in Trophy Club, Texas. Sandra said, We have heard a lot about the deep state. Is there any evidence to point to this actually
existing? With the limited mental capacity of Joe Biden, I can imagine that someone or
some other group is truly in power and calling the shots. So it depends on what you mean by
the deep state. If you're asking if there's actually a shadow government operating behind
the scenes,
conspiring among the elites to pull all the puppet strings to its desire, and that
maybe that shadow government is also a bunch of Satan-worshipping pedophiles,
then no, I don't think there's a deep state. And if you think the government is bloated and
incompetent, but also capable of executing and concealing massive conspiracies or organizing
false flag events, then you should dump one of those viewpoints. And if you're stuck between
those views, then you might remember Hanlon's razor. That which can be attributed to malice
is more adequately explained by stupidity. Now, that doesn't mean a deep state of some kind
doesn't exist. Without the sinister overtones,
you could just call it the bureaucracy. There are dozens of federal departments,
all containing their own agencies, overseeing nearly two million civil servants. Overseeing,
managing, and leading that massive operation requires a lot of effort and diplomacy,
and sometimes some government employees are going to resist the person in charge.
Then there's also the judiciary and legislature, which were designed to check the power of the
executive branch and one another. Add into that state and local governments, then the heads of
cable news networks with their own agendas and media personalities selling their own worldview,
and maybe you get something close to what you are imagining. Do I think these people
in power, i.e. the corporate elites, meet up and share ideas and sometimes work in concert to
achieve certain goals? Of course. Do I think they all agree and have a unified vision for the future
and are very good at accomplishing those goals? Definitely not. The government is huge, and there
isn't one group or one person secretly in charge.
What's more accurate is that there are dozens and dozens of power players inside and outside
of government that are all vying against each other and scoring little wins for their agendas
in a giant system composed of hundreds of thousands of judges, wealthy donors, state
senators, corporate executives, mayors, media influencers, and heads of agencies all
seeing things their own way. Not being able to snap all of those people to attention isn't
the result of a secret conspiracy. It's just a facet of a large democracy. And it's a feature,
not a bug. That, to me, is the real quote-unquote deep state.
All right, next up is our under the radar section. Joshua Schulte, a 35-year-old former CIA agent, was sentenced to 40 years in prison after being found guilty of espionage,
computer hacking, contempt of court, making false statements to the FBI, and possessing
child pornography. Schulte was convicted of being
behind the classified Vault 7 leak, which was disclosed by WikiLeaks in 2017 and revealed
how the CIA hacked smartphones in spying operations overseas. The leaks also showed
the CIA's efforts to turn internet-connected TVs into recording devices. Schulte helped create the
hacking tools while at the CIA. He was later convicted of
downloading more than 10,000 files of child pornography on his computer. USA Today has
the story, and there's a link to it in today's episode description.
All right, next up is our numbers section. The percentage of U.S. teens aged 13 to 17 who use
social media is 95%, according to a U.S. Surgeon General's advisory from 2023. The percentage of
U.S. teens who say they use YouTube is 93%, according to a Pew Research survey from 2023.
The percentage of U.S. teens who say they use TikTok is 63%. The percentage of U.S. teens who say they use Instagram is 59%.
The percentage of U.S. teens who say they use Facebook is 33%.
And the share of U.S. teens who say they use YouTube or TikTok almost constantly
is about one in five, while the percentage of U.S. parents with at least one child under the age
of 18 who would support legislation that would require parental approval for children under 16 to download apps is 79%.
All right, that is it for our numbers section. And last but not least, we have our Have a Nice
Day story for today. We recently ran a story in our Sunday newsletter about drug issues in the Kensington
neighborhood of Philadelphia, which prompted a response from a Tangle reader. Scott Shackleton's
grandparents immigrated to the United States and settled in Kensington. And last year, Scott got
back in touch with his roots to help. Partnering with a Philadelphia organization called Simple
Homes to purchase an abandoned home for $1,
raise $30,000 to renovate it, then donate the home back to the community. It is now owned by a woman raising her niece and nephew after their mother died of an overdose, Scott said.
She manages the food pantry for Simple Way and feeds over 90 families. Simple Homes has
the story about the family and there's a link to it in today's episode description. All right, that is it for today's podcast. As always, if you want to support our
work, please go to readtangle.com forward slash membership, or just consider sharing this podcast
with a friend and telling them to subscribe. We'll be right back here same time tomorrow. Have a good one. Peace.
Our podcast is written by me, Isaac Saul, and edited and engineered by Jon Lall.
The script is edited by our managing editor, Ari Weitzman, Will Kaback, Bailey Saul, and Sean Brady.
The logo for our podcast was designed by Magdalena Bokowa,
who is also our social media manager. Music for the podcast was produced by Diet 75.
And if you're looking for more from Tangle, please go to readtangle.com and check out our website.