Offline with Jon Favreau - The Movement to Protect Kids from Big Tech
Episode Date: December 13, 2025. Julie Scelfo, founder of Mothers Against Media Addiction, sits down with Jon to talk about the impacts AI and social media are having on our kids…and what we can do to stop it. Julie breaks down what change parents can effect vs. policymakers, the horrors kids are normalizing on social media, and the corruption at the highest echelons of government that is preventing safety features from being mandated.
Transcript
Offline is brought to you by Quince. Cold mornings, holiday plans. This is when I need my wardrobe to just work. That's why I'm all about Quince. They make it easy to look sharp, feel good, and find gifts that last. From Mongolian cashmere sweaters to Italian wool coats, Quince pieces are crafted from premium materials and built without the luxury markup. Quince makes the essentials every guy needs. Mongolian cashmere sweaters from $50. Italian wool coats that look and feel designer, and denim and chinos that fit just right. Their outerwear lineup is no joke. Down jackets, wool top coats, and leather styles
that are built to last.
Each piece is made from premium materials
by trusted factories that meet rigorous standards
for craftsmanship and ethical production.
By cutting out middlemen and traditional markups,
Quince delivers the same quality as luxury brands
at a fraction of the price.
It's everything you actually want to wear
built to hold up season after season.
Love Quince.
Been doing some holiday shopping.
Some Christmas presents I've got to get.
Oh, yeah.
And, you know, I was getting a few sweaters for some people.
And I got to say that Italian wool coat looks,
pretty nice as well.
I got a cashmere hoodie, it's great.
There you go.
Mongolian cashmere even.
From the steppes of some of the mountains, so tall over there.
Not there specifically, but in that area.
Boy, they go so high.
It's like I'm there right now.
Ceiling of the world.
It's transported me there.
Ceiling of the world.
Get your wardrobe sorted and your gift list handled with Quince.
Don't wait.
Go to quince.com slash offline for free shipping on your order and 365-day returns.
Now available in Canada, too.
That's Q-U-I-N-C-E.com slash offline.
Free shipping and 365-day returns.
Quince.com slash offline.
I'm Robert Smith, and this is Jacob Goldstein,
and we used to host a show called Planet Money.
And now we're back making this new podcast called Business History
about the best ideas and people and businesses in history.
And some of the worst people.
Horrible ideas and destructive companies in the history of business.
First episode: how Southwest Airlines used cheap seats and free whiskey
to fight its way into the airline business.
The most Texas story ever.
Listen to Business History wherever you get your podcasts.
One mother texted me and said, I don't know what to do.
I feel like my son is addicted to cocaine that I gave him.
And that is the helplessness that parents feel.
But it's not something that we can solve on an individual level, right?
So, you know, we as parents have to feed our kids healthy meals,
but we don't expect every parent to keep the food supply safe.
Like, it's not your job to go test the baby formula at the drugstore, right?
We have a system in place to make it safe.
And that's why we know these products, because all tech is, is products, need to conform to certain standards so that they're safe.
I'm Jon Favreau, and you just heard from this week's guest, Julie Scelfo, a former New York Times reporter who now runs an organization called Mothers Against
Media Addiction, which is an organization that seeks to protect children from harmful
technologies. We haven't covered this topic in a while here and came across Julie's organization.
They started in 2024. They now have 35 chapters across 22 states, and they have been
sort of leading the way and trying to pass some legislation to help protect kids from both
social media harms, and now they're also focused on artificial intelligence. This week, we just
found out that the Australian social media ban, which is the first of its kind in any country in the
world and bans social media for kids under 16, has gone into effect. We are also recording
this just a day after Donald Trump decided to issue an executive order banning states from trying
to regulate artificial intelligence on their own. This story is very much in the news. And Julie
and Mama, her organization, are at the forefront of trying to help
protect kids. They have worked on trying to pass some of the social media regulatory legislation
in some states. And they've also been working on educating parents. It's a really great group.
And Julie is very, very smart on all of this stuff. So we had a great conversation about how she
got into this, what kind of legislation and reforms she's pushing for, what it's like going up
against big tech and the resistance that they've faced there, what she thinks of the
Australia social media ban, the Trump artificial intelligence EO, and sort of what are her big
concerns going forward and what she and her organization, Mama, are trying to do to help protect
kids and pass some of these really important reforms. So it's a great conversation. I hope you'll
enjoy it as much as I did. Here's Julie Scelfo. Julie, welcome to Offline. Thanks for having me.
Happy to be here. So it's been a big year for the growing movement of which you're a leader to protect
kids from too much screen time and social media. Last I checked, I think we're at 20 states that have
banned phones and tablets during school, plenty of others with bans and a lot of local districts.
Australia just became the first country in the world to ban social media companies from
giving accounts to kids under 16. Lots of similar legislation is being debated in other countries,
including here. Obviously, still enormous challenges on the horizon, especially AI. I want to talk to
you about all of that. But first, I'd love it if you could just talk about how you got here.
You were a reporter for the New York Times; among other stories, you covered the youth mental
health crisis. What was the moment you decided that you wanted to move into the world of
activism? Well, I guess you could say it was gradually and then all at once. You know, first I was
reporting on youth mental health and growing rates of suicide among teens. And then right after that,
we saw in the data that suicide was going up not only in teens, but in tweens, which were kids as young as 9 and 10.
And when I reported that story, you know, I'm also a mom. I've got three sons.
Learning that there were 9 and 10-year-olds who wanted to die in this country at such large rates just kind of changed something in me.
I mean, it's just pretty messed up that we have so many kids that are suffering like that.
And then as more and more whistleblowers came forward from Meta and shared how they knew the products and the algorithms that they were using were causing harm and that different changes they would make would increase the harm.
And then when they told their bosses, we should probably, you know, fix this.
Their bosses were like, eh, that made me really mad.
You know, I have no problem with companies making profit.
I have, you know, no problem with innovation.
We need innovation.
But to do it on the backs of kids is just gross.
And that's when I decided I needed to do something about this.
And what was that transition from journalist to activist like for you?
You know, I mean, they kind of go hand in hand in the sense that I'm driven by facts.
Like I'm driven by what's actually happening.
You know, as a journalist for most of my life, I never shared what I thought.
I tried to really put my own feelings aside and, you know, just the facts, right?
And so as an activist, I'm learning to talk about myself more.
You know, it's weird and not natural to talk about myself and share my story.
So, you know, I've had to learn how to get comfortable doing that.
But, you know, in this particular case, what we're working for is protecting our kids.
And the evidence is so clear that it's not hard at all for me to just talk about the data and talk about what's happening.
I've met so many parents whose kids have been harmed by social media.
And you cannot forget their stories.
And I just don't understand how people who work at these companies can hear about it and then look the other way, you know.
Yeah.
Speaking of your own story, what's your experience been like with your three sons and their screen time?
Well, you know, when my oldest son was born, there were no iPhones.
And Mark Zuckerberg worked at a small company called the Facebook.
And by the time my middle son was ready for pre-K, every single, you know, I took him to school.
Every single parent had a phone.
and was on Facebook and was photographing their kid constantly.
And they didn't realize it, but they were inadvertently training their children that this is normal.
You know, we trained them that it's totally normal to share everything about your day.
You know, here's a picture of what you ate for breakfast.
And that's not normal.
Never before in human history did we do that, you know, since the invention of cameras,
we kind of pulled them out for special occasions.
We didn't, you know, photograph every moment.
And then with my third son, you know, I have never let any of my kids on TikTok.
I just don't think that it's, you know, first of all, when I reported on it and found that they listed law enforcement contacts on every continent, you know, I thought, what is happening on this platform that law enforcement on every continent needs to get in touch with them?
So I never let them have TikToks, but I haven't been able to stop him from seeing TikToks because his friends share them, and YouTube and Snap have gone to shorter and shorter reels.
So, you know, all the while that I've raised them, I've been confronted by the exact same issues that every single parent in America's had to face.
and, you know, I'm not a perfect parent.
I'm not a parenting expert.
There's no parenting mistake I haven't made three times a thousand.
So, you know, I gave my kids smartphones.
I waited until they were in eighth grade.
I let them get social media.
I let them join Instagram because I thought that was a great way for them to stay in touch with
their cousins.
But then I experienced the horrific things that can happen when your kids are exposed to
terrible information that's not developmentally appropriate and in a context that's not
appropriate. How old are your boys? My boys are now 15, 17, and 20. They're young men.
I have two boys. One is two, about to be two, and the other is five. So we are just getting ready for
intense screen kind of stuff that they're going to have to deal with. You'll see all these gray hairs.
I didn't have, I didn't have them. And, you know, there are things that, you know, when you have children,
I think we all want the same thing for our kids, right?
You want them to get to have a childhood.
You want them to have joy.
You want them to have fun experiences.
And what I began to see is that some of their friends who had devices earlier or more often, they were looking down.
They didn't learn how to make eye contact with people.
They didn't learn how to have verbal conversations.
And, you know, what might seem a little awkward or a little unfortunate when they're three or four or five,
becomes a big problem when they're eight or nine or ten. And it becomes an even bigger problem
when they're in middle school and in high school. So, you know, now I'm very, I'm still proud of my
children. I love them. I feel very lucky that they've had such wonderful teachers and that my husband
is such an incredible dad. And people are always telling us how great our kids are because they're
able to have conversations with adults. They're very capable. They know how to do things. They know
how to cook. They know how to clean, build, you name it. But the crazy thing is, I don't, I don't
know whether to say this, but I'm like, my kids are just normal. Like, that's what kids are like
if you make sure they're not on screens all the time. Like, they're not exceptional. They're
just doing the things that people have always done, you know? Just developing relationships and
skills in real life. So you start Mothers Against Media Addiction, I believe, in 2024. What is the primary
goal and what's the pitch to people who might be interested in joining? So Mothers Against
Media Addiction is a grassroots movement of parents and allies fighting back against media addiction
and creating a world where real life experiences and interactions remain at the heart of a healthy
childhood. We have a three-part mission. It's parent education, getting phones out of schools,
and demanding safeguards so that all tech products are safe for kids. If they're used by kids,
it needs to be safe for kids. And the reason parent education is so important is because
too few of us know how these products work, and too many of us have given them to our kids,
again, myself included, without realizing the potential risks and harms.
So when you go to a toy store and you're shopping for a gift for your kid's birthday,
if there is a doll on the shelf or any product on the shelf, and it said, you know,
there's a one in ten risk of your kid becoming depressed after using this,
or a one in eight chance that your child will develop an eating disorder,
or a one in ten chance they'll be suicidal.
Nobody would buy that for their kid, right?
But we didn't know that that's what social media was doing, and we didn't know that because these companies weren't disclosing what was happening on their platform.
They weren't being transparent about the data that even they were collecting, right?
So we want to make sure that parents know about the research that exists, about what kinds of experiences are best for healthy child development, and know about the risks of the products.
In terms of getting phones out of schools, we know that human-to-human interaction and experience is critical
in the way the brain develops and is critical in childhood for the acquisition of
lifelong tools and skills that are going to make you healthy emotionally for life
and that are going to give you the tools that you need to become a good learner and a good
student. So we're not actually anti-tech at Mama. We think tech can be fun and it can be
helpful. We just don't think it's always automatically the right solution and that it needs to be
introduced in a thoughtful way. So we want phones out of school so kids can learn, they can spend time
with their friends, they can develop relationships with teachers, and that they have a chance to be
kids, right? You know, when you have FOMO all the time and you're spending every second wondering
about your snap streak and how many likes you have, you're not really concentrating on reading
and math, and unfortunately, we're seeing that in the nation's report card. So what made you
decide to use the words media addiction to define this problem as opposed to just saying kids spend
too much time on their phones? You know, that's a great question. And I think that,
one of the things that's been lost in our acquisition of products is that we have become a society
that's consumed with always getting the latest and the greatest and the new. And we love anything
that's got bells and whistles on it. Right. And so, you know, we're so addicted to media and
screens now that we have screens everywhere. We have them in our restaurants. We have them in our
bathrooms, we have them in our elevators, and actually the human brain is not wired to receive
a constant onslaught of information 24-7. And so it's not really surprising that adults who
consume too much media and information feel emotionally unhealthy. And it's not surprising at all
that children who are inundated with content and news and things that are scary and videos that
are violent and information about all the threats of the world are not emotionally well.
and that they're suffering. So, you know, calling it media addiction was a way of drawing attention
to both the individual addiction that we have from algorithms that are designed like slot machines
to give you continuous partial reinforcement and get the dopamine going so that you have to come back,
but also as a way to begin to talk about our collective society and what kind of society do we want?
Do we want a world where we let the machines and the apps and the screens, you know,
dominate everything or do we want to preserve what I think is so special and unique about being
human and that is our human relationships. Yeah. And it is, um, it's so much different, because a lot of
people compare this to say, well, there was a panic around TV, and, you know, that turned out okay. Which
I guess you could say, did it really turn out okay that we were all addicted to TV? Because those
are other, you know, screens too. But it is different. Like, my eldest loves to build.
Loves to build Legos, which is great.
It's like an off-screen activity.
But, you know, when he wants to watch a little YouTube,
he likes to watch, like, Lego building videos.
And so, like, put those on for a couple minutes.
And if I'm not there with him,
suddenly the algorithm on YouTube takes him from the Lego building,
and then he likes, like, the Lego building of the Titanic ship.
And then all of a sudden it takes him to videos of the Titanic sinking,
and then other ship sinking, and then planes crashing.
And I'm like, holy shit.
It was like five minutes.
And I came back, and suddenly he's asking me, like, scary questions about things that the algorithm just feeds them on YouTube.
And you realize that that is so different than just when we were kids sitting down in front of a TV program.
And you know what the TV show is.
And you understand what it's going to show your kids.
Like, you have no idea now what the screen's going to show them.
So, you know, I was a student of a guy named Neil Postman.
I don't know if you've heard of his work.
Big Neil Postman.
I saw that. You knew I was going to ask you about that.
I'm a big Postman fan
myself. Well, I don't think there's anyone who knew him or read him who wasn't a fan. But, you know,
there's not a lot of people like Neil. But, you know, the things you're talking about are the things
he wrote about in Amusing Ourselves to Death. And in a way, this is just amusing ourselves
to death taken to the furthest possible extreme. We've allowed our culture to really be diminished
in terms of not prioritizing complex, long-form journalism, in terms of letting soundbites rule. And now
you know, there's so much spectacle in politics that it really interferes with the substance of what
needs to happen to run a healthy republic. But, you know, in terms of the YouTube that you were
talking about, you know, there's a TED talk that you should look up by James Bridle. It's at least
a decade old now. But he talks about the nightmare of children's YouTube and shows you how few
videos it takes to go from Dora the Explorer or a Disney thing to the most twisted, you know,
disturbing, probably AI generated images that are exposing your kids to, you know, again,
like when I started Mama, I thought, wow, am I Tipper Gore? Like, am I coming out? Like, you know,
am I just so much more conservative than I thought? But I'm like, you know, I accept that at some point
my sons are going to be exposed to some pornography during high school.
You know, my husband had a friend and in high school, they took him below the bleachers and
they pulled out a Playboy and, you know, it was younger than you should be.
But I have a problem with big tech mainlining porn to our 12-year-olds.
And that's what's been happening.
The average first age of exposure to porn now is 12.
And many boys report seeing it at 10.
And let me tell you something.
you know, when your child is 12, you will know that they are not ready for that. And it's not
appropriate for them. And so, you know, we have allowed these big tech companies to operate their
properties in a real lawless way, even though in the history of this country, we have been regulating
mass media since the 1930s and the invention of radio. So it's really appalling that Congress has not
kept our laws up to date with the evolution of these technologies. And that's
another reason Mama was formed is to say enough is enough, you know, we have to make sure that safeguards are in place.
Offline is brought to you by Laundry Sauce. You ever use Laundry Sauce? Love it. It's great. It smells like
heaven. What would heaven smell like, do you think? Smells like Laundry Sauce. Do you think everybody
in heaven is pansexual? Good question. You think they even wear clothes?
Probably not. Probably don't need laundry.
Right.
But you know what?
For those of you down here,
you need Laundry Sauce.
The holidays are in full swing, celebrations, gatherings, cozy nights in, and all the chaos that comes with the season.
One thing that doesn't take a break, laundry.
Whether you're hosting family, heading out for parties, or just trying to keep up with the holiday rush, one thing's for sure, laundry doesn't quit.
The Laundry Sauce signature and essentials package is your secret weapon for making laundry feel just as festive as the rest of the season.
And this season, it gets even better with Himalayan Cashmere,
their first-ever hypoallergenic fragrance, created to be gentle on even the most
sensitive skin. With a subscription, you'll never run out. You can get any Laundry Sauce package
delivered on your schedule with flexible options and exclusive perks. No heavy bottles, no emergency
trips, just about everything you need exactly when you need it. Smell amazing. Stay stocked and
make this holiday season your best-smelling one yet. They even offer a full money-back guarantee.
If you don't get better smelling, cleaner laundry, you get a full refund, no questions asked.
For a limited time only, our listeners get 20% off your entire order when you use code offline
at LaundrySauce.com.
That's 20% off at LaundrySauce.com with promo code offline.
After you check out, they'll ask where you heard about them.
Don't forget to drop our name.
Trust us, your laundries never smelled this good.
You've said that Mama is modeled in many ways after MADD, Mothers Against Drunk Driving.
How is the fight against media addiction analogous to the fight against drunk driving
or the fight against smoking and big tobacco?
Great questions. I think that, like Mothers Against Drunk Driving, we are really standing up to a massive public health threat. And that's what this is. And we are making change in our homes, our communities, and across the nation. So, you know, we're not even two years old. We started in one Brooklyn living room. Now we have 35 chapters in 22 states. And we know that the change we need has to happen both at a policy level and at a cultural
level. And so that's what Mothers Against Drunk Driving did that was so impressive. They got us all to
think differently about drunk driving. I was born in the 70s. I grew up in the 80s. And I think that
the adults, you know, had a few drinks, took a ride. And if you tried to stop somebody, it was more
of an ego thing or a manly thing. I'm fine. I'm fine. And Mothers Against Drunk Driving helped us all
see that it was really a public health issue. And that if you witnessed somebody getting into a car
and you didn't do something, you were putting everybody at risk. And they helped, they changed the
policies. They raised the drinking age. They made it so that we understand bartenders are
responsible. If you see someone that's inebriated, you have to stop. You can't keep serving
them. And we want people to think about the entire media environment and the entire experience of
childhood and what we're allowing for our kids and creating for our kids. So that's what's similar
about it. And I think the analogy to big tobacco is even closer because with big tobacco,
you know, smoking was so entrenched in our society. I mean, people have forgotten. We used to have
smoking lounges in high schools, right? I mean, and you think about that now and you're like,
that's crazy. But we had that because, first of all, the product was so addictive. Second of all,
it was so commonly used. And third of all, the tobacco companies had taken so many different measures to
make smoking acceptable. They had gone to Hollywood. They had convinced them to change the scripts so that
all the sexy leading men and women were smokers. They had created Joe Camel. And so the undoing of that
required just a lot of work on multiple different levels. We had those huge hearings where the
tobacco executives were called up in front of Congress. We had public education campaigns. We had
lawsuits. There were research reports that came out. And that's the same sort of way that you're seeing
us mobilize and everyone in the tech reform space mobilize. There are hundreds of lawsuits brought by
thousands of families whose children have been victims of harms on social media. You are seeing
many, many films come out to educate people about the risks of these things. You are seeing
mommas and other people rise up to demand a change in what their schools are doing, what their
communities are doing. And then you're seeing policymakers take it into their own hands. And because
Congress has been so slow to act, we have state lawmakers really stepping up and taking measures
to protect kids. I thought a lot about why this is different than some of the other harms,
either to kids or to people in general. And, you know, like with cigarettes,
it is, you can at some point say, all right, well, this is addictive; you can draw the line to the actual nicotine, there's a chemical, and so there's an addiction there. With drunk driving,
you can say, okay, well, we can show, you know, biochemically that alcohol impairs you. And with screen addiction and media addiction, I wonder if sometimes people have a harder time understanding the addictive quality
because you think, well, it's my own choice.
I get to go read information.
And there is a healthy way to consume information
via screens and media via screens, right?
And it's also, that's why it's a little different
than the violent video games or violent movies,
like the Tipper Gore stuff from the '90s, right?
Because you're like, okay, well, that's content,
it's bad content, is it gonna have an effect?
We don't know.
But this is trickier because
it's not necessarily that the access to the information, the vast amount of information on the
internet, is the problem.
It is the algorithms themselves, which I wonder if it's just harder for people to sort of
perceive that this is addictive in a way because we're all subject to it.
I love how you said that.
I think you said it really well.
I think it is harder for people to perceive, and that's why it's been so long in coming
to get everyone to understand what
was happening. I think we ourselves as adults were addicted. I know I'm addicted. And that's part of the
problem. You know, so we're guilty of it. We're modeling that for our children. They're repeating it.
But because we can't see it and because it seems like we should be able to control it, you know,
I think our first instinct was to blame ourselves and, oh, it's our fault as parents for not limiting
our kids and we would tell them, okay, these are your limits and you have to get off. But that really
isn't fair because as one mother texted me, and again, this was another reason I came to be an
activist because after I wrote my story, so many people started reaching out to me for help.
But one mother texted me and said, I don't know what to do. I feel like my son is addicted
to cocaine that I gave him. And that is the helplessness that parents feel. But it's not something
that we can solve on an individual level, right? So, you know, we as parents have to feed our kids
healthy meals, but we don't expect every parent to keep the food supply safe.
Like, it's not your job to go test the baby formula at the drugstore, right?
We have a system in place to make it safe.
And that's why we know these products, because all tech is, is products, need to conform
to certain standards so that they're safe.
And, you know, one other thing, if I can just say a little more about what you said about
not perceiving, I mean, that's another thing that's so frightening about what's happened,
is that it's happened under our own roof.
So, you know, children can be in their own room and you think they're safe at home, but you don't know what they're looking at.
They could be sitting next to you and you don't know what they're looking at.
And we know what they're looking at, first of all, because the tech companies have begun to tell us.
So this last year, Meta put out a press release sort of touting how great it is and all the work that they're doing to limit suicide and self-harm content on their platforms.
And buried at the end of the press release, they acknowledged taking action on 12 million separate pieces of
suicide and self-harm content in a three-month period. So they were acknowledging
48 million separate pieces of suicide and self-harm content a year. And we've known for centuries that
suicide can be contagious, and that's not something you want to expose kids to. So, you know,
kids are just being exposed to all kinds of things. They're being exposed to it in private,
whereas if you're with your child watching a movie and you witness a car accident or something
traumatic or somebody gets killed, you know what they're seeing, they're going to see your reaction,
you can talk about it. They're viewing it privately, or they're viewing it privately next to a
birthday party photo or a cat video. They're also being told that it's not something bad. It's
just sort of normal for them. And with my own kids, my oldest son learned about the death by
suicide of a friend's sister over Instagram. And then a year later, he learned about the suicide
of a friend on his baseball team on Instagram. And when he told me, you know, I said, honey,
I'm so sorry. I'm so sorry about your friend. And I'm so sorry you learned this way on
Instagram. And he said, mom, that's just normal. And it kind of took my breath away. And I had to
explain, no, it's not normal to like learn about the death of your teenage friend through a social
media app. Yeah, because when you experience and learn about trauma like that, you need a human
around, especially if you're a child. You need a parent or a guardian or someone around who can
answer your questions, who can help you through, who can see if you're okay. And when you
experience these things alone, then, of course, that's going to have a profound effect that is
much more negative than if you hear about it with a parent around or someone who can help you
through it.
Offline is brought to you by Indacloud.
Indacloud is your online dispensary for better nights, smoother mornings, and
calmer holidays, now with zero sugar, zero calorie THC sodas, alongside your favorite
gummies and pre-rolls, all federally legal THC, DEA certified, lab tested, and shipped
discreetly, because you can't pour from an empty cup, but you can sip your sanity back.
Wow.
Wow.
That's beautiful.
Wow.
Really leaves you thinking about it.
So we like to get high.
There it is.
There's that speechwriter's touch.
Once in a while.
Yeah.
Take the edge off.
Absolutely.
You know?
Absolutely.
I've got to take that edge off.
Sand it.
Sand it down.
You get home.
It's been a long day.
Change your mental, you know, your frame.
You're trying to go to bed.
And you're like, I just, you know.
How can I go to bed?
My brain is filled with all this edge.
How do I get it off?
You know what?
Get the edge off.
Indacloud will do it.
Your boss wants one more quick call.
Not at Crooked Media.
Yeah, no.
Your family wants one more favor.
Yeah, that is very true.
Sodas for staying grounded,
gummies for shutting out stress,
and even sleep blends for ending the day right.
See, that's my deal.
That's where you're at.
Plus $70 ounces.
That's the only gift that shows up on time.
I do not think I need an ounce of anything.
This is how you end the year,
clear-headed, not wiped out.
If you're 21 or older, visit indecloud.com and use code offline for 25% off plus free shipping.
That's indecloud.com code offline, 25% off, free shipping and a better way to unwind.
Enjoy responsibly.
And thanks to Indacloud for making the reset part easy and a little more refreshing.
What is the most important win you guys have had so far?
Oh, gosh.
You know, I can't tell you how thrilling and humbling it has been to see all of these people come
forward to lead Mama chapters in their community. You know, 35 chapters, 22 states. We have helped
pass bills all over the country. An early win was in New York. We passed the SAFE for Kids Act,
or the Stop Addictive Feeds Exploitation Act,
and that made it so that social media companies
who have users in New York who are under 18
cannot make algorithmic design that's addictive.
It also prohibits them from sending overnight notifications
because we know from some of the social media platforms
that they have 12-year-olds, 13-year-olds who are online
at 1 a.m., 2 a.m., 3 a.m. on school nights
and lack of sleep, we know, is connected to depression.
So that was a big win.
We're very proud of the AI bill that Governor Newsom signed in California.
Again, like that type of legislation is really leading the way.
We're proud of what happened in Utah.
They passed the Digital Choice Act, which gives users control of their own data.
It provides the right to delete, which is an important privacy right, especially for children.
And I think I'm most proud this year, you know, when
they tried to put this clause in the one big beautiful bill that would have prohibited states
from regulating AI. We beat that back and the Senate voted it out 99 to 1. We and other members
of our coalition did it again recently when they tried to slip this provision into the
defense spending bill. We got it out. So that was great. But just yesterday, on December 11th, Trump issued an
executive order. I don't know. You want to talk about that? Yeah, no, we've been talking about this on
the show for a little bit as well. And he basically just, the provision that they've been trying
to insert into law, he just decided to do an EO. I don't think it has a ton of force. I mean,
basically what's going to have to happen is the Justice Department will then have to sue one of the
states, like California, that have passed this AI regulation. And look, I mean, I'm no lawyer here,
but it'd be one thing if the federal government said,
okay, well, it's interstate commerce and we can't have a patchwork of 50 different kinds of regulations.
The federal regulation has to supersede that, but they haven't done anything on a federal level,
at least legislatively.
So it's hard to say, we haven't done anything yet, but also you can't do anything while we're waiting to do something.
Yeah, I mean, you know, even before the EO was signed,
President Trump talked about protecting kids. And if they wanted to protect kids, they would
pass federal legislation. But they haven't. They won't. We'd like that to happen soon.
And, you know, no one asked for this, right? No one wants this. Every single credible poll shows
that Americans oppose limiting states' ability to regulate AI by roughly three to one.
So, you know, the push for this is coming from big tech and from big tech alone.
State attorneys general are pushing back.
Nearly 300 state legislators have formally opposed it.
And so far, 17 Republican governors have gone on the record against it.
So, you know, this is a major swing by Silicon Valley that's wildly out of step with the American public and even with the Republican base.
So, you know, we've already won this issue in Congress.
You cannot tell states what laws they can pass.
And I think the White House knows this.
And so I think the goal is just a chilling effect to try to slow down and not pass state laws because they're just afraid to get sued by David Sacks in the White House.
Well, I was going to ask, like, where have you met the most resistance and for what reasons?
Because it is maybe one of the only areas today where there's a lot of bipartisan support, and not just among, like, moderate Democrats and moderate Republicans,
but you get some pretty conservative Republicans, the Josh Hawley types,
with some pretty progressive Democrats who are joining forces on this,
and you see it passing in very blue states and very red states.
But, you know, there's obviously resistance, and there's big tech,
and they have a lot of money, and they have a lot of lobbyists.
But what arguments are they making to some legislators that, you know,
persuade them that regulations on tech and protecting kids isn't such a good idea?
Yeah. So we get almost no pushback, John. You know, I don't have to convince anybody to join Mama. I don't have to convince anybody that we need Mama. Everybody agrees that there's too much media addiction. These products are not safe for kids, and we have to fix it. With the exception of some tech companies that are making money hand over fist, and they're the ones that are spending tens of millions of dollars on their lobbying efforts to prevent legislation. And they do it sometimes in really
tricky, dirty ways. They go into states, and every time a bill is proposed, you know, they have the
same playbook now. They've used small business owners. They say, oh, you know, if this law passes,
it's going to, you know, affect your small business. They've used the LGBTQ community. They've made
the argument that if you pass a bill, you're going to somehow put trans lives at risk. They've even
said that, which is absurd. Yes, people from marginalized communities often find support online, and that
will continue, even if you have these safety regulations, because these regulations are not about the
whole internet. You're not getting rid of everything. They're just about the social media platforms in
particular, right? So if I had a trans child or a gay child, I would want them to find community,
but I would much rather them be in a chat group run by the Trevor Project or the Audre Lorde
Project and not administered by Mark Zuckerberg or one of these guys who really doesn't care about
their well-being. So we haven't seen, you know, any direct pushback. Nobody really disagreed
with our argument. But money is an issue. You know, with the Kids Online Safety Act last year,
it passed 99 to 1 in the Senate. It went over to the House. It had 64 co-sponsors, completely
bipartisan. And Speaker Mike Johnson refused to bring it up for a vote. He said it wasn't good
legislation. Steve Scalise said, oh, no, no, this could possibly stifle speech. His Republican
colleagues didn't understand. They asked him to explain himself. Then it was announced
that a $10 billion AI data processing plant was being built.
Guess where?
Louisiana.
So, you know, did that have something to do with it?
I don't know.
You know, and since that time, President Trump has discussed that the value of Meta's investment
in Louisiana, I think, is closer to $50 billion than $10 billion.
So, you know, the amounts of money here are just unfathomable.
It's really incredible.
but, you know, so they keep hiring lawyers, they keep filing lawsuits, you know, and I was in a room once with a Meta spokesperson who, you know, shared how deeply Meta cares about kids and wants safety measures in place. And I said, well, does that mean you're going to drop your lawsuit against Attorney General Rob Bonta, because they sued California to stop the implementation of the Age-Appropriate Design Code, which simply requires social media platforms to operate with a duty of care to children and make sure the platforms
are safe. So, you know, everybody in America, except a handful of people who are profiting wildly
from these companies, wants some safety measures in place. And the arguments that they use are
disingenuous, but the big one is free speech. They keep using the free speech argument.
Yeah, I was going to say what I've seen, not from tech companies, but I see from some,
I would say, libertarian-minded folks. You see some of this from ACLU
types and a few progressive journalists here and there. You know, you mentioned sort of marginalized
communities. Basically, they see social media as sort of a social lifeline and, you know,
they can't forge these connections at home for whatever reason. And so they're finding these
communities online. It's really important to them. And then I also hear about this fear of
censorship and surveillance. And if there's age
verification, or if there's digital ID or stuff like that, where basically kids are, and, you know, if it's
for adults too, people are giving more information over, and what if the government has that? And
do we really want these social media companies and platforms in the business of content moderation,
and is that a free speech issue, or who gets to decide? Um, I don't know, what are your
best arguments about all of those concerns? And look, I know that, I mean, I know that a lot of
this, some of this legislation, isn't necessarily about content moderation, but it's
about design and amplification, which is much different. But I'll let you, I'll let you take this
one. Okay, well, sure, I can do all that in 30 seconds. Take your time. Take your time.
It's okay. Well, you know, there's a few different issues here. So first of all, I think the language we use
is really important. So we call it social media, but it's actually mass media publishing. And in this
country, we have been regulating mass media since the invention of radio. You know, when radio came about,
we just come through World War I, we understood that if you gave an individual, especially a very
charismatic individual, this unprecedented power to speak to large audiences in an instant, that that could be
really dangerous. And that was why we capped how large a radio station could be, and how many
big radio stations and newspapers one person could own in the same town. Later, with television, we limited it so
you couldn't own a TV and a radio and a newspaper, because the media environment,
the information environment, shapes public perception and that affects our democracy. So to call
social media, something different than publishing, I think, has opened up all of these problems,
where if we just recognize that when you put something out in the world and you put something
out at scale, you have responsibility and accountability. And our historic institutions
of journalism spend a lot of money making sure that what they do is careful. I'm sure on your
podcast, you spend a lot of time and money to make sure that nothing gets said that would inadvertently
you know, defame somebody or libel somebody. And there's space in there for errors, right? Like,
as a journalist, I've made mistakes. I've misspelled people's name and we've run a correction.
But that's different than knowingly putting out false information. So, you know, I think we need to go
back to a model where we look at social media as published speech or advertiser-supported speech
because that's not the same as free speech. And the First Amendment gives us free speech, but it doesn't
give you the right to publish speech. It doesn't give you the right to broadcast speech or mass
mediated speech. And so we need to sort of think about it differently. And I think, you know,
in journalism, we have concerns about data privacy too. And it is less than ideal for these
companies to have all this information about our children. But we know they already have the
information. They've been collecting it. You know, Europe had GDPR and we were so far behind them and we
don't have anything similar. I know that created a million headaches for companies, but it also
protected basic human rights in law. And we don't have that in this country, right? You know,
there's nothing protecting us and your right to your privacy and your children's privacy. I think
that's why so many parents are upset that schools have introduced so much, quote-unquote,
ed tech into the classroom because that has collected your child's data and used it in ways
that parents often don't know about. But since these companies are already
collecting it, it really, I think, is disingenuous when they say, oh, you know, it's not good for kids
if we collect that data. I mean, give me a break. They're already collecting every time your daughter
posts a selfie and then deletes it real quick because she doesn't like it. They know that's the
perfect time to show her a beauty ad. And, you know, there's so many points of data they already have.
I think we have to accept that there's going to be some data collection and ensure that it's
regulated properly, like automatic delete, non-transfer to
other companies, all those kinds of ways of protecting data that other countries have already
figured out.
Offline is brought to you by Wild Alaskan Company.
Look, you have all kinds of challenges buying seafood.
You're like, ah, am I going to get it from this place?
Is it going to be good?
Right.
Is it going to be right?
Right.
And you know what?
You know when you don't have to worry about that?
When?
When you buy seafood from Wild Alaskan Company?
Oh, it's great.
It's the best way to get wild-caught, perfectly portioned, nutrient-dense seafood delivered directly to your door. Trust me, you haven't tasted fish this good. I love Wild Alaskan. Got all kinds of fish from there. Great salmon. Great salmon, great tuna. It's 100% wild-caught, never farmed. This means there are no antibiotics, GMOs, or additives, just clean, real fish that support healthy oceans and fishing communities. Their fish is frozen off the boat to lock in taste, texture, and nutrients like omega-3s. Wild-caught from Alaska, every order supports
sustainable harvesting practices, and your membership delivers
flexible shipments, expert tips, and truly feel good seafood.
If you're not completely satisfied with your first box, Wild Alaskan Company will give you a full
refund. No questions asked, no risk, just high quality seafood.
Not all fish are the same. Get seafood you can trust. Go to wildalaskan.com
slash offline for $35 off your first order. That's wildalaskan.com slash offline for $35 off your first order.
Thanks to Wild Alaskan Company for sponsoring this episode.
I assume you're a fan of Australia's new social media ban for kids under 16.
Do you think something like that would work here in the United States as well?
And are you guys pushing for something similar?
You know, I am a fan.
I think it was a huge mistake that we introduced social media to our children,
not only because of what they were exposed to
and how it makes them feel, but because of all the time that we've lost where they could have
been doing other more enriching things.
So, you know, in a purely libertarian society, we would let anybody smoke, but we've decided
as a society, no, like you have to be at least 18.
And I think that is a smart thing, and I would love to see us do that.
Could it happen in the United States?
I'm not sure.
I haven't seen the political will for that.
I think it's more likely that we will implement safety regulations first.
So we are, that's not our top priority right now to get it fully banned, but we certainly would like to see that.
What do you think are the reforms that are most politically viable in the U.S. right now?
Well, you know, again, the Kids Online Safety Act, or the version of the Kids Online Safety Act we had last year, we know it's politically viable.
It passed 99 to 1 in the Senate.
So again, you know, this is such a bipartisan issue.
I mean, this is like the last bipartisan issue in America.
basically everybody agrees. So it's really about not allowing big tech to dominate the narrative
with this idea that we need to win the race against China. Of course we need to innovate and of course
we need to lead. And I believe that the folks who work in tech in this country are so smart.
They can figure out how to do that while also keeping children safe. So I think there is a lot of
political will right now to get some things passed to protect kids. We're seeing at the state
level. We're seeing it at the federal level. It just needs to happen more quickly because we can't
repeat what we did with social media. There have been no laws to limit or regulate social media
since 1998, which is before social media existed. So it's crazy. Well, I always think about
sort of fighting the last war in terms of AI now. And I
wonder, like, what are your biggest concerns around artificial intelligence and the potential
harms to kids? You know, obviously, the first thing that comes to mind is the, you know, chatbots,
and I know there's been some legislation around trying to protect kids from, you know,
signing up for chatbots and having chatbot friends too young. But, so talk a little bit about
that. And then are there other concerns around AI that you guys are thinking about?
So, gosh, where do I even start about my concerns with AI?
You know, so I have so many concerns.
I mean, I'm concerned about, I don't even know where to start.
So, you know, I'm concerned about how quickly kids are adopting it.
I am concerned about how quickly adults are adopting it and not thinking about how that is affecting school.
I mean, you know, my kids all go to different schools, so I can talk about it without anybody knowing which one I'm talking
about. One of my children's schools has in their honor code that children are not allowed to use
AI for their homework, but the teachers have been giving them assignments with a message on the
bottom that says this was generated in part with AI. So talk about a mixed message. So my biggest fear
about AI and the rapid adoption of AI is how quickly it is causing us to behave in stupid ways.
You know, the invention of literacy, in my opinion, is one of the greatest achievements
of human civilization. And I think we often forget that the state of mass literacy hasn't been
around that long. It's only been a little more than 100 years that we've had 80% of the
population able to read at above a fifth grade level. And that's something I would like to
continue into the future. And if we are not prioritizing a culture of literacy, and it was the
culture of literacy that gave us written laws. It was the culture of literacy that allowed
science and medicine and all of these wonderful things. And what's been happening is we've been
eroding that. We've been eroding that with our visual culture, we've been eroding that with
short-form content, we've been eroding it with the diminishment of our attention spans. And we know,
you know, the adult attention span is now, like, less than a goldfish's; it's eight and a half
seconds. And what that means is people are reading less. They're not taking the time to think
through complex ideas that require reflection and depth.
And if we're not equipping our students to be able to do that, that really scares me for the
future.
So, you know, I guess if we had buckets of all my AI fears, there's, like, the bucket of immediate
harm to children who are developing intimate relationships, often with chatbots, and
are getting advice about how to harm themselves.
There are the medium-term fears of children developing relationships with AI and displacing important experiences in human-to-human relationships that they need, and important experiences with learning that they need to become critical thinkers and get the building blocks to be able to think and read and do math at higher levels.
And then there's the longer-term fears about how if we allow machines to just take over so many parts of our lives, what that does to us as a society, what that does to workers, what that does to the environment, you know, with these data centers that they're building in the middle of deserts that don't even have water.
So, yeah, there's plenty of fears to go around, John.
Yeah, I mean, for me, too, it's just like, I think that much like social media, you can be lulled into this sense that AI can, you know, provide a supplement or replace human interaction in a lot of different ways.
And I think the temptation there is that there's no friction with AI, right?
And that it tells you everything you need to know immediately, tells you what you want to hear.
it is obsequious, it is sycophantic. And so if children learn that anything they need, they can just
ask AI and they'll get it right away, and that there's no friction there, and they don't experience
the difficulty of navigating relationships and the awkwardness and all that kind of stuff, you do
lose something essential about the human experience. And that I worry about quite a bit.
I also think that all the economic incentives for the AI companies, much like the social media
companies, are to keep you using the product no matter what.
And to keep you using the product, that means it has to tell you what you want to hear,
give you everything you want, satisfy all of your desires, and not really care about the
consequences of doing so.
And it's the same with using tech in the classroom, right?
Like when these tools, when the calculator was introduced, people were afraid that, you know, it would diminish our math capabilities.
And by the way, look at what our math capabilities are as compared to when we first started measuring them, right?
Like, since the arrival of technology in classrooms, there's been this concomitant decline in reading and math scores.
And it's not rocket science.
We know that embodied experiences contribute to learning in a deeper and in a different
way. So, yeah, I mean, we have these really big questions in front of us about what kind of
future we want. And, you know, as a mom, as a human being, you know, I want a human-first future
where we continue to cherish and enjoy the experiences of being with one another that are not
mediated by machines or by, you know, any form of technology. Do I want a world with no technology?
No, that's not what I'm saying. I think, again, it can be really helpful.
It can be fun, and there's a lot of places for it.
But if our kids are spending 10, 12, 14 hours a day on screens, you know, what about the physical problem?
I mean, we've seen a global issue with myopia, with more people becoming nearsighted.
In Taiwan, they decided they were going to try and fix this, and they passed a law that every child had to go out and play for two hours.
And in a year, they solved the problem, right?
Because children's eyes need to focus at lots of different levels in order to build those muscles.
So, you know, the solutions here are actually quite simple.
We just have to implement them together.
Julie, thank you so much for joining and thank you for everything you're doing.
You know, I do talk about this all the time, but I do find hope in that the support for these reforms is so broad and deep across so many different
demographic groups and people across the political spectrum. And I do feel like we are at
an inflection point in the last couple years where the tide is turning against the big tech
companies. So I appreciate all you're doing to lead that fight. And thanks again for joining.
It was great talking to you. Yeah, you too. I mean, thanks for your interviews. I was catching up
on your podcast this week. And you've had so many great interviews on it. Well, thank you so much.
I appreciate it. All right. Take care. Thanks for having me.
Two quick housekeeping notes. In case you missed it,
Pod Save America is going down under for the Hopefully Just Visiting Tour 2026.
We're headed to Auckland, New Zealand on February 11th, and then three cities in Australia
after that, Melbourne on February 13th, Brisbane on February 14th, and Sydney on February 16th.
With everything going on in America these days, will we feel the pull of the Commonwealth
countries and decide to stay for good? You'll just have to join us to find out.
Tickets are on sale right now. For more details and to grab tickets, head to
crooked.com slash events. Also, the newest book from Crooked Media Reads is coming out on January
27th, 2026. It's called Hated by All the Right People: Tucker Carlson and the Unraveling of
the Conservative Mind. It's by one of our favorite political journalists, New York Times
Magazine writer Jason Zengerle. So, why a book about Tucker Carlson? And why now? Why not? Because
Tucker Carlson is everywhere right now. And the key to understanding our current political moment is the
rising value of moral outrage over truth. Of course, no one has done more to accelerate that
than Tucker Carlson. And in Hated by All the Right People, Jason Zengerle gives a fascinating,
informative look at Tucker's political evolution and how his rise traces the rise of the MAGA
movement. The book comes out January 27th, but if you pre-order a copy of Hated by All the Right
People now, you can get 15% off with the code Jason 15 at crooked.com slash books.
As always, if you have comments, questions, or guest ideas,
email us at offline at crooked.com,
and if you're as opinionated as we are,
please rate and review the show on your favorite podcast platform.
For ad-free episodes of Offline and Pod Save America,
exclusive content and more,
go to crooked.com slash friends to subscribe on Supercast,
Substack, YouTube, or Apple Podcasts.
If you like watching your podcast,
subscribe to the Offline with Jon Favreau YouTube channel.
Don't forget to follow Crooked Media on Instagram, TikTok,
and the other ones for original content, community events, and more.
Offline is a Crooked Media production.
It's written and hosted by me, Jon Favreau.
It's produced by Emma Illick-Frank.
Austin Fisher is our senior producer.
Adriene Hill is our head of news and politics.
Jerich Centeno is our sound editor and engineer.
Audio support from Kyle Seglin.
Jordan Katz and Kenny Siegel take care of our music.
Thanks to Delan Villanueva and our digital team
who film and share our episodes as videos every week.
Our production staff is proudly unionized
with the Writers Guild of America East.
