The Current - How smartphones took over our lives and how you can take back control
Episode Date: August 19, 2025
Got bad phone habits? We know being on our phones too much isn't good for us, especially for kids. Yet we all keep scrolling and scrolling and scrolling. Kaitlyn Regehr, author of Smartphone Nation, explains how our devices are designed to be addictive, and shares practical, effective tips for what parents can do to help kids build healthier habits.
Transcript
Guess who just bundled their home and auto with Desjardins Insurance?
Well, look at you, all grown up and saving money.
Yes, I am.
Mom told you to do it, didn't she?
Yes, she did.
Get insurance that's really big on care.
Switch and you could save up to 35% on home insurance when you bundle home and auto.
Desjardins Insurance, here for your home, auto, life, and business needs.
Certain conditions apply.
This is a CBC podcast.
Hello, I'm Matt Galloway, and this is The Current podcast.
On the mark, get set.
We're riding on the internet.
Cyberspace, set free.
Hello, virtual reality.
Interactive appetite, searching for a website,
a window to the world, got to get online.
Take the spin, now you're in with a techno set.
You're going surfing on the internet.
When the internet was brand new, this infomercial whetted families' appetites to take a spin.
Now that I've gotten on the internet, I'd rather be on my computer than doing just about anything.
It's really cool.
The internet gave us a whole world of exciting new possibilities.
So I guess this is a story of how it changed our lives.
Maybe it will yours too.
We installed the internet on our computer just a short time ago,
and I haven't been able to get the kids off it ever since.
Not only do they play the typical computer games that all the kids enjoy,
but their curiosity for learning has skyrocketed.
Having the internet in our home has had a great impact on our lives.
Little did they know what the impact would be on our lives three decades later,
but even then, in the 90s, the kids' guide carried a warning.
As a parent, I've never been happier than when my children ask their friends over for an internet computer party.
I'd like to add a word about safety, though.
You have to remember the Internet is not a regulated environment,
so the quality and accuracy of various informational offerings can differ quite a bit.
There may even be a concern if your children should access some of them.
So go online with them on the Net,
or set your Microsoft Internet Explorer browser to only accept G-rated sites.
The time of seeing the Internet as this place of endless benefits,
especially for kids, is long gone.
And today, that warning is more pressing than ever.
Kaitlyn Regehr is an associate professor at University College London,
specializing in digital literacy and the cultural impacts of social media.
She's also the author of the new book, Smartphone Nation: Why We're All Addicted to Our Screens and What You and Your Family Can Do About It.
Kaitlyn, good morning.
Good morning.
I know you've heard that infomercial a few times,
but what do you think when you hear it today?
I mean, thank you so much for playing that.
It's something that I open up my lectures at University College London with, because I think
it's so important to remember the altruism and the promise of the early internet and some of
the reasons why we just dove in head first and, kind of in a way we haven't with anything
else, decided to try it first before we test or regulate it. And, you know, I would
argue that we have now allowed this technology to, you know, move far, far too quickly. It's
become much more complex and sophisticated since those early days. And with that, we need a lot
more caution. Yeah, okay. We've got lots to talk about here this morning. But I am curious,
you have your own young children. So what is their relationship to screens? Yeah. So the guidance
that parents have been given for the last 10 years, really, is something called screen time, which
is this idea of controlling the dose to about one to two hours a day. And the thing about that
guidance is that it accounted for the need to be physical and that time on screens probably
meant you weren't moving around so much. But what it didn't account for was the mental health
implications. And so screen time has only accounted for quantity of consumption, but not quality of
consumption. And so when I'm having these conversations with my very young children, I'm not just
talking to them about the time spent on screens, which is still important, but also the quality
of the content that they are engaging with. Is it educational content? Is it regulated content? Is it
content that we can have a conversation about, that we can collectively view together? Or is it
content that closes them down? They are not on iPads. They are not on headphones. You know,
they are not on algorithmically driven platforms. And even from a really early age, I'm starting to
talk to my daughters about what healthy quality consumption looks like, not just quantity.
And you're also talking, I understand, about algorithms. You went into your child's
kindergarten class and talked about that. How did that go over? So I thought, you know,
what, I'm going to see if I can teach what an algorithm is to a kindergarten class. And I went
into my daughter's class and I brought in a cardboard box with a hole cut out of it and I put it
on one kid's head and I said, you are a screen. And then I had a piece of yarn and at the other
end of that yarn, I handed it to a little girl and I said to her, would you like to see a picture
of a cat or a dinosaur? And she chose cat. And so I ran a picture of a
cat along the string to her, and then she got another cat, and then she got another cat, and another cat, and
another cat, and then a scary cat. What I got all the kids to say was: algorithms start to make our
choices for us, and we should want to make our own choices. And if kindergarten-age children can
understand that, so can we all, and actually so should we all. I think we have a responsibility to.
And so if we can understand that at the age of five, and judging by what you've just told me, it's possible,
how can people control their algorithms, which I know you talk about a lot in the book
as a key piece of sort of regulating what you're fed?
I start with the premise that almost everything we consume is regulated.
The food we eat, the medication we take, the cars we drive.
We have consumer protections around these things.
We don't have the same types of consumer protections in the digital
space, in large part because we are not the consumers of tech. We are the product, or rather our
time and attention is the product which is being sold to advertisers. And what my work looks at
is the way in which through this attention economy, this need to hold people's attention,
hate, harm, and disinformation is often algorithmically prioritized, because disinformation, more
than truth, and harm, or things that hook into our most vulnerable points, will often hold us there
just a little bit longer. And it's that extra engagement, that extra attention, that extra time
that advertisers are paying for. What I then talk about is, first of all, you need to decide
whether you are happy with that. Are you happy just consuming whatever is fed to you by way of our
feeds? And if you're not, there are some things you can do.
The one thing I talk about is first figuring out that you yourself are in some kind of a silo.
I think we don't like to acknowledge that, that we are consuming often completely different content than the person sitting next to us on the bus or even our own sofa.
And so one thing I suggest is that we open up our most frequently used application, whatever that is, and scroll through a normative period of usage with our partner.
Now, if that feels super cringy, because to a lot of people it does, it feels horrifying.
It's worth considering why that is.
These are hyper, hyper-personalized forms of viewing.
It is not like watching television with our family.
It is not like going to the movie theater with the community.
So if your partner says to you, huh, you're getting a lot of ads for plastic surgery.
Is that the best thing for you?
And then you decide.
You decide what you want more of and what you want less of, and then you actively start to game your algorithm, or what some call practicing algorithmic resistance accordingly.
At what age do you start talking to your children about that, for instance?
Can teenagers, I mean, I'm sure they can understand, but will they follow through and game the system in such a way?
When I'm talking about gaming your algorithm, I'm talking about actively searching for things you want, quickly moving past things you
don't, and, you know, not engaging with things that make you feel bad. And then, you know, really
cleaning up your feed, or what I call a deep clean or a spring clean of your social media
applications, where you actively unfollow things you don't want to see. Now, yes, teenagers can do
this if they are already on social media. It's worth talking to them about it. One thing I will
say is that because these feeds are so personal, you need to really think about whether it's
too personal to ask your teenager to share this. Of course, you will learn a lot about an individual
based on what is being fed to them. And that has implications in terms of identity, in terms of
sexuality. So you have to decide what's right for your family in terms of this sharing. If it feels
too much, if you have an older teenager, you can also do a different exercise where you bring in
three pieces of content every week, one that made you feel good, one that made you feel bad,
one that made you question something, and you have a conversation about what you're seeing.
And all of this, what we're trying to do is to open up these silos so that it's not so hyper-personalized
and actually decide that what we see on screens is something that we talk about. It's not something
that we have to hide away up in our bedroom, under the covers, into the wee hours of night.
I think it's in those silos, too, and you raise this in the book, the concern about some of the
voices on gaming devices, for instance, that you'll meet people that you don't know and
they're suddenly talking about racist viewpoints that you weren't aware of and suddenly you are.
How can all of this become harmful when you're in a silo and you may not be acknowledging that
you're in that silo?
So what you're talking about is when these silos become echo chambers.
So, and that's when we're worried that people are microdosing on harm.
And so it's not about one post one time. It's about a continuous stream of content that starts to normalize certain ideas. And this is not just the case for teenagers. You know, adults are not immune to this. We are all moving into echo chambers. But you're right. It's also not just happening on phones. You brought up gaming. You know, if your kid is gaming on a headset where they're playing with other people, they can be talking to adults. There also can be geolocator devices on
games that actually show people where your kid is. And so we might not like that this harm is
out there in the same way we don't like that there are cars that, you know, our kid could get hit
by, but we do teach them to cross the street and look both ways. And we do educate ourselves
about those harms so that we can effectively help our kids. And that's, I think, where we need
to get to with this.
We are gathered here today to celebrate life's big milestones. Do you promise to
stand together through home purchases, auto
upgrades, and surprise dents and dings?
We do. To embrace life's
big moments for any adorable co-drivers
down the road. We do.
Then with the caring support of Desjardins
Insurance, I pronounce you covered for
home, auto, and flexible life insurance.
For life's big milestones,
get insurance that's really big on care
at Desjardins.com slash care.
On the 80th anniversary
of the liberation of Auschwitz
comes an unprecedented exhibition
about one of history's darkest moments.
Auschwitz, not long ago, not far away,
features more than 500 original objects,
first-hand accounts, and survivor testimonies
that tell the powerful story of the Auschwitz concentration camp,
its history and legacy,
and the underlying conditions that allowed the Holocaust to happen.
On now exclusively at ROM.
Tickets at ROM.ca.
Yeah, we're talking here
about kids who are unwittingly putting themselves in harm's way, potentially, by being in
these various situations. But also, I think parents sometimes unwittingly put their kids in
harm's way through something I know we now call sharenting. Can you tell me a little bit about
what that is? Yeah, so sharenting is the process through which parents share their children.
Often these discussions are centered around young people and the age in which they should or
should not get a smartphone. But a lot of the datafication of kids is happening much, much earlier.
And that goes from, you know, the very early birth announcement. And in fact, that child may have been
shown as an ultrasound scan or even a pregnancy, a positive pregnancy test before that. You know,
that is how early we've begun to share our kids. And it's important to think about what the
implications are for that. One is that if you have a public account and you are sharing your child
on that public account, there are some risks involved in that. I mean, there have been campaigns
about back-to-school photos where you take a picture of your kid and you say the school that
they go to every day with their name, and that gets posted publicly. There are, of course,
risks associated with that. But there are also risks for their adult self. So Barclays Bank has come
out to say that by the year 2030, two-thirds of identity fraud will be carried out against
children who have already been shared online. And so I think it's really important for us as parents,
before we place blame solely on teenagers and their usage, to actually start to think about
our own usage around our really young kids, and also to expand the frame of what
digital engagement looks like. It's not just about phones. It's also about tablets. We know that
somewhere between 80 and 90 percent of kids from the age of three are on YouTube, an algorithmically
driven platform that often sits outside of the social media debates. And that we are actually
programming kids to engage with short form, often low quality content from a very
early age. I have to just say this because I was reading your book and I read the section about
YouTube and I have twin seven-year-old boys and they go on kids YouTube all the time when we, you know,
when we say they can have some screen time and I talk to them about, you know, I would really
like you to watch something with a narrative. Well, what's a narrative? So we had that conversation and
then they were talking about maybe we can do YouTube Tuesday. So then, so now we're sort of thinking
about doing something. So I did have that chat with them. I am thrilled to hear that. You are starting to
engage them in critical digital literacy from a very early age. And that's fantastic. That's
where we want to get to. That is where we want to get to. You know, one thing I do think about a
lot, too, is, and this is a conversation, of course, I am not yet having with my children,
but how do you get into the conversation about online porn? Because that is, I think,
when you talk about a cringy thing, I think that's number one on the list. Okay. Let's talk about
porn. So unfortunately, we are all going to have to have that talk with our kids. And unfortunately,
it's a much more complicated talk than the one our parents had to have. We are not talking about
a rolled-up Playboy under the bed. We are talking about short-form, unfortunately often very
violent, quite extreme performances of sex. And this is also part of the attention economy when we talk
about things having to get more and more extreme in order to hold our attention. This is part of that.
And kids are using pornography as an educational resource, and there is a link between
pornography consumption and a rise in domestic abuse in teenage relationships. That is something
that we should be very concerned about. And so how do we have the conversation? Okay,
I like to talk about Marvel films. What
I suggest is that we say, this is a performance. Just like a Marvel character who shoots lasers out of their
eyes, which is a performance, this is a performance. And it is not representative of most loving,
consensual, happy, healthy relationships. And so it should not be used as an educational device.
And you might see this, and it might feel strange, and it might even be upsetting,
and we can absolutely talk about it. But no, it is not reflective of most normative relationships.
I want to just go back to some, you know, let's get off porn here for a minute because, of course,
there are a whole host of other reasons why being on a screen for too long and the quality of what
you're watching is important to think about too. But so as kids head back to school and, you know,
the discussion is often, you know, trying to get to bed on time. I know The Atlantic just had a piece
called "A Tech Rule That Will Future-Proof Your Kids," and that is: never let your kid take
their phone to bed.
But are there other tricks and tips you might offer parents and kids on what they can do
to ensure that they are managing their digital diet, as you call it?
I think that, and that's the work of Jonathan Haidt, who says that phones should be out
of the bedroom, and I think that that is a good one.
So that is this idea of moderation, and the moderation-of-use debate, which is around
bans, around this idea that phones shouldn't be in schools or in bedrooms, is a good one. It is
one prong. I talk about three prongs. It is one piece of the puzzle. So we all should be
moderating our usage. I think we should all be using less screens. In addition to that,
we need education and also regulation, robust regulation. And the education piece is something that we can
be doing within schools and we can advocate for more of that, but it's also something that we can
be doing in our home. And part of that is having these conversations. And the other part of it
is to actively just make decisions and make rules within your home around what healthy engagement
might look like. So you brought up the digital diet, which I talk about where I have a kind of
a food style pyramid, like a food guide, to help us think about what quality consumption is.
So things that are educational, things that are communication, and things that are not healthy
engagement, things that are passive. So generally, we're trying to be active participants
and not passive products. The other thing we can do is you can set it up so that your phone
flips onto grayscale when you walk through the door, when it connects to
your Wi-Fi. So grayscale is a really easy way to make your phone less sexy and seductive.
What is that? What does it do? Okay. So what you do is you go into settings. You'll be able to go
into color filters. You turn it onto grayscale, and it will take the color out of the phone.
Okay. And what will happen is you just will want to look at it less.
I can see that. And that is just one little thing you can do.
to make it less addictive. I mean, these are very addictive. They are built to be addictive,
these screens that we stroke, and through that stroking develop feelings of love and dependency.
You know, if you want to feel less dependent, one easy fix you can do right now, you can do it today,
is turn that phone onto grayscale, particularly when you're around your family.
Mm-hmm. What do you think, then, is the corporate or regulatory responsibility in this?
Yeah. So those are two different things. We talked a lot about this idea that we are not the consumers of tech, that advertisers are. And I think that we can be encouraging the brands that we love, that we support, to invest more ethically in tech, in ways that don't feed into this attention economy, and to say that there is some content that they, you know, don't want anything to do with, and that they can be more outspoken about this.
From a regulatory standpoint, most of the regulation has been around moderation.
That is to say, we clean up content after it's out there.
So you clean up a spill after someone's fallen and hurt themselves.
And what I would like to see is for regulation to be much more proactive rather than reactive.
Now, what we get generally when we talk about this is this free speech debate, right?
That that would infringe on the rights of the poster, and that one of the beautiful things about the internet is free speech.
I mean, lest we forget, two decades ago, if you wanted to say something publicly, your avenues to do that were very few.
You could go on Speaker's Corner, right?
I remember.
Yeah, that's right.
You could write a letter to the editor of a newspaper and hope
that they published it. But really, the opportunities for speech were very minimal.
And so when Elon Musk talks about this digital town square that he wants to facilitate,
that is what he is hooking into, this kind of beauty of the internet.
Here's the problem with that debate: the digital town square only works if everyone has an
equal place on that square. And that is less and less, you know,
true because algorithms are so much more sophisticated and they are becoming more sophisticated.
So the actual amount of content that we see is very, very minimal.
So when we talk about the rights of the publisher or free speech, which is generally the debate
against regulation, I like to say this.
I'm not interested in publication.
I think it's fine. If someone wants to publish something, fine. I'm interested in dissemination.
That's what I'm concerned about. I am not so concerned about whether someone has the right
to post their suicide journey, but rather whether Meta has the right to algorithmically offer
that suicide journey to a child. So I think what we should be focused on here is not a
right to publication, but rather a right to dissemination. And that's what we really need to get
to the bottom of is how content is fed algorithmically. And that's where we need to focus our
attention. Kaitlyn, we don't have much time left. But before we go, I just want to ask you,
your kids are young. And you've been researching this. You've written books and papers and you
know all about this stuff. But how concerned are you really for when your kids are teenagers, and
how they get through all of this, and how you help them get through this?
So can I say that there is growing public interest and awareness in this?
And there is power in that because ultimately, policy follows the public will.
And if the will is there, policy and brands and corporations will follow.
And so I think that there is a lot of hope there.
I think if enough of us decide that we are worth more than our eyeballs and our children's eyeballs on screens, there are possibilities for change.
It's a good note to end on.
Kaitlyn Regehr, thank you very much for this fascinating conversation.
Thank you.
Kaitlyn Regehr is an associate professor at University College London.
Her book is Smartphone Nation: Why We're All Addicted to Our Screens and What You and Your Family Can Do About It.
You've been listening to The Current podcast. My name is Matt Galloway.
Thanks for listening. I'll talk to you soon.
For more CBC podcasts, go to cbc.ca slash podcasts.