On Purpose with Jay Shetty - Addicted to Scrolling? 3 Small Changes to STOP Feeling Drained After Scrolling Social Media
Episode Date: December 5, 2025

What time of day do you scroll the most? Have you tried setting limits on your screen time? Today, Jay dives into one of the defining questions of our digital age: is the algorithm shaping who we become, or are we the ones quietly teaching it how to shape us? He reveals how every click, pause, and late-night scroll acts as a subtle signal, tiny instructions that train the system, which then turns around and begins to train us. Before we even realize it, our insecurities become fuel, our curiosity becomes comparison, and outrage becomes entertainment. But Jay also reminds us that we're not powerless; our agency hasn't disappeared, it's just buried beneath layers of habit. With calm, practical guidance, he shares how we can take our feed back into our own hands, break the doom-scroll cycle, and actually reprogram the digital environment influencing our minds. Whether it's choosing who you follow more intentionally, setting healthy boundaries in the morning, sharing more consciously, or reconnecting with real-world anchors, Jay shows that we're not just participants, we're contributors to how the system works. And when we change how we show up, everything around us begins to shift as well.

In this episode, you'll learn:
How to Retrain Your Algorithm in Minutes
How to Recognize When the Algorithm Is Steering You
How to Build a Healthier, Calmer Feed
How to Use Social Media Without Losing Yourself
How to Strengthen Your Digital Self-Control

You weren't meant to be overwhelmed by noise or pulled into constant comparison. You were built to create a life rooted in values, peace, and purpose. So take a breath, make one mindful choice at a time, and let it guide the next.

With Love and Gratitude,
Jay Shetty

Join over 750,000 people to receive my most transformative wisdom directly in your inbox every single week with my free newsletter. Subscribe here.
What We Discuss:
00:00 Intro
00:31 Even the Algorithm Has a Glitch
03:04 4 Subtle Ways the Algorithm Shapes You
07:59 How Your Clicks Create the Pattern
09:45 What a Social Network Looks Like Without All the Noise
13:08 Doom-Scrolling Can Give You Anxiety!
14:47 Solution #1: Bring Back Chronological Feeds
15:10 Solution #2: Take a Moment Before Hitting Share
16:06 Solution #3: Demand Algorithmic Transparency
16:29 Why Emotional Mastery and Critical Thinking Matter
19:11 5 Simple Ways to Reset Your For You Page

See omnystudio.com/listener for privacy information.
Transcript
Discussion (0)
This is an I-Heart podcast.
Guaranteed Human.
Hey, I'm Norah Jones,
and I love playing music with people so much
that my podcast called Playing Along is back.
I sit down with musicians from all musical styles
to play songs together in an intimate setting.
Over the past two seasons,
I've had special guests like Dave Grohl, Laufey, Rufus Wainwright,
Mavis Staples, really too many to name,
and there's still so much more to come in this new season.
Listen to Norah Jones Is Playing Along
on the IHeartRadio app, Apple Podcasts,
or wherever you get your podcasts.
Hi, friends.
Sophia Bush here, host of Work in Progress.
This week, we had such a special guest,
a mentor, a friend, a wife, a mother, an author,
attorney, advocate, television producer,
and now she adds podcast host to the list.
Michelle Obama is here.
Sophia, I'm beyond thrilled to be able to sit down and chat with you.
Listen to Work in Progress on America's number one podcast network, iHeart.
Open your free iHeartRadio app, search Work in Progress, and listen now.
Hey there, Dr. Jesse Mills here. I'm the director of the men's clinic at UCLA, and I want to tell you
about my new podcast called The Mail Room. And I'm Jordan, the show's producer. And like most guys,
I haven't been to the doctor in way too long. I'll be asking the questions we probably should
be asking, but aren't. Every week, we're breaking down the world of men's health from testosterone
and fitness to diets and fertility. We'll talk science without the jargon and get your real answers
to the stuff you actually wonder about.
So check out The Mail Room on the iHeartRadio app,
Apple Podcasts, or wherever you get your favorite shows.
Is our destiny coded in the algorithm?
If you feel addicted to social media,
this video is for you.
If you feel glued to whatever's on your feed
and can't stop doom scrolling,
this video is for you.
And if you're worried about how social media is rewiring your brain,
this video is for you.
Don't skip it.
The number one health and wellness podcast.
Jay Shetty.
Jay Shetty.
The one, the only.
Jay Shetty.
I wanted to start today saying one thing.
The algorithm isn't as smart as we think it is.
But the deeper I went into my research,
the more I realized something unsettling.
It's stronger than me, stronger than you,
stronger than all of us,
because it knows our weaknesses.
But here's what I also found.
Even the strongest system has a glitch.
The algorithm doesn't just know us, it depends on us.
And if we learn how it feeds, we can decide whether to starve it or steer it.
When you Google the words "will I ever," the first thing that comes up is "will I ever find love?"
The second is "will I ever be enough?"
And the third is "will.i.am net worth."
We go from love to worth to money really quickly.
This search for love, worth, and belonging is what the algorithm exploits.
But not in the way you think.
Picture this.
It's midnight.
Think of a girl named Amelia. She lies in bed, phone in her hand.
She posts a photo.
Nothing dramatic, just hoping someone notices.
The likes trickle in, her friends comment, she taps on another girl's profile.
Prettier, thinner, more followers.
She lingers, she clicks, she scrolls.
The algorithm pays attention.
The next night, her feed feels different.
More flawless faces, more filters, more diets, more lives that look nothing like hers.
Curiosity turns into comparison.
Comparison turns into obsession.
And soon every scroll feels like it's whispering the same three words.
You're not enough.
Until one night, she doesn't see herself anymore.
She only sees the mirror the algorithm is holding up to her.
This isn't just Amelia's story.
56% of girls feel they can't live up to the beauty standards they see on social media.
90% of girls follow at least one social media account that makes them feel less beautiful.
But here's the real question.
Did the algorithm build that mirror or did she?
Was it coded in Silicon Valley or coded in her own clicks?
Let's look at the algorithm first.
What do algorithms actually do?
Number one, they watch.
Every pause, every click, every like, every share, even how long you hover over a video or comment.
TikTok tracks watch time down to the second.
If you re-watch a clip, it's a super strong signal.
Number two, they predict.
Using your history and the behaviors of millions of people like you, algorithms predict,
what are you most likely to engage with next?
If people who watch fitness videos also tend to watch diet hacks,
you'll probably get diet hacks.
Number three, they amplify.
The posts that get more engagement, especially emotional engagement, are pushed to more people.
Number four, they adapt. Every click retrains the system. Your feed tomorrow is shaped by what you do today.
YouTube's recommendation engine is called a reinforcement system. It's literally designed to learn from your actions in real time.
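The four steps, watch, predict, amplify, adapt, can be compressed into a toy feedback loop. This is a hypothetical sketch, not any platform's real ranking code; the class, topic names, and weights are all invented for illustration:

```python
# Toy sketch of the watch / predict / amplify / adapt loop.
# Nothing here is a real platform's code; names and weights are invented.

from collections import defaultdict

class ToyFeed:
    def __init__(self):
        self.interest = defaultdict(float)   # topic -> what you've signaled
        self.engagement = defaultdict(int)   # topic -> what everyone engaged with

    def watch(self, topic, seconds, rewatched=False):
        # 1. Watch: every second of attention is a signal,
        # and a re-watch counts extra, as described for TikTok.
        self.interest[topic] += seconds * (2.0 if rewatched else 1.0)

    def record_engagement(self, topic):
        # 3. Amplify: engaged-with posts get pushed to more people.
        self.engagement[topic] += 1

    def rank(self, candidate_topics):
        # 2. Predict + 4. Adapt: tomorrow's feed is ranked by what you
        # did today, boosted by what people like you engaged with.
        return sorted(
            candidate_topics,
            key=lambda t: self.interest[t] + 0.1 * self.engagement[t],
            reverse=True,
        )

feed = ToyFeed()
feed.watch("fitness", 30)
feed.watch("diet-hacks", 45, rewatched=True)
feed.record_engagement("diet-hacks")
print(feed.rank(["news", "fitness", "diet-hacks"]))
# "diet-hacks" ranks first: 45 re-watched seconds outweigh 30 watched once
```

The point of the sketch is the loop itself: nothing in `rank` asks what is true or healthy, only what accumulated the most attention.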
The most accurate model is a cycle. One, we click what feels good, familiar, or emotionally hot.
Two, the algorithm learns and serves us more of that to keep us there.
Three, we become more entrenched and less exposed to alternatives.
And four, outrage and division spread faster because anger is more contagious.
In plain words, the algorithm isn't a mastermind. It's a machine that asks one question
over and over again.
What will keep you here the longest?
It's like a maximum security prison.
So how do we get trapped?
First, the nudge.
Think Netflix autoplay, TikTok Infinite Scroll.
The design that says, don't think, don't choose, just keep watching.
That's how you start a Korean baking show you didn't even know existed.
And three hours later, you're crying over a documentary on penguins.
A study found disabling autoplay led to a 17-minute shorter average session,
showing autoplay measurably extends watch time.
It's not a choice disappearing.
It's a choice so well hidden, you don't realize you never made it.
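The nudge is easy to picture in code. A hypothetical sketch, with invented video lengths, of why an autoplay default stretches a session:

```python
# Hypothetical sketch of the autoplay "nudge": the default keeps playing
# unless the viewer actively chooses. The 17-minute figure cited in the
# episode is a study average; the numbers here are invented.

def session_minutes(videos, autoplay=True, will_press_play=False):
    """Total watch time for a queue of video lengths (in minutes)."""
    total = 0
    for i, length in enumerate(videos):
        if i > 0 and not autoplay and not will_press_play:
            break  # autoplay off: the session ends unless you choose to continue
        total += length
    return total

queue = [12, 12, 12]
print(session_minutes(queue, autoplay=True))    # 36: the queue just keeps rolling
print(session_minutes(queue, autoplay=False))   # 12: each next video requires a choice
```

With autoplay on, ending the session takes an action; with it off, continuing does. That reversal of the default is the whole design trick.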
Second, the loop.
Yale researchers found that when people post moral outrage online
and people reward them with likes and retweets,
that person posts even more outrage the next time.
It's not the algorithm, it's us, it's real people.
As one researcher put it,
we don't just consume outrage, we start producing it
because outrage performs better than honesty.
And third, the push.
Mozilla's YouTube Regrets Project from 2020
found that volunteers who started with neutral searches
like fitness or politics
reported being steered
toward extremist
conspiratorial or misogynistic content.
71% of the videos people regretted watching
were never searched for.
They were recommended.
In the UCL and University of Kent algorithmic model study from 2024,
accounts on TikTok were shown four times more misogynistic content
on the For You page
within just five days of casual scrolling.
What does this do to men and women?
Women get more insecure about their appearance.
Men get more exposed to misogynistic content.
Women experience more anxiety and self-doubt.
Men become more lonely and disconnected.
Women compare their lives to others and feel they're falling behind.
Men compare their status to others and feel like they're being left behind.
Both end up in the same place on social media:
isolated, exhausted, and shaped by the same machine.
The algorithm will do anything to keep us glued.
There is a huge incentive issue for the algorithm,
because in one study where they chose not to show toxic posts,
users spent approximately 9% less time daily,
experienced fewer ad impressions, and generated fewer ad clicks.
The algorithm's goal is not to make us polarized.
It's not to make us happy, it's to make us addicted and glued to our screens.
It is showing you what people like you are engaging with, assuming you will stay as well.
We talked about what the algorithm does.
Let's look at what role we play.
Our clicks build the cage.
False news stories are 70% more likely to be retweeted than true stories are.
It also takes true stories about six times as long
to reach 1,500 people as it does for false stories to reach the same number of people.
Algorithms don't see truth or lies. They only see clicks from people like us.
Want to make a real difference this giving season? This December, On Purpose is part of Podcasts Fight Poverty,
podcasts teaming up to lift three villages in Rwanda out of extreme poverty.
We're doing it through GiveDirectly, which sends cash straight to families so they can
choose what they need most. Donate at givedirectly.org/onpurpose.
First-time gifts are matched, doubling your impact.
Our goal is $1 million by year's end, enough to lift 700 families out of poverty.
Join us at givedirectly.org/onpurpose.
Number two, false news spreads six times faster than true news
because shocking content sparks more clicks and shares from us,
so the algorithm promotes it further.
The content must already have emotional potency; an algorithm won't manufacture
depth or resonance from nothing. It can't make it go viral on its own. Number three, for major media outlets,
each additional negative affect word in a post is associated with a five to eight percent increase
in shares and retweets from us. And four, a Facebook study showed that even when given the opportunity,
users click links confirming their bias far more often than opposing ones. Liberals
chose cross-cutting news 21% of the time, conservatives 30% of the time. Here's the twist. The algorithm
doesn't pick sides. We do. It just learns our choice and builds a fortress around it. The danger
isn't that we have no choice. It's that we don't notice when our choices are being shaped
for us. So let's do a thought experiment. Why don't we create a social media platform without these
incentives, one that doesn't play these games with us. They already tried that. And what I'm
about to share with you shocked me the most. A new study out of the University of Amsterdam tested this
by creating a stripped-down social network. No ads, no recommendation algorithms, no invisible
hand pushing content. Researchers released 500 AI chatbots onto the platform, each powered by OpenAI
and gave them distinct political and social identities.
Then they let them loose.
Across five separate experiments, amounting to 10,000 interactions,
the bots began to behave exactly like us.
They followed those who thought like them,
they reposted the loudest, most extreme voices,
they gravitated into echo chambers,
not because an algorithm pushed them there,
but because that's where they chose to go.
It also found that users who posted
the most partisan content tended to get the most followers and reposts.
Researchers tried interventions: dampening virality, hiding follower counts, even boosting opposing
views. But nothing cracked the cycle. The most they managed was a 6% reduction in partisan
engagement. In some cases, when they started hiding user bios, the divide actually grew sharper
and the most extreme posts gained even more traction.
The implication is chilling.
Maybe it isn't just the algorithms that warp us.
Maybe social media itself is wired against our better nature.
Think about it like a fun house mirror.
It doesn't simply reflect who we are.
It stretches our fears, it magnifies our biases,
and turns our flaws into a spectacle.
As humans, we can live consciously or unconsciously.
We can choose our stronger selves or our weaker selves.
When we choose our weaker self, humans are not just curious.
We're programmed to measure ourselves against others.
Comparison is our oldest survival instinct.
Envy is the emotional fuel.
The algorithm didn't invent it, but it does exploit it.
When we're tired, overwhelmed, and exhausted, humans are not ruled by curiosity.
We're ruled by comparison, and envy is the price of admission.
The algorithm didn't create envy, it just turned it into an economy.
Now, why do we do this?
The first is negativity bias.
Evolution tuned us to notice threats more than opportunities.
Missing a berry was fine.
Missing a snake was fatal.
Number two, outrage is social currency.
Expressing outrage signals loyalty to your group.
It tells others, I'm one of us.
And in polarized contexts, this isn't just emotion, it's identity signaling.
Clicking rage is clicking belonging.
Number three, cognitive efficiency.
Negative content is often simpler.
This is bad, they're wrong, we're threatened.
The brain prefers cognitive ease over nuance.
Complex, balanced content demands more effort; negativity feels immediate, digestible, and actionable.
So what do we do about this? Doom-scrolling increases cortisol, anxiety, and learned helplessness.
In that state, people feel like they have no agency, which can reinforce the sense of doom.
So we have an incentive issue for the platforms, because they're just trying to keep us glued,
and we have a lack of mental resilience on our side.
Put those both together, that's what we're experiencing right now.
So what do we do about the incentive issue?
People often ask me if I think AI will ever have a soul.
And my response is, I don't know if AI will ever have a soul.
I just hope the people building AI have a soul.
The people who created these algorithms would lose millions
or billions if they adjusted the algorithm.
Would they do that?
Will they recognize or think they have a responsibility?
It's a really interesting thing to think about
because it's almost like we're making something
that is becoming us.
It's almost like Frankenstein, that idea
that whatever system we build has a part of us in it.
If you build a company, it has a part of you in it.
There's an energetic exchange as well.
So what does that feel like when you're building a platform that millions and billions of people
use? The truth is, we can't afford to just diagnose the problem, and I get intrigued by that sometimes
when people just want to diagnose the problem, but we need to find solutions. And here are three
changes social media companies could try. The first is platforms should offer chronological feeds
by default, not buried in settings, and give users transparent control to toggle between
chronological and algorithmic. Facebook's own studies show chronological feeds reduce political
polarization and misinformation exposure, though engagement does drop. The second thing they can do
is actually probably my favorite. Add friction before sharing. For example, read before retweet
prompts, share limits, cooling off periods on viral posts. Imagine you couldn't share something
until you had read it in full. Imagine you couldn't share something until you'd watched that video
in full.
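That kind of friction is simple to build. A hypothetical sketch of a share button that stays disabled until the article has been open for a minimum dwell time (the class and the 10-second threshold are invented, not any platform's real rule):

```python
# Hypothetical "friction before sharing" sketch: sharing is blocked
# until the article has been open long enough to plausibly be read.

import time

class Article:
    MIN_READ_SECONDS = 10  # invented threshold for illustration

    def __init__(self, title):
        self.title = title
        self.opened_at = None

    def open(self, now=None):
        # Record when the reader actually opened the article.
        self.opened_at = now if now is not None else time.time()

    def can_share(self, now=None):
        now = now if now is not None else time.time()
        # Never opened, or opened too recently: add friction, don't share.
        if self.opened_at is None:
            return False
        return (now - self.opened_at) >= self.MIN_READ_SECONDS

a = Article("Shocking headline")
print(a.can_share(now=0))    # False: sharing straight from the headline
a.open(now=0)
print(a.can_share(now=3))    # False: three seconds isn't reading
print(a.can_share(now=12))   # True: enough dwell time to have read it
```

The design choice is the same one behind read-before-retweet prompts: the cost of sharing unread content goes from zero to a few seconds, and that alone changes behavior.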
It's 5.23 p.m. One of your kids is asking for a snack. Another is building a fort out of your
clean laundry, and you're staring at a half-empty fridge and thinking, what are we even going
to eat tonight? Or you could just hello-fresh it. With over 80 recipes to choose from every week,
including kid-friendly ones, even for picky eaters, you'll get fresh ingredients and easy step-by-step
recipes delivered right to your door. No last-minute grocery runs. No, what do we even have
fridge staring? And the best part, you're in total control. Skip a week, pause anytime, pick what
works for you. It's dinner on your terms. The kids can even help you cook. Yeah, it's going to be
messy. But somehow, they tend to eat the vegetables they made themselves. Try HelloFresh today and get
50% off the first box with free shipping. Go to HelloFresh.ca and use promo code
MOM50. That's HelloFresh.ca, promo code MOM50. HelloFresh, Canada's
number one meal kit delivery service. Hey, I'm Kelly and some of you may know me as Laura Winslow.
I'm Telma, also known as Aunt Rachel.
If those names ring a bell, then you probably are familiar with the show that we were both on back in the 90s called Family Matters.
Kelly and I have done a lot of things and played a lot of roles over the years, but both of us are just so proud to have been part of Family Matters.
Did you know that we were one of the longest-running sitcoms with a Black cast?
When we were making the show, there were so many moments filled with joy and laughter and cutting up that I will never forget.
Oh, girl, you got that right.
The look that you all give me is so Black.
All Black people know about the look.
On each episode of Welcome to the Family,
we'll share personal reflections about making the show.
Yeah, we'll even bring in part of the cast
and some other special guests to join in the fun and spill some tea.
Listen to Welcome to the Family with Telma and Kelly
on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.
A combat surgeon with secrets, a world built on power and privilege, and the most unexpected creative duo of the year.
As an actor for so many years, I would always walk into other people's stories. And I thought,
well, why don't I give it a shot, you know, and try writing it myself. This week,
Bookmarked by Reese's Book Club goes live from Apple SoHo in New York City with Reese Witherspoon
and Harlan Coben, the powerhouse team behind Gone Before Goodbye. Now a New York Times bestseller.
I think we both knew right away that this was going to happen.
It's a conversation about fear, ambition,
and what happens when two master storytellers collide.
I've never seen a woman in kind of a James Bond world.
Come for the chills and stay for the surprises.
And find out why readers can't put it down.
Listen to Bookmarked by Reese's Book Club on the IHeart Radio app,
Apple Podcasts, or wherever you get your podcasts.
Twitter's 2020
Read Before Retweet Experiment
led to a 40% increase
in people opening articles before sharing.
WhatsApp's forwarding limits
dramatically slowed misinformation in India.
This could actually make a difference
because not only are we misinforming others,
we're under-informed ourselves.
If you're retweeting something just based on the headline
and have no idea what's inside of it,
you're now propelling ideas that you don't fully grasp and understand.
And number three, require algorithmic transparency and independent audits.
Companies must publish how recommendation systems prioritize content
and allow external researchers to study the impacts.
The EU's Digital Services Act is already moving this way,
requiring large platforms to open their algorithms to scrutiny.
Now, what do we do about the human nature issue?
I want to share with you one of my favorite stories.
A student once asked the Buddha,
what do you gain from meditation?
He said nothing.
The student asked, then why do you meditate if you gain nothing?
The Buddha replied,
I don't meditate because of what I gain.
I meditate because of what I lose.
I lose anger, I lose envy, I lose ego.
If the algorithm is made of us, then changing it doesn't start with code.
It starts with character.
We have to remember that we are wired for generosity, but educated for greed.
When will we finally start teaching emotional mastery in schools?
How long before we start teaching critical thinking at an early age?
Maybe the real test isn't to build a happier network, it's to build happier users.
We built a machine to know us, and it became us.
When we started on purpose, there were only three things that went viral.
Cats and dogs, sorry to put them in the same group, babies, and people taking their clothes off.
I had the innocent intention, the naive vision of making wisdom go viral.
Today, we do over 500 million views across platforms every month.
Not playing into rage bait, not trying to make people angry.
What does it show me?
It shows me that people will choose healthier options if they're available,
if it's presented to them in a digestible way.
People will choose a salad if they know why it's better for them,
and if it's available and has a great dressing. It's our role to not play into the
fear and find ways to make love more attractive and accessible. It's so easy to sell fear. It's so
easy to sell negativity. It's so easy to sell gossip. But the truth is, why sell the things
that sell people short? Why not provide them with alternatives that are healthy, strengthening, empowering,
that give them the tools to make a difference in their lives?
Here's the good news.
Algorithms do not fully decide your fate.
They're predictive, not deterministic.
They rely on your past clicks,
but you can override them by searching,
subscribing to diverse sources
and consciously engaging with content outside of your bubble.
So I want you to take a look at a new account I started, and its For You page.
The For You page is pretty simple.
It's beautiful imagery, it's scenery, and as I scroll down,
you start to see more of what the average person would see.
The For You page, as you go deeper, shows you everything from political podcasts
to people working out to influencer content.
Now I'm going to show you how easy it is to change your For You page.
Because this page is so visual, I'm going to do it through finding quotes.
And also, you know, I love quotes.
So I'm going to go follow some quotes.
I'm going to like some quotes.
I'm going to like another quote.
I'm going to hover over it for a while.
This is really important to actually hover over the quote,
to actually read it, to actually be present with it.
And now I'm even going to share a quote with a friend
who's now going to think they have an issue
because I just shared some wisdom with them.
When I refresh and check my For You page,
it's pretty much all quotes.
Through three to four simple steps, I transformed my For You page. This is almost a cleansing,
filtering process that I recommend you do. It's simple. I want you to follow five people
you wouldn't usually follow. Agency isn't eliminated. It's eroded by habit. People who
intentionally curate their feeds, limit usage or diversify inputs show significantly less
polarization. The second thing I want you to do is hover over and comment on five pieces of
content you want to see more of. Your offline life still matters. Real books, real conversations
and communities can counteract the digital echo chamber. And number three, I want you to share
five pieces of content you wouldn't usually share, and see how that changes your algorithm.
Number four, don't look at your phone first thing in the morning. It's like letting
100 strangers walk into your bedroom before you've brushed your teeth or washed your face.
You would never do that in real life. Don't do it online. And five, be present with joy.
Celebrate your friends' wins and accomplishments. Stop overreacting to negativity and underreacting
to joy. We remember the bad times more than the good times because when we lose,
we cry for a month, and when we win, we celebrate for a night.
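The five resets above all work the same way: every interaction is a weighted vote for more of a topic. Here is a toy sketch of that idea, assuming invented signal weights (no platform publishes its real ones):

```python
# Toy sketch of retraining a feed by voting with interactions.
# The signal weights and topic names are invented for illustration.

from collections import defaultdict

# Stronger interactions count as stronger votes (invented ordering).
WEIGHTS = {"like": 1, "hover": 2, "comment": 3, "share": 5}

def retrain(feed_scores, interactions):
    """Each (signal, topic) pair nudges that topic's score upward."""
    scores = defaultdict(float, feed_scores)
    for signal, topic in interactions:
        scores[topic] += WEIGHTS[signal]
    return dict(scores)

# Start from a feed dominated by outrage, then interact with quotes,
# the way the episode's For You page demonstration does.
scores = retrain(
    {"outrage": 10.0},
    [("like", "quotes"), ("hover", "quotes"),
     ("comment", "quotes"), ("share", "quotes"),
     ("like", "quotes")],
)
top = max(scores, key=scores.get)
print(top, scores)
```

With just five small interactions, the invented "quotes" topic overtakes a feed that started out dominated by outrage, which is exactly the cleansing effect described above.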
Here's what I want you to remember.
When you like something, you're telling the algorithm, show me more of this.
When you hover over something, you're saying to the algorithm, I pay attention when you show me this.
When you comment on something, you're saying, this is really important to me.
And when you share it off the platform, you're saying, fill my feed
with this. You're co-creating your algorithm. You're actually coding it. One of my favorite thoughts
comes from F. Scott Fitzgerald. He said, the test of a first-rate intelligence is the ability to
hold two opposed ideas in the mind at the same time and still retain the ability to function.
One should, for example, he said, be able to see that things are hopeless and yet be determined
to make them otherwise. That second part is so needed right now. That's what our stories need:
accepting that things are tough, things are really hard, and at the same time reminding each other
that you can make a change, you can transform your life, you can take accountability, you can
take action, you do have agency; reminding the world that extraordinary things have always been
achieved by a group of ordinary people. I'll leave you with this. Imagine you walk into a party.
At first, it looks fun, people laughing, music playing, stories being told. But then you notice
something strange. Everywhere you turn, someone's doing better than you. Someone richer, someone
prettier, someone with more friends, more followers, more success. You walk into another room,
and this one feels worse. The room is full of arguments. Everyone's shouting, no one's listening.
And the louder and angrier someone is, the bigger the crowd around them. That's when it hits you.
You never chose to come to this room. You were invited by the algorithm. That's the cruel genius
of social media. It doesn't force us into comparison. It discovers we're already
drawn to it. It doesn't create division. It learns that anger holds our gaze longer than
joy. The algorithm didn't create outrage. It turned outrage into entertainment. And here's the
question only you can answer. When you pick up your phone tonight, are you walking back
into that same party? Or will you finally leave? Thank you for listening. I hope you've subscribed.
Share this episode with someone who needs to hear it.
And remember, I'm forever in your corner, and I'm always rooting for you.
If you love this episode, you will also love my interview with Charles Duhigg
on how to hack your brain, change any habit effortlessly, and the secret to making better decisions.
Look, am I hesitating on this because I'm scared of making the choice because I'm scared of doing the work?
Or am I sitting with this because it just doesn't feel right yet?
This week on Dear Chelsea with me, Chelsea Handler, Nicholas Sparks is here.
I would imagine that you've gotten a lot of feedback about setting
a standard of romance that a lot of men can't measure up to. I have heard stories. At the same time,
I've had seven marriage proposals in lines to sign my book. Really? Got up to the table,
dude dropped to his knees. And I'm like, dude, you're in a Walmart in Birmingham, Alabama, you know.
Listen to Dear Chelsea on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.
What do you get when you mix 1950s Hollywood, a Cuban musician with a dream,
and one of the most iconic sitcoms of all time?
You get Desi Arnaz.
On the podcast Starring Desi Arnaz and Wilmer Valderrama,
I'll take you on a journey through Desi's life,
how he redefined American television and what that meant for all of us watching from the sidelines,
waiting for a face like ours on screen.
Listen to Starring Desi Arnaz and Wilmer Valderrama on the iHeartRadio app,
Apple Podcasts, or wherever you get your podcasts.
What happens when Reese Witherspoon calls up the king of thrillers,
Harlan Coben, and says,
let's write a book together.
I was asking him basically to let me into his secret thriller writing world.
This week, bookmarked by Reese's Book Club goes live from Apple Soho in New York City
for the ultimate storytelling mashup.
Reese Witherspoon and Harlan Coben on their new thriller, Gone Before Goodbye.
You think you're going to read for 10 minutes,
and next thing you know, it's four in the morning.
Get the story behind the season's most addictive read, already a New York Times bestseller.
Listen to Bookmarked by Reese's Book Club on the IHeart Radio app, Apple Podcasts, or wherever you get your podcasts.
