Deep Questions with Cal Newport - Ep. 371: Is it Finally Time to Leave Social Media?
Episode Date: September 22, 2025

We seem to be stuck in a purgatory when it comes to the worst of the social media platforms. We know they're not great, but it's hard to muster enough motivation to bother to leave. In today's episode, motivated by the assassination of Charlie Kirk, Cal puts on his technology critic hat and offers a step-by-step technical explanation for why these platforms are strictly and incessantly making your life worse and why the only reasonable response is to quit them. He then answers listener questions on the topic and reacts to some recent news about the US Congress getting involved.

Below are the questions covered in today's episode (with their timestamps). Get your questions answered by Cal! Here's the link: bit.ly/3U3sTvo

Video from today's episode: youtube.com/calnewportmedia

Deep Dive: Is it Finally Time to Leave Social Media? [0:40]
How do I find everything I should be blocking on my kid's phone? [38:59]
Is it okay that I make YouTube Videos? [43:39]
Shouldn't we focus on using social media better? [49:13]
CASE STUDY: A musician and social media [58:00]
CAL REACTS: US House panel asks internet CEOs to testify after Charlie Kirk assassination [1:10:42]

Links:
Buy Cal's latest book, "Slow Productivity," at calnewport.com/slow
Get a signed copy of Cal's "Slow Productivity" at peoplesbooktakoma.com/event/cal-newport/
Cal's monthly book directory: bramses.notion.site/059db2641def4a88988b4d2cee4657ba?
reuters.com/world/us/us-house-panel-asks-online-forum-ceos-testify-after-charlie-kirk-assassination-2025-09-17/
youtube.com/watch?v=gjVm0K_ETX8

Thanks to our Sponsors:
calderalab.com/deep
miro.com
grammarly.com/podcast
1password.com/deep

Thanks to Jesse Miller for production, Jay Kerstens for the intro music, and Mark Miles for mastering. Hosted by Simplecast, an AdsWizz company. See pcm.adswizz.com for information about our collection and use of personal data for advertising.
Transcript
Two days after the assassination of Charlie Kirk, I sent a short essay to my newsletter.
I wrote it in about 15 minutes, but it was about as raw or as pointed as I ever get.
The essay started like this.
Those of us who study online culture like to use the phrase, "Twitter is not real life."
But as we saw yet again this week, when the digital discourses fostered on services like Twitter or Bluesky or TikTok do intersect with the real world,
whether they originate from the left or the right, the results are often horrific.
This should tell us all we need to know about these platforms.
They are toxic and dehumanizing.
They are responsible as much as any other force for the unraveling of civil society that seems to be accelerating.
All right. Then later in the essay, I reached the following conclusion.
After troubling national events, there's often a public conversation about the appropriate way to respond.
Well, here's one option to consider.
quit using these social platforms.
Find other ways to keep up with the news or spread ideas or be entertained.
Be a responsible grown-up who does useful things,
someone who serves real people in the real world.
In the over two decades or so that I've been writing that newsletter,
I don't think I've ever received as large or as positive of a response as I did for that post.
I'm at something like 200 email replies and counting, and nearly 100 comments.
The vast majority of this feedback is saying the same thing.
Thank you.
This was the final push I needed to finally quit these apps.
Now, that's got me thinking.
Our biggest problem with social media right now is not convincing people that it's bad.
People know this.
People don't love these platforms.
They don't defend them the way they used to.
But they're stuck in a sort of purgatory where they know it's bad, but they're not sure exactly why,
or whether it's something that maybe could be fixed or maybe something that's worse for other people than it is for them.
so they stay where they are.
They stay in the status quo
waiting for someone to push them
into a different pattern.
Well, today I want to provide that push.
I'm going to put on my technocratic
professor hat and provide a precise
and detailed explanation of
how exactly the worst of these platforms
impact us and why these problems
are fundamental to these technologies
and not something that we can easily fix.
I want to give you the ironclad case
for why the most reasonable response for most
people is to do what I recommend in my essay: to quit for good.
As always, I'm Cal Newport, and this is Deep Questions.
Today's episode, The Case for Quitting Social Media.
All right, so we need to start our conversation here by defining some terms.
The phrase social media can mean a lot of things, and that creates confusion, because
you might have one thing in mind while someone else has something else in mind.
So let's get really specific.
Today, I'm going to be referring to a specific class of social apps that I call curated conversation platforms.
So these are platforms in which you have a lot of people joining in conversation, be it through text, images, or video, using curation algorithms based on optimizing engagement to help select what each individual user sees.
So some obvious examples of curated conversation platforms include Twitter, Bluesky, Threads, and Facebook.
Those are classic.
There's other things that people call social media that don't fit into this particular category.
So probably like Pinterest doesn't fall into there.
It's not a conversation platform so much as you're looking at things that people posted.
Snapchat doesn't fall into here.
It's a conversation platform, but it's not algorithmically curated.
It's you talking to groups of other people.
WhatsApp wouldn't fall in here either.
Again, it's not algorithmically curated.
It's conversation, but not a massive curated platform.
And then there's some platforms where, depending on how you use them, could fall into this
category or not. So I think things like Instagram or YouTube, depending on how you use them,
could fall into the category of what we're talking about today or not.
So just to be clear about our definitions.
Now, I want to focus on curated conversation platforms in particular because I think they lead
to uniquely awful harms.
I also think they are the social platforms most responsible for the rise in violence that we've seen recently, be it political or otherwise.
All right. So let's talk about these curated conversation platforms. So what are the harms?
If you ask people, hey, what's wrong with, like, Twitter, Bluesky, or Threads? Like, what are the
harms that we care about? They'll eventually probably list some combination of the following
three things. I'll put these up on the screen for those who are watching instead of just listening.
They're probably going to say, okay, distraction. That's a common harm. This is the one I talk about a lot in
my books on this show. It's the idea that there's an addictive element to these platforms so that we
use them more than is healthy and to the detriment of other things that are important
in our life. Another harm people will often mention if you talk about these platforms is demoderation.
This is where you begin to lose any sense of moderation around a given issue. It's where you become
increasingly strident, see less room for good faith or even basic humanity on the other side of the
issue in question. There's a lot of tribalism when it comes to demoderation. You become part of a tribe
that's fighting some sort of epic fight.
So because of these stakes,
there's a lot of sort of furious policing
of your own tribe.
The third harm we often hear about these things
is what I call disassociation.
This is where things really get scary.
This is where you begin to make a complete break
from normal associations with human community.
You lose connection to normative ethical constraints.
This is the state in which violence to others
or to yourself becomes a possibility
or where you become numbed
to seeing such violence being perpetrated.
against others.
There's really two major flavors of disassociation.
There's one that's based around overwhelming rage.
Well, I am so mad, I have to take action regardless of the consequences.
The other flavor of disassociation is nihilistic withdrawal.
The idea that the world is so dark and unredeemable and meaningless that you might as well cause some chaos on the way out.
This nihilistic variation of disassociation is one that we've really been feeling the consequences of recently.
Earlier this year, the FBI began using a new term when talking about mass violence incidents.
They call it nihilistic violent extremism.
It's become so common to have violence caused by a sense of nihilism that there's a new term for it that the FBI uses.
Part of the issue with nihilistic withdrawal is that if it is online platforms that have driven you to disassociation, you begin to think,
oh, all that matters
is attention and respect
on online platforms.
Here's Derek Thompson writing in an Atlantic piece
that he wrote a couple years ago.
For many socially isolated men in particular,
for whom reality consists primarily of glowing screens and empty rooms,
a vote for destruction is a politics of last resort,
a way to leave one's mark on a world
where collective progress or collective support of any kind feels impossible.
Right?
So online disassociation can lead to violence because you're like, why not?
And the only thing that really matters is views, and this will get me attention.
All right.
So these are the harms that people would normally talk about.
If you then say to someone,
all right, given these harms of curated conversation platforms,
why are you still using them?
Why don't more people quit?
Well, this is because if you ask people how these harms affect them,
they're going to give an answer that looks something like this.
And Jesse, we can take this off the screen for now.
They'll say something like,
I may suffer a bit from distraction.
Like most people will cop to that.
Yeah, I use these more than it's useful,
but I could probably get that under control with, you know,
some better habits or I can break my phone or something like that.
Then they'll say typically, yeah, the demoderation thing,
that is a problem, not really for me, but for the other tribes.
Like whatever groups I'm affiliated with, like, we're just being reasonable.
We're reasonably upset and we're kind of right.
But the other, I get it.
Other people on the other side, they have problems with demoderation.
Probably the fix for that is depending on what side of the political
spectrum I'm on. Maybe it's something about more
aggressive content moderation. So if they don't see
the crazy stuff, they won't act so demoderated.
Or maybe it's community notes.
Like, well, if we could just correct things that are
false, then people would be a little bit more reasonable. But that's
kind of an other-person problem. And then
disassociation, well, people would say, yeah,
that's scary for sure. Like, look at these
nihilistic extremists. That's
scary. But that's not even
really happening on the platforms I use.
Like, when I think about, whenever I hear about
these mass shooters, you always
hear about them being on, like,
Rumble or Kick, or on some
sort of, like, Groyper server
on Discord, these sort of weird
semi-dark-web-type
platforms. So maybe we should
shut those down, but what does that have to do with
me? That's sort of
how people respond to these harms.
And I think that response goes a long way
to explaining why we're
sort of stuck, why more people aren't making
major changes about the role of these technologies
in their lives.
But here's the thing. Is that
interpretation of these harms right?
I don't think it is.
What I want to do here is outline a different way of understanding how these curated
conversation platforms hurt you.
And it's going to be a way of understanding them that is not going to let you off the
hook so easily.
All right.
I'm going to do some drawing here.
So beware.
If you're listening, you might want to switch over to the YouTube version to see my brilliant art.
All right.
So what I have up here on the screen is the three harms we talked about.
distraction, demoderation, and disassociation.
The right way to think about how we interact with these
is to imagine that there is a slope.
So I'm going to start drawing a slope.
I'm going to start. Let me get my artist hat on.
Starting with distraction, we start sloping downhill.
And then we'll switch over to demoderation.
That slope continues.
then finally we get to disassociation
that slope continues
and I'm going to draw here
a sort of dividing line
between
you know label this here
mainstream
oh look at that handwriting Jesse
mainstream internet on top
dark web below
all right so let me explain this beautiful picture
I just drew. It's pretty colorful actually
I give myself a lot of credit for that
it's a slope
sloping downhill
it starts in the distraction zone
then the slope goes through the demoderation
zone and then finally the slope
goes through the disassociation zone
I've drawn a ground line across this
slope diagram, so right around the place
we go from demoderation to disassociation,
the slope goes underground
above ground is like mainstream internet
below ground now we're in the world
sort of of the dark web
so let me explain my model
here. When you first start using
these type of curated conversation platforms,
it's pretty easy to fall onto
this slope that is into the zone of
distraction. So you sort of fall onto the top
of the slope where you're like, you know what, I'm using this
probably too much. I'm using this a lot.
Whatever my platform poison
of choice is, I'm kind of on this more than is healthy, right?
Like, oh, this is not good. You know me.
I'm addicted to my phone. So we're kind of in that
territory. Here's the thing, though.
You start sliding.
And if you're in this distraction zone long enough,
gravity is going to pull you down into the demoderation portion of the slope.
And now you're not just using these services too much.
You're demoderating.
You're losing your ability to see the other side in any sort of good faith.
You find yourself in dehumanizing tendencies.
I just hate those people.
I can't stand anyone who supports this person.
I just can't even imagine that.
They are wrong.
And wait a second, wait a second.
You're on my team.
You said the wrong thing there. Don't you realize we're at war here? We have to close ranks.
All that begins to happen and it gets worse and worse as you slide down the slope.
If you continue down the slope and the pressure is pulling here, you'll end up in the
disassociation zone where you say, you know what?
Screw it, right? The world is unredeemable.
And once you're down here is where people bounce off of the primary mainstream platforms,
you begin to bounce off those into, like, whatever mix
of things like Gab or Kick
or Rumble,
like now you kind of like end up
roaming around these barely regulated
sort of wild west semi-dark web type
services and that's where like the really bad stuff happens
this is what I call the slope of terribleness
this is actually the right way to
understand these harms they're all connected
you start with one but gravity's
pulling you down towards the next and then towards
the next. Now your question
might be can we fix
this? Maybe if we just fix these services
there's a way that we can maybe put up some roadblocks on the slope of terribleness and prevent this from happening.
But I want to argue that actually this is not the side effect of some mistakes in the way these services were implemented that can be fixed.
But actually this slope of terribleness and the tendency to slide down it is actually fundamental to how these technologies function.
Let me get a little bit into this.
All right.
Let's start with distraction.
You start using a curated conversation platform.
part of the problem is the curation means that the algorithm is looking for engaging material.
What actually catches your attention?
Whatever that is, it will get pretty good at finding it. It represents it as sort of weighted data points in a multidimensional space.
But it figures out what type of content is engaging for you because that's how they make their money.
Now, the issue with this is the stuff you're seeing on the phone is very compelling.
This will pretty soon begin to overwhelm your short-term motivation centers in your brain.
So we have these circuits in our brain that are making predictions.
They're looking at activities that are available to you in the moment, and they're making
predictions based off past learning.
What are the different outcomes going to be of these various short-term activities?
And if the brain has learned, oh, there's a really positive outcome that's going to happen
from a particular activity, you get a flood of dopamine in that motivation circuit, which activates
it and then makes that activity seem irresistible, right?
This is something we often get wrong about dopamine. I noticed this in, like, the governor of Utah Spencer Cox's comments in the
aftermath of the Kirk assassination. A lot of people think of dopamine as a pleasure chemical
of like, oh, once you do something, you get a splash of dopamine and it feels good. No,
no, no, it's a motivation chemical. So these short-term motivation circuits have learned through
past experience. For example, when I eat that cookie, there's all of these things I feel that
are good. And the circuits learn from that and they say, great, when I see a cookie again,
dopamine will activate that circuit, because that circuit recognizes the cookie, and gives me a sense of intense motivation to eat it.
Well, algorithmically curated content really plays to those circuits very well.
And if you're using the phone all the time, you build a very strong prediction circuit that, when it sees a phone is nearby, floods dopamine into that circuit, saying pick up that phone and look at it, we're going to see something that's going to cause a reaction, be it pleasure or outrage, but it's going to be an interesting reaction.
That is very, very strong.
And the problem is,
the phone is always with you.
So that circuit is just constantly being bathed in dopamine.
It becomes very, very hard.
You can resist it sometimes.
But it becomes very hard to consistently resist it.
So you go back to using those services again and again and again.
That's just a basic sort of feedback reinforcement loop.
It would be like trying to quit cigarettes.
You're like, look, I don't want to smoke,
but I'm going to carry a pack of cigarettes in my hand at all times,
a pack that's right there.
That's a very hard way to cut back on smoking because those circuits are constantly being bathed
in dopamine saying that will give us a reward.
You need to smoke it.
All right.
So just fundamental brain chemistry, you have algorithmically curated content accessible
in your hand at all times.
You're going to use that more than you want to.
So that's how you begin sliding down that distraction slope as soon as you begin in any
sort of regular way using one of these curated platforms.
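The short-term motivation circuit described above is essentially a reward-prediction loop, and it can be sketched with a textbook Rescorla-Wagner-style update. This is a toy illustration, not code from any real platform or neuroscience model; the learning rate, reward value, and trial count are all made up for the example.

```python
# Minimal sketch of a reward-prediction loop: the prediction error
# (reward - v) plays the role of the dopamine signal, nudging the
# predicted value of a cue upward after each rewarded trial.

def update_prediction(v, reward, alpha=0.1):
    """One learning step: move the predicted value v toward the
    observed reward by learning rate alpha."""
    return v + alpha * (reward - v)

# The phone is always nearby, so the "check the phone" cue pays off
# on every trial; its predicted value climbs toward the full reward.
v = 0.0
for _ in range(50):
    v = update_prediction(v, reward=1.0)

print(round(v, 3))  # close to 1.0: the cue becomes near-irresistible
```

The point of the sketch is that nothing here requires the content to be good for you; any cue that reliably pays off with something interesting will saturate the prediction and keep pulling you back.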
Why does this bring us into demoderation?
Well, two things happen with these platforms once you start using them all the time.
Another side effect of the algorithmic curation is the echo chamber effect.
And we say this sort of loosely like, oh, don't be in an echo chamber.
But I'm saying it in a sort of techno-determinist way.
It is an unavoidable side effect of an algorithmically curated information experience.
The way these algorithms actually work, and I don't want to be too technical, but basically they have these huge multidimensional points, just large vectors of numbers that describe things.
It's an embedding of content; think of it as like a bunch of little descriptions.
I have a thousand categories I can use when I'm trying to describe something.
And I take a thing and I go through and give it a number in each of these categories.
And that combination describes it really well.
And what it does is it learns spaces, the multidimensional spaces in which these preference vectors exist.
It sort of learns these regions.
where content here does really well with this person.
That creates an echo chamber.
I mean, it's not, no one's sitting there programming,
create echo chamber, execute,
or there's an echo chamber knob.
It is just an unavoidable consequence of using this type of,
I'm going to observe your engagement and then use that to try to figure out
where in the space of possible content things create the most engagement.
So you're going to start seeing a lot of stuff that's similar.
So if there's a side you lean towards on an issue,
you're going to be seeing a lot more of stuff about that side of the issue.
And you're going to see stuff that's more and more engaging within that space,
which typically would be the stuff that's a little bit more strident.
We like that.
Okay.
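The embedding-and-engagement curation described above can be sketched in a few lines of code. This is a toy model, assuming dot-product ranking over random content vectors and a preference vector nudged toward whatever the user engages with; every vector size, constant, and variable name here is invented for illustration.

```python
# Toy engagement-optimized curation: rank content embeddings by
# predicted engagement, then learn from the resulting engagement.
import random

random.seed(0)
DIM = 8  # dimensions of the toy embedding space

def rand_vec():
    return [random.uniform(-1, 1) for _ in range(DIM)]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

items = [rand_vec() for _ in range(200)]  # content embeddings
user = rand_vec()                         # learned preference vector

shown = []
for _ in range(100):
    # Curate: show the item predicted to be most engaging for this user.
    best = max(items, key=lambda it: dot(user, it))
    shown.append(best)
    # The user engages; pull the preference vector toward that item.
    user = [u + 0.1 * b for u, b in zip(user, best)]

# The feed narrows: out of 200 items, only a handful are ever shown.
distinct = len({tuple(it) for it in shown})
print(distinct)
```

Notice there is no "build an echo chamber" step anywhere in the loop; the narrowing falls out of ranking by predicted engagement and then learning from that engagement, which is exactly the techno-determinist point being made here.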
The other issue is, so we're hearing kind of the same stuff.
Algorithmic Echo Chamber.
These are conversation platforms.
Remember, I mentioned it's important that it's people conversing,
not just like you consuming content from experts.
That is going to fire up in your mind the tribal community circuits.
Yeah, these are the people that
we're talking back and forth with. I'm in community. The social
circuits come up. So you have echo chambers
plus tribal community circuits
fired up and these things come together
and they create in your mind
this virtual tribe that has a very strong cause.
That's where demoderation comes from.
It's not a lack of willpower on your part.
It's not a sort of knob
that can be turned in the algorithms that run
these social networks. It is an inescapable
consequence of how these platforms
function.
Those tribal community circuits are incredibly powerful, because throughout most of our history as Homo sapiens,
so let's go back 200,000 to 300,000 years,
This is how we survived.
Our tribe was everything.
The other tribes, we had to be very, very suspicious of them.
You are not from my tribe.
My tribe is what matters.
If you're not, I worry that you are going to kill me or steal our food.
I do not like you.
And, I mean, there's nothing more compelling than those tribal community circuits.
You can think of, like, the entire history of human civilization,
and Yuval Harari is good on how we used abstractions to do this,
as an effort to try to overcome the tribal community circuits that are so deep in our minds.
And this is basically the point of the book Sapiens,
if you want to sort of go down a rabbit hole here,
is that humans' abilities to organize around abstract causes in tribes
basically hijacked our tribal circuits and allowed us to expand the scope to which they applied.
So now they could apply to, like, our entire country, or everyone who shares a religion.
That's what allowed humans to work together in large numbers,
which was key to our success.
That is Yuval Harari's argument.
But the point is, our success came from figuring out how do we actually, like, hijack, overwhelm,
or otherwise redirect these tribal circuits because they are so strong.
Curated conversation platforms just take us right back to the Paleolithic, right?
And that's where that demoderation is so strong.
And it's not because, oh, that's a bug in the system.
We need to turn that off so I can get the real value out of these platforms.
It is the value for most people.
That is what's creating the most engagement.
That's the thing that's feeding into the addiction loop on it.
So it's very difficult to avoid demoderation if you're using these things a lot.
All right.
But now here's the problem.
The more demoderated you get.
And look in the mirror.
You know you are.
Like whatever issue it is you care about.
These tribes are, by the way, almost random in terms of what you end up echo-chambered in.
It's like your slight preferences plus wherever the algorithm ended up pushing you. Like, it could be pretty arbitrary. But just think about right now,
whether it's like it's political and you're, you're like on the left and thinking about MAGA people
or you're on the right and you're thinking about like the blue hair people or, you know,
you're like a super health mom who's thinking about, man, these pharmaceutical companies are
poisoning my kids because they don't care or whatever the issue is, right? Think about the push
to dehumanize that other group. Like, man, they're just terrible. Look at the stuff
they're reading. I just cannot imagine how they exist. When you are in that mindset, you keep
sliding down the slope of terribleness. That is what eventually gets you to disassociation,
because you get to a point where you say, the other side, the side I don't like, is so terrible,
and yet nothing is happening about it. We're not, you know, cutting off half the country or whatever
it is. Like, they're not seeing that I'm just right and changing their views to align with mine.
And no one seems to care, or whatever the issue is.
And then either nihilism or rage follows.
Like, I just, I can't fathom this, and it sort of breaks you.
To make matters worse, especially if you're younger, a side effect of a conversation platform
is that it's taking the place of actual human interaction.
So you're also going to become more actually socially isolated.
So huge demoderation plus more social isolation, especially when you're young.
This also happens a lot with older people as well who are sort of retired and older in life.
Man, that gets you to disassociation.
Before you know it, you're showing up at Comet Pizza, you know, someone's got to free the kids from the secret caves underneath, or you're a nihilistic shooter.
One thing leads to another and it is built in to the technology.
So we can't fix this.
You can't. There's no version of Twitter that doesn't have this, because the two things that drive it are the fact that it's people coming together in virtual interaction, and that it's algorithmically
curated. You turn those off, you don't have those services. You can't have a version of those
services without conversation and algorithmic curation. That is the service. That's like saying to
Pizza Hut, look, we don't want to put you out of business, but is there a way you can exist without
selling pizza? Like, that's what we do, right? This is what these conversational algorithmic
platforms do. You have a huge number of people come together onto a platform to have conversation.
You have to have algorithmic curation because what else are you going to do? You have 500 million people
submitting tweets a day. You have to curate it.
And if you're going to curate it, you're going to curate it based on what people want to see.
And you put those forces together, you slide down the slope of terribleness.
So there is no technological fix to this.
It is an unavoidable side effect of these technologies.
All right.
So you're probably saying, yes, but I'm not going to end up at the bottom of this slope.
I've used these things.
Most people don't.
A lot of people use these services, but they don't end up on Rumble or,
you know, Groyper Discord servers, right?
And it's true.
You can resist the pull of gravity.
Well, let's think about what that costs.
I'm going to bring up this picture one more time on here.
So we have the slope of terribleness here.
So, yes, I'm going to draw a little picture here.
It is true.
You can, you know, at some point, hold on.
I don't know.
Does that look like a mountain climber holding on, Jesse?
Yeah.
Doing something.
All right.
So you can arrest your fall at some point.
but here's the issue.
There's two issues with this, right?
Number one, imagine over here we have a scale.
Think about this as like a scale of flourishing, right?
The higher up you're on the scale, the more you're flourishing as a human.
I'll label that flourishing.
I'll pull that down there.
So here's the problem.
Yes, you stop yourself somewhere on the slope of terribleness,
but the farther you've gone down, the lower you've gone down on the scale of flourishing.
So maybe you arrest your slide somewhere here in the demoderation area,
but you've arrested your life in a place that's worse.
You're flourishing less than if you weren't on the slope at all.
To make matters worse, resisting further slide down this slope requires a lot of mental energy.
To not be demoderated when you're using the services all the time requires like a real application of will.
When you're demoderated, then not to make that final slide towards disassociation requires a lot of will.
Not to become fully disassociated, and to resist the call to come join other disassociated people who understand your disaffection on these sort of dark-web-type services, takes a lot of mental energy.
So yes, you can maybe prevent yourself, and most people do, of course, from getting to the bottom of the slope of terribleness, but you get stuck in a place where you have reduced flourishing, and it's eating up a lot of mental energy that you could otherwise be spending on stuff that matters to you more in your life.
So why is this worth it? To be less happy, and to waste huge amounts of mental energy needed to prevent yourself from descending into what is essentially a sort of living hell? Once you realize the real dynamics at play here, it becomes hard to justify why you're still spending time with these curated conversation platforms.
So you want to stay up to date on the news? Subscribe to actual media.
You want to hear information that you think is otherwise being suppressed?
All right.
Subscribe to a newsletter.
Subscribe to a podcast.
Do the old-fashioned work of here is a real individual.
Here is why I think they are a good person to listen to.
Here is their credentials.
Here's how they convinced me.
And I will listen to them writing out a thing once a week.
And there's no curation.
And it's not conversation.
Great.
Do that.
The internet can help you here.
You don't have to be on the curated conversation platforms.
You want to be a part of something.
That's great.
Being on those platforms just tricks you into thinking you're a part of something.
You're not.
You often imagine, if you're engaged in combat on these platforms,
that you're at the front of the crowd marching on that bridge from Selma.
In reality, you're in an attention factory.
Punching in with your time clock while your billionaire overseers laugh as their net worth goes up due to your efforts.
You want to be a part of something?
Actually go be a part of something, not on a screen.
And maybe you're using it because you're bored. Find better entertainment.
Even on your phone, there's better high quality entertainment.
Do whatever.
Literally almost anything else.
I know it presses these buttons.
Find other things to press the buttons.
You can listen to the podcast,
read newsletters,
read books, watch movies.
You have streaming media.
You have literally something like half a trillion dollars'
worth of capex spent on entertainment available at your fingertips for like $30 a month.
Like, there's plenty of other things you can do to not be bored that don't put you on the slope of terribleness.
All right.
So that's the way I see these things.
So what we'll do here, like we always do, is move on to our takeaways section.
I see we have a little takeaway music here.
We do.
Let's hear this.
All right, so let's get to my takeaways here.
When it comes to these curated conversation platforms,
we've been telling ourselves a reassuring story.
Yeah, they're not great, but most of the worst harms don't impact me,
and, you know, there's some nice benefits.
What's the rush to change?
But in reality, these harms are all connected to a common slope of terribleness.
You use them and you'll start sliding down this slope.
It'll make your life worse as you move further.
You'll likely stop yourself eventually, before you get to the hellish bottom, but this resistance requires energy, mental energy you could be spending on things that matter or people you love.
This tradeoff is not worth it.
I want to end this deep dive by reading from the conclusion of the essay that I published on my newsletter a couple weeks ago.
Here's how I ended that essay: To save civil society, we need to end our decade-long experiment
with curated conversation platforms.
We tried them.
They became dark and awful.
It's time to move on.
Enough is enough.
All right.
So there we have it.
There's a lot more coming in this episode.
I'm going to talk to Jesse about his reactions to this deep dive.
And then we have a collection of questions from my listeners about exactly this issue of social media in their life.
So we'll get into how people are grappling with this.
So stay tuned.
But first, we just need to take a quick break to talk about one of the show's sponsors.
So here's the thing about being a guy and aging.
When you're young, you don't think about your skin.
You spend time in the sun, like maybe only occasionally cleaning the grime off your face with a Brillo pad, and you still end up looking like Leonardo DiCaprio in Growing Pains.
And then one day you wake up and you're like, wait a second, I look like a grizzled old pirate captain.
Why didn't anyone tell me I'm supposed to take care of my skin?
Well, I want you to avoid this fate, and how are you going to do it?
With Caldera Lab.
Their high-performance skin care is designed specifically for men.
It is simple, effective, and backed by science.
Their products include The Good, an award-winning serum packed with 27 active botanicals
and 3.4 million antioxidants per drop.
I counted.
It's true.
I thought it was like 3.3 million, but I counted.
It is 3.4.
So good for them.
The eye serum, which helps reduce the appearance of tired eyes, dark circles, and puffiness.
And the Base Layer, a nutrient-rich moisturizer that's infused with plant stem cells and snow mushroom extract.
Look, this stuff works: in a consumer study, 100% of men said their skin looks smoother and healthier.
So men, if you want to look more like Growing Pains-era Leonardo DiCaprio and less like a grizzled pirate captain, you need to try Caldera Lab.
Skin care doesn't have to be complicated, but it should be good.
Upgrade your routine with Caldera Lab and see the difference for yourself.
Go to calderalab.com/deep and use code DEEP at checkout for 20% off your first order.
I also want to talk about our friends at Miro.
Let me tell you, my team here at my media company has been enjoying using Miro.
Miro is a shared web-based workspace that teams can use to work together more effectively.
We use it here to keep track of our upcoming podcast episodes and newsletter topics.
So we have this master table of ideas.
And the cool part is we can brainstorm right there in the shared document so we can sort ideas and put things in the sticky notes.
I can embed Google Doc straight into it.
So like the script for an episode will just show up right in the planning document where we're talking about it.
But part of why I'm excited today is that Miro has been integrating AI features into this product in a way that I think makes a big difference.
For example, you can now use Miro AI to turn unstructured data like sticky notes or screenshots into diagrams or product briefs or data tables, and maybe make prototypes of an idea in minutes.
There's no need to toggle over to another AI tool and write out long prompts.
The AI is integrated right into the Miro Canvas where you're already working.
You need to check this out.
So help your teams get great done with Miro.
Check out miro.com to find out more.
That's MIRO.com.
All right, Jesse.
Let's get back to the show.
All right.
So what do you think?
Slope of Terribleness?
Is that going to catch on?
I like the colors.
I'm a great artist, is what I think it is there.
I mean, what I was trying to do there, Jesse, is, you know, this is what I noticed in the reaction to the email I sent out.
People are like, yeah, I mean, yeah, it's bad, but they needed a push.
And it's one of these things where just people knowing, like, I think this thing is bad, doesn't always lead to action.
They need to know why is it bad.
And in this case, this is what I think is going on.
This is why it's bad: because you're constantly being pulled down the slope.
And it's like, it feels bad.
The farther you go, the worse your life is.
And to stop yourself from going to hell, you have to, like, expend all of this energy.
And the question is like, well, why?
What about Instagram, YouTube and Pinterest?
How do they factor in?
Yeah.
So, I mean, what I was saying earlier was, you know, it depends on how you use these things.
The things that made the slope inevitable were that it's conversational and it's algorithmically curated.
So that's why I didn't include Pinterest; it's probably not there because it's not really conversational, right?
Like, you're looking at a board that someone else made, but you don't feel like you're in a conversation, so it doesn't activate those community circuits necessarily.
YouTube or Instagram, I think it depends on how you use it.
So Instagram, and I guess LinkedIn would be similar.
One way that you use it, it tends to have a higher concentration of a sort of like expert
community.
Like, oh, Instagram is where I go because there's like a comedian.
I want to see their funny videos and like a really fit guy and I want to see fitness advice or like authors.
I know a lot of authors on there.
Like, yeah, I like, I like their writing.
I want to see, you know, like clips from Ryan Holiday or whatever because I like his writing.
So there's more of that non-interactive element to a lot of the ways people use Instagram.
So you can get the compelling aspect, which is a problem.
But without that sense that we're all kind of conversing about things or giving our takes, it can arrest that slope way earlier.
Now, some people, if you're using reels and like,
getting to certain corners of Instagram,
it can have that problem.
YouTube's sort of the same thing.
Like a lot of people use it as I listen to a podcast on there.
There are certain creators I like and I use it like a cable channel.
Like, hey, I want to see like the latest Cal Newport podcast or something.
Again, there's no interactive conversation.
But if you're like on YouTube streams and you're in the comments,
like there's ways to use it where you can get some of the slope of terribleness effects.
But it's really the pure algorithmic conversation platforms that are the offenders, driving people all the way down the slope into the place where, like, nihilism or rage-based violence comes out.
I don't know.
I think those are probably the worst offenders.
You've kind of been mentioning this theme for multiple years.
Did you used to get a ton of hate mail about it?
Yeah, I mean, it's weird how this changed, right?
Like it used to be, because I've been on this beat for a long time,
what's going on?
Why do we all feel like we have to use social media?
I don't think early on I was saying it's going to lead to violence.
That came as things got worse.
But I never bought this idea that these were ubiquitous technologies that we all have to use.
Because I was a big early internet booster.
I liked the early internet.
I like the early internet. I like the idea that three companies were going to take over the internet.
And to use the internet meant we had to use an app by like from one of like a small number of companies to, you know, enrich a very small number of people.
That was sort of like anti-internet.
So a lot of early internet people were very suspicious of social media.
I used to get yelled at.
People would say, no, no, no, and they'd defend social media.
I'd say, like, no, these things are bad, and they'd say, no, it's bad to not use them; these things are great.
It's, you know, whatever it is: it's how you're going to meet new people and new ideas, it's where activism exists, it's going to bring democracy to the Middle East.
And then as things got worse, really around 2016, 2017, people stopped defending them.
They're like, well, yeah, there's some problems here, but I still get some benefits.
But, you know, if you don't use them, good for you.
Right.
And now, I think, with the political violence so clearly being driven by these platforms, people are like, I agree these are bad.
But they're stuck.
And I think the reason why they're stuck is this analysis of: there are all of these harms, most of them don't affect me, they can probably be fixed, and the ones that do affect me are more minor.
So I'm in no hurry to leave, though I probably should.
Because when you see these as isolated harms, and I'm only affected by the distraction harm, not these others, you're like, yeah, whatever.
I'm distracted by a lot of things.
But when you see them connected as a slope,
you're like, no, no, no, no, that distraction's pulling you down a slope.
And it's going to keep pulling you down there and it's going to get worse and worse
until you have to expend a huge amount of energy to try to stop your descent.
So it's interesting how it's moved from "you are dangerous for saying social media is in any way bad" to "I completely agree with you, but I need a push to actually take action on it."
So I've been talking about this for a long time, but what I've been focusing on
has really shifted.
I mean, my early Facebook articles were just me responding to what people thought were airtight arguments for why you had to use Facebook.
I had these early essays where I would just go through their arguments.
I don't think that's true, or, there's other ways to do that.
I mean, it was basically, you know, a lot of: I don't think you need to use Facebook.
People like, you are crazy.
And I would say explain why I need to use it.
And they're like, well, this or that or this.
And I would just very calmly be like, I mean, that's not a great argument, you know.
That doesn't make me want to be on it.
It's weird.
It's this thing where I'm clicking on a relationship status and there's these, like, buttons.
Like, this is weird.
You need a better argument than this.
So that's how I started.
Pictures of dogs.
Pictures of dogs.
Poking.
A lot of poking going on early Facebook.
I was like, I'm a grown man.
Why are we poking each other on a computer screen?
Come on.
People were like, you've got to use these things.
I mean, it took like six minutes for people to be like, all hiring and business happens on social media, and there's no other way people will ever discover you or hire you, or you'll ever find a client again.
It took like six minutes before people were convinced that if you do not have videos on Instagram, there is no way your dental practice can succeed.
Like, we got there so quickly.
I was like, well, how did we ever have commerce before we had social media?
All right.
So I want to hear what you, the audience, have to think as well.
So, Jesse,
we pulled some questions here
from the audience about this topic.
Yep.
Let's hear what you,
my listeners,
wanted to know or are wondering about this.
All right.
First question's from Sarah.
My middle-aged son has a smartphone.
I told him we'd be using
access controls on it
until he goes to high school.
I know the obvious sites
and apps to block,
but I worry that I'm missing
some of the more fringe options
that young men seem to get lost on.
Is there some,
up-to-date list somewhere on what I should be blocking.
You know what, Sarah?
I think you skipped a word and said "middle-aged son."
But you know what?
Like, that would also be a relevant question.
My middle-aged son is on social media too much.
I mean, that's kind of part of the problem.
Middle school age.
But I could see an older mom being like, yeah, my middle-aged son is in the basement.
And he's like the captain of his, I don't know video games well, his Mario league.
What's the one where they, command, gun, whatever?
Call of Duty.
He's just on Call of Duty and doing social media all the time.
All right.
So middle school age son,
this is actually a good question, right?
When you get to the bottom of that slope of terribleness, where the transition into violence happens, is where you then sort of ski off the bottom into these much less regulated platforms that just don't care.
They're usually funded by things like gambling advertisements and whatever.
They're funded in sort of shady ways, and they get closed down a lot, and then other ones open up, right?
And this is kind of a problem.
It's like, okay, if I'm running the blocks, like, here's the sites you can't use.
How am I supposed to know about, like, the latest sites where, like, disaffected young men are finding themselves?
I have a couple points here.
One, I don't know of such a list, and your model for phone moderation with your kids is not actually the model that I would typically recommend.
So just to cover what I would typically recommend would be no phone in middle school, no smartphone.
You can have a dumb phone for logistics; sure, it's not yours, but you can take it if you need to take the bus or something and you need it in case of emergencies.
But the model I would normally push is no smartphone until high school.
That smartphone is going to be really locked down until 16, and then we'll take a lot of that off.
And that's when you can begin.
You've gone through puberty.
You've established your social identity.
You've established your social circles.
You're just older.
And now we're going to start giving you sort of more free rein.
All right.
So during that period where you have a smartphone but it's highly restricted, how do you find these sort of sites that are at the bottom of the slope of terribleness?
If you understand the slope of terribleness, I think it changes the way you think about this because no one starts at those sites at the bottom.
Right.
No one is like, oh, I got a smartphone.
I can't wait to get on Gab.
I can't wait to go to some weird Discord server where like Nick Fuentes is going to be talking about Groyper movement, something like that.
You get there by sliding down the slope of terribleness.
You start with distraction.
You're the 12-year-old who's talking about, you know,
all these memes you know about.
You're kind of like clued into sort of fun internet culture.
And then that drags you into demoderation.
You get into these influence communities.
And now you begin to get really caught up in some particular issue or another.
And because you're young, you're like, yeah, man, social groups is everything.
I need my group.
I need to fit in.
I found my group.
This is firing up all of my developing, like, early-puberty brain networks: this is my group.
And then you get to dissociation eventually, because you're like, I'm so worked up now, and I'm isolated because I'm on here all the time, and I'm so worked up about this.
I feel like dissociating.
And that's when you hear someone in one of the mainstream communities say, hey, hey, hey, hey,
the real conversation about this is over here.
Like they're the man, and you can fill in so many other things for what that means.
They're going to censor us, but this is where the real conversation is.
And then that's when you make your way over to the really dark places.
So if you are blocking the mainstream curated conversation platforms for your 14-year-old, they're not going to enter the slope.
So you don't have to worry as much about these sorts of places.
Those are weird, weird places.
If you wander in there fresh, it's like Harry Potter in Knockturn Alley.
How do you like that reference, Jesse?
I like it.
He's like, this is no good.
This is no good.
The witches have on the weird hats, and they're selling, like, bad spells, and Malfoy's doing meth.
Like, I got to get out of this alley.
It feels like that.
It's like, oh, stranger danger, uncomfortable.
It's a weird place.
It's not Disneyland.
But you got to get there down the slope.
So what I'm trying to say there is, if you are preventing access to the curated conversation platforms until your kid is ready for it, they're probably not going to make their way to the weird things.
The slope has to pull you there.
It's not a place where people like to go.
All right.
Who do we got next?
Next question is from Eric.
I have several different jobs,
but the one I'm struggling with big time
is my job of making YouTube videos.
Two questions.
Should I be doing this at all?
I'm thinking about your disqualifications
and So Good They Can't Ignore You.
And if it is okay,
how to organize the deep work required
to come up with an idea and script,
filming it,
and then editing and publishing it.
All right.
So yeah, YouTube,
because again,
we said YouTube is in this sort of
interesting liminal space, right?
So question number one,
should you be making YouTube videos at all?
As all of my YouTube watchers know,
I think it's a terrible thing to do.
I guess it depends on the channel.
If you were doing, like, maybe if your channel was about, you have like a football field with one of those long measuring things with the roller, the kind they use to knock off yardage to see how far someone throws a ball.
And what you were doing was punting puppies from the end zone and seeing who could punt the puppy farther.
Yeah, you probably shouldn't be doing that YouTube thing.
But if you're doing something useful on YouTube, like a deep dive on Harry Potter's Knockturn Alley (I have a lot of questions), then maybe, you know, it's okay.
So here's how I would think about YouTube.
Video alone is not bad; I think independent media itself, whether in video or audio, is something that is valuable.
And there's three main forms of independent media: there's writing, which we see in things like blogs or newsletters; there's audio, which you see primarily in podcasts; and there's video.
Now, YouTube is the easiest platform to use to sort of get your video seen, and I don't think it by itself is a problem.
But what you need to be aware of, if you're creating independent video on YouTube,
you have to be very wary of capture. So there is an algorithm lurking on YouTube. It's lurking
in the recommendations.
And you see this as a creator
in those view numbers
and those subscriber numbers.
And it really, the thing
you have to be so careful about
is capture, which says,
I am going to just
explore the slope
of how do I make those numbers
higher and higher and higher.
And then you end up in algorithm land.
Now again, YouTube tends not to be
as dangerous for people as these other
platforms.
It can be, but it tends not to be.
But that can put you into chasing the algorithm.
It puts you in this sort of weird engagement-farming type of land, where you don't like what you're creating and don't even feel that strongly about it.
And it could be sort of hollowed out.
I mean, this is what, like, Mr. Beast did: he just straight up followed what makes more people watch.
And then he kept ratcheting that up and ratcheting that up.
And where he ended up, as the most subscribed channel on YouTube, is these multimillion-dollar videos where there's no block of more than seven seconds in which something compelling isn't being promised to you or that promise being delivered on.
Like, do-do-do, do-do-do: visually amazing, stakes are high, right?
So, you know, I don't know, it's not terrible, but it's kind of weird where you end up if you follow the algorithm.
Or you end up like a lot of people in political waters: okay, I'm going to get more strident on this issue because more people view it.
You go from "I'm Mr. Health Advice" to "if you look at a vaccine, your child's hair is going to fall out."
Because you get more and more attention for it.
So you got to be careful about capture.
But if you're producing something you're proud of
and you want it to do well,
but you're not willing to completely follow the algorithm,
then you know, I think it's okay.
In terms of how you actually scheduled doing YouTube creation,
Jesse will attest to it.
I mean, it's just time consuming, right?
Don't underestimate it; the principle that's at play here is the principle from Slow Productivity of working at a natural pace.
It takes a long time to do this.
Don't pretend like it doesn't.
Just because a YouTube video is quick to watch,
don't delude yourself.
Like, yeah, I'm going to put aside an hour before work
and then after work on Friday,
and that's how I'm going to run my YouTube channel.
You have to think of it like I'm building a chair.
It'll take a lot of time.
I get this wood.
I've got to plane it and shape it and glue this.
Wait for that glue to dry.
I'll clamp this.
There's a lot of skill involved, and I'll have to keep coming back to it.
It's going to require a lot of time for me to do.
So that's the other thing to keep in mind.
To do this well requires a lot of time.
It's not, you know, you're not going to just stumble onto a camera and everyone rushes
to watch you.
It is brutal.
I mean, you'll attest to that, Jesse.
It's like impossible to get people to watch things.
It's, you know, and you really, you really got to do a good job.
So, no, I'm not mad at you for making YouTube videos.
Don't get algorithmically captured.
And be realistic about how hard it is.
It is a major, major commitment.
and if you don't have that time, don't pretend like you can squeeze it in.
I always tell my kids, you don't want to become a YouTuber.
I was like, it's a full-time job.
It's like, that's a hard job.
It's hard to make money.
It really is brutal.
It's like stressful.
You have nothing else to fall back on.
There's so many easier ways to make the amount of money that like most full-time
YouTubers make.
But they have this idea that it's easy and that everyone's making millions of dollars.
I'm like, six people are making millions of dollars.
And it's not going to be you.
But like you said, if you're not using it to make money, you can take the slow productivity approach and
occasionally put out videos. Yeah. Yeah. I mean, I like independent media.
But yeah, if you think you're going to become the next Mr. Beast, it's a problem.
All right, who else? We got another question? Yep. Next up is Mark. I can't help feeling
like it's too late to put the social media genie back in the bottle. It's fundamental to how we share information on the internet. Don't we need to focus instead on how to use it better?
Yeah, I get this question a lot.
when I give talks about social media, especially to young people or to parents of young people.
It's the: come on, what we need is to learn how to use whatever it is, you know, responsibly.
I don't buy it here.
And here's what I think the fundamental mix-up is.
The social media companies completely hijacked the Web 2.0 revolution.
So Web 2.0, for those who don't follow internet trajectories, was about the World Wide Web making it easier for information to both be published and consumed, right?
But mainly making it easier to publish information.
Before, you had to go and manually edit a website, or have a sort of complicated database-driven site, like, I'm going to pull up the Amazon listing and put it on the screen.
Basically, if you didn't have a serious website or a serious web developer, it was hard to update or put information on the web.
And Web 2.0 was like a collection of different advancements.
Typically, it was a lot of stuff that was actually advancements in JavaScript running in the browser alongside new versions of HTML.
That made it easier for things like: I can type something into a form, and it shows up, and everyone can see it.
So blogs were like one of the first early Web 2.0 things.
It was like, yeah, you can publish information on the web without having to code, without
having to hire like a web developer, right?
So we got things like blogs, and these kind of early social-network-type things like Friendster, where different people could post things.
We had places where you could build websites online using an online interface without code.
And so that was the Web 2.0 revolution. Basically, Web 1.0 said, here is a place to post information that anyone can consume.
It's a common protocol.
All you need is a browser.
And anything that's been posted, you can consume.
Web 2.0 is like, now anyone can produce stuff to be consumed as well.
Great revolution.
Super important.
One of the more important revolutions of the last hundred years.
Then the social media companies came along and said, oh, people publishing information is interesting.
Because that could be a lot of information.
If you have a lot of people publishing stuff, you can create amounts of information that you could never come anywhere near by hiring people.
If you want to hire people to create content, like newspaper writers or whatever, it's expensive, and it's pretty limited how much stuff you can produce.
And these models were going on: you had Nick Denton and Gawker
and people saying like, hey, can we hire like 20 people
and have them write 10 blog articles
a day and then we can try to monetize it?
But the social media companies say, that's amateur hour.
You're hiring like 10 people at Gawker to publish 20 things a day each, so you have 200 things published a day that you're selling ads on.
Why don't we get 200,000 people to write information for us, for free?
This is the real secret of Web 2.
So they took over Web 2 by building these beautiful websites, right?
Facebook's triumph early on was in part because Mark and the original programmers he worked with really understood these cutting-edge new Web 2.0 capabilities.
They understood the new technology, things like Ajax, which allowed for asynchrony in JavaScript.
So you could have a static web page pull in information without having to reload the whole page.
They were on top of that technology.
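The Ajax pattern being described, asynchronously pulling data and updating one part of a page without a full reload, can be sketched in a few lines. This is a minimal illustration, not anything from the episode: the `fetchLatestPosts` function and the `page` object are hypothetical stand-ins (a real page would use `XMLHttpRequest` or `fetch` and actual DOM nodes).

```typescript
type Post = { author: string; text: string };

// Hypothetical stand-in for an asynchronous network request.
function fetchLatestPosts(): Promise<Post[]> {
  return Promise.resolve([
    { author: "alice", text: "hello" },
    { author: "bob", text: "news" },
  ]);
}

// Stand-in for the page's DOM: only `feed` gets rewritten,
// while the rest of the "page" stays untouched.
const page = { header: "My Feed", feed: "" };

// The Ajax idea: await the data, then update one region in place,
// with no full page reload.
async function refreshFeed(): Promise<void> {
  const posts = await fetchLatestPosts();
  page.feed = posts.map((p) => `${p.author}: ${p.text}`).join("\n");
  // page.feed becomes "alice: hello\nbob: news"; page.header is unchanged
}
```

The point of the pattern is that last step: the request happens in the background and only the affected region of the page is rewritten, which is what made those early Web 2.0 sites feel fast.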
The early Facebook website was like a beautiful Web 2.0 product.
It was very, very easy for you to publish stuff for people to see.
They took over Web2.
Forget blogs.
Forget like interactive creative websites.
Just come to our gardens, our social media gardens.
And it's easy there to publish your own material.
Then they figured out another thing, which was that people get bored publishing their own material.
Anyone who had a blog in like 2005 knows this: you would publish stuff and no one cared.
It was the equivalent of like putting stuff on the bulletin board at the coffee shop.
No one's going to read your essay.
So people would get bored.
Yeah, I published stuff.
No one came.
So I'm going to go, you know, play a video game, right?
Like people didn't care.
So the second innovation the social media companies had was we'll offer you attention.
We'll have the social compact, like Facebook; I write about this in Deep Work.
Facebook was like, we'll have this sort of collectivist type of social compact where everyone will post nonsense, but everyone will look at the nonsense of the people they know.
In fact, we'll bring it to you, because we organize it on these walls, and you friend people, and we can tell you, like, hey, so-and-so just said something.
And then you go over and do this performative thing where you give a thumbs up or leave a comment, like, that's great.
And later they'll be like, hey, I liked that, or whatever.
And now what you're offering people is, it's easy to publish content and you'll feel like people are paying attention to you.
And boy, that is powerful.
People love that.
And that's how social media took over Web 2.0.
Anyone can publish, and we'll make you all feel like you're important.
And so everyone started doing it.
And now you have massive amounts of content that you can monetize with advertisements, right?
I'm not going to hire 20 writers.
I have 200 million people publishing all day long because their friends will like it.
And we can put ads in the middle of all that.
So that was how social media took over Web 2.
And then social media had its own revolution where, after everyone was using these services, they said, we're no longer going to use this model of people paying attention to you.
Now we're going to bring in algorithmic curation and say, well, once you have this app here, we're just going to pull from the billion people who are using this, stuff that's going to press your buttons.
And now what we're selling you is like, eh, it's not your friend liking something.
You're kind of bored with that.
But we'll just give you something.
Every time you open this, it's really interesting.
Once we've already captured you, we'll switch you over to algorithmic engagement, because now you'll use it 10 to 20x more per day than when you were just looking for likes; you only look for likes after you publish something.
That took over all of Web 2.0.
But that's not fundamental to the internet.
That's a very successful,
but very specific business model.
People publish for free
and in exchange,
algorithms show you the most compelling
stuff that other people published.
It is a very specific business model.
That is not synonymous with the internet.
If you shut down
the five major social media companies tomorrow.
The internet would still be there.
And a lot of other cool things could emerge.
Web 2.0 technology still exists.
Apps and sites where you can gather with like-minded people.
You can publish information.
You can receive information.
You can find niche experts or communities.
You can find cool tools.
All of that is still there.
The underlying protocols allow anyone who can access any of the networks that make up the internet to talk to anyone else and read anything that's accessible from the network.
All of that is still there.
This is just one thing you can do with it
and one that happens to create a slope of terribleness
that we all hate.
So no, I don't think social media is fundamental.
I think the internet is.
So I don't buy that we have to just accept this particular use of it.
The other thing I would argue is, okay, let's say you did believe
that you needed to use social media to be successful in the world.
These are low friction consumer-facing products.
They are very simple to learn.
They're made to be something you can pick up immediately.
You know, TikTok takes like nine seconds to learn how it works.
I did this for The New Yorker last year.
I signed up for TikTok and documented like what the process is.
So it's not something you need to teach people about.
And in terms of like, oh, let's teach you how to use this well.
Good luck.
You're not going to compete with the slope of terribleness.
You're just not going to compete.
It's just more compelling.
You can say, don't use this too much.
Good luck.
Reward circuits.
Dopamine saturation.
You are going to use it too much.
You can say, you know, don't like be too strident.
You know, listen to people from the other side.
Good luck.
You're talking about 100,000-year-old Paleolithic circuits that are being pushed on again and again and again.
That advice ain't going to stick, right?
So,
I'm not a believer in the idea that we have to use it, or that we just need to practice how to use it, or that if we give people literacy, they're going to use these things better.
I mean, it's, you know, good luck.
We got a case study here.
Case studies are where people write in to talk about their experience using the type of stuff we talk about on the show.
We found a case study that was about someone's experience leaving social media; it seemed appropriate.
Get some case study music to get in the mood here, Jesse.
All right, today's case study is from Scott.
Scott said, I stumbled upon your excellent work after purchasing deep work on a whim
and wanted to congratulate you on both defining and providing workable solutions for one
of the biggest challenges of our always connected modern age.
I was one of the first musicians to use social media to build a large following in the early
2010s as a pianist and founder of a popular music group.
For a long time, I was a champion of social media tools, in particular, the way they granted
artists the ability to build a worldwide audience while bypassing record labels and other
traditional industry gatekeepers.
I even got a book deal out of a lot of it.
Fast forward to now, and it's clear to me that social media has actually had a net deleterious
effect on the arts.
Just as we see in knowledge work, engagement-based algorithms prioritize the cheap dopamine
hit of shallow content rather than the kind of deep, thoughtful, artistic work that nourishes
the soul. Some of the true masters of their craft today are reduced to performing inane
TikTok dance challenges or parroting trendy ideological talking points instead of creating the brilliant
work that they are uniquely suited to create. I could go on and on, but I'm certain I'm
preaching to the choir here. I deleted my own personal socials, with the exception of substack,
in 2021, and I think it has greatly improved my creative output at the expense of my own
cultural relevance. I think that
is the tradeoff that artists
and academics and everyone else will have to
be incentivized to accept if we are
to reverse what the last decade or so
of McLuhan's Global Village on steroids
has wrought.
All right. So I think that's a great case
study.
For this musician, social media was everything.
A great case for it.
Worldwide audience, get past
the gatekeepers, go straight, you know,
get music straight to people.
Great Kevin Kelly arguments,
1,000 true fans.
Then he's like, it got worse,
and the net effect was a negative.
He quit it.
He's never been, like, prouder of his work.
One thing I want to point out in there,
so what's going on here?
A lot of people repeat something similar,
and I mentioned this in my last answer,
but just to emphasize it one more time,
what social media is has changed.
A lot of the things that we really liked about social media,
the access to new people,
the access to new ideas, right?
that got worse when the companies made what I call the algorithmic turn.
So again, as I talked about before, the early social media model was not so heavy on algorithmic
curation.
It was about conversation.
It was just interesting to talk to interesting people, to follow interesting
people, to friend interesting people.
But it wasn't so much about algorithms.
And it's because the original goal, if you're an early stage social media company,
was user growth.
User growth is built on experience.
You need to be something that feels
like it's super culturally relevant
and offers a good experience
so people will sign up.
They wanted those numbers to grow.
It didn't matter how much you used it.
They needed you to subscribe.
They needed like users.
It was less important because they weren't even
really selling ads yet
that you used it all day long.
They just wanted everyone to have to use
a service like Facebook.
So it used to be at first,
again, let's do the evolution real quick.
early Facebook and social media was people I know.
I friend people I know.
We like each other.
There's no like button, but we comment on each other's stuff.
And we feel it's interesting and we feel like there's a little bit of attention.
Like I'm publishing something that people care about.
That feels good.
We know it's a little fake, but whatever.
Then they went towards, let's get famous people on here and interesting people on here.
Twitter was really big on this as well.
And then it was like, yeah, it's not just about your friends.
you can also follow or friend
really interesting,
like creative people
or thinkers or artists.
And again,
that's the stuff you would see.
So like now I see what my friends are up to,
but also I see what this like political thinker is saying.
And that's like an interesting person.
What are they reading?
What are their ideas?
Like,
that was pretty interesting.
No algorithms, right?
Twitter was reversed chronological.
Here are all the things that have been tweeted
by people that you follow in order of when they tweeted
with the most recent thing first.
Facebook was a similar thing, right?
Like, here is your, at first you had to go to
other people's walls.
You'd go to their sites and they gave you this feed.
But the feed at first was just strict.
Here is reverse chronological.
This person, this friend just did something.
That friend just did something.
And some of these friends could be interesting people.
That was sort of the sweet spot era where it was non-algorithmic, but you had a lot of
interesting people on there.
And it was like a cool hang.
It was like a cool hang.
I could just follow on Twitter or on Facebook.
Someone who was like an author I admired or a musician or something, kind of see what
they were up to.
This was kind of cool.
Like, what a cool technology.
Like, this is when people were, like, pretty happy about this stuff.
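The pre-algorithmic feed Cal is describing here is, mechanically, almost trivially simple. As a purely illustrative sketch in Python (the post fields and sample data are hypothetical, not any platform's actual schema):

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    text: str
    timestamp: int  # e.g. seconds since epoch

def reverse_chronological_feed(posts, following):
    """The early Twitter/Facebook model: show only posts from accounts
    you follow, ordered purely by recency, newest first. No ranking model."""
    visible = [p for p in posts if p.author in following]
    return sorted(visible, key=lambda p: p.timestamp, reverse=True)

# Hypothetical sample data for illustration
posts = [
    Post("alice", "morning thought", 100),
    Post("bob", "afternoon link", 200),
    Post("celebrity", "viral bait", 300),  # not followed, so never shown
]
feed = reverse_chronological_feed(posts, following={"alice", "bob"})
print([p.text for p in feed])  # ['afternoon link', 'morning thought']
```

The point of the sketch is what's absent: no engagement prediction, no optimization target at all, just who you chose to follow and the clock.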
Then they had the algorithmic turn.
The algorithmic turn began right around 2012 to 2015.
This was the era when these companies went public.
They hired the grownups.
You know, Zuckerberg brought on Sandberg, et cetera.
And they said, look, we have to actually make money.
And to make money, we have to care about how much these users are using this,
and we want them to use it all the time.
And the iPhone is now big, right?
2012 was this tipping point where we had more than half of the adult population using smartphones in the U.S.
It was this key tipping point.
Everyone has these phones now.
We can move this over to apps.
They now have constant access to these services.
So now we need to worry about engagement.
And that began the algorithmic turn.
We are going to select with algorithms or with the assistance of algorithms what you see so that
you get more consistent brain rewards and those short-term reward circuits are going to get more exposures to positive learning and then you're going to look at it more.
It's going to be this virtuous loop.
That makes us a lot of money.
And we've been in the algorithmic turn for the last decade, increasingly pushed towards algorithmically curated content.
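By contrast, the algorithmic turn Cal describes replaces recency with a predicted-engagement score. A toy illustration (the posts and scores below are made up; real platforms use learned models over vastly more signals than this):

```python
def rank_by_engagement(posts, predicted_engagement):
    """Order the feed by whatever a model predicts you'll engage with most,
    regardless of recency or who posted it. This is the core of the
    'algorithmic turn': the sort key is the monetizable quantity."""
    return sorted(posts, key=lambda p: predicted_engagement[p["id"]], reverse=True)

# Hypothetical posts and click-through predictions, for illustration only
posts = [
    {"id": 1, "text": "friend's vacation photo"},
    {"id": 2, "text": "outrage bait"},
    {"id": 3, "text": "niche essay"},
]
predicted = {1: 0.12, 2: 0.87, 3: 0.05}

feed = rank_by_engagement(posts, predicted)
print([p["text"] for p in feed])
# ['outrage bait', "friend's vacation photo", 'niche essay']
```

Nothing in the objective cares whether the top item is good for you; whatever maximizes predicted engagement floats to the top, which is exactly the mechanism behind the slope of terribleness.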
You need that piece for the slope of terribleness to be there.
If you don't have that piece,
these companies disappear, because you can't make money on 2010 Facebook, right?
It's not going to be worth a trillion dollars anymore.
But all the terribleness goes away.
Without the algorithmic curation,
it's like, you know, it's a cool hang.
And sometimes there's nothing interesting on there,
but sometimes someone has published something cool,
and I learned I got an early drop of a Taylor Swift song
or something.
This was great.
It was like a cool hang.
I wasn't going to look at it a thousand times a day,
but it was a cool hang.
As soon as we made the algorithmic turn,
that's when social media turned.
So that's why we hear stories like this.
We're like, man, this was great.
And now it's not.
What happened?
Curation's what happened.
Once we began that shift, that's why we have this feeling of, like, why do these things make me feel dark?
Because we are at the top of that slope of terribleness.
We're all happy up there.
And then at some point we looked up and were like, I'm sliding.
And that wasn't happening before.
So it's good to understand, I think, these changes that happened.
All right.
So now we know what you think about this issue.
In our third and final segment,
we can ask: what are
regulators going to do?
What is the government going to do?
Well, there's some news about this that I want to get
into, some recent news about
Congress taking on some of these issues of social media.
I want to give you my reaction to this, but first
we have to take another brief break
to hear from a sponsor.
Let's talk about Grammarly.
In the knowledge economy, the ability to communicate
clearly is everything.
Not only does it help you do your job,
if you do it well, it helps you get promoted.
It is like the number one skill in the modern knowledge economy.
So you really should care about your communication,
but it's something people struggle with. Writing is hard,
and they don't know how to get better at it.
This is where Grammarly can help.
Grammarly is an essential AI communication assistant
that boosts both the productivity and quality of your writing.
I don't think people fully realize how powerful these Grammarly writing tools have gotten.
Let me tell you about a particular feature.
that I've been messing around with in Grammarly,
and I'm really enjoying: the proofreading agent.
So let me give you a real example.
All right, I have an email to write.
I'm going to send it to, you know, God,
I'm in charge of so many things,
so some group of people I'm sending an email to.
I type out, like, yeah, here's what I want to say.
But before sending it, I click the button on the sidebar there,
because I have the plugin installed, and say,
hey, Grammarly, can the proofreading agent
take a look?
And you can select what type of things you want it to look for.
So you can say, first of all, let's just do like a straight check.
Did I make any mistakes?
Oh, it's great at that.
Like grammar mistake here.
You have a repeated word here.
So yeah, sure, we fix that.
But then you can click something like sharpen opening point.
That's an actual option in the proofreading agent.
Sharpen opening point.
And it'll look at your opening paragraph.
And I literally got this feedback.
You're being a little wishy-washy.
Just use a more active verb here and there.
It's like, yeah, you're right.
That's better.
That's sharper, right?
It made the email better.
So this is the type of thing you can do with Grammarly.
It's like having an editor sitting over your shoulder.
You don't always have time to pore over everything you write.
Click on things like the proofreading agent and it just goes in there and makes the writing sharper.
Fix your mistakes.
It can tell you about your precision.
It can help you detect or adjust your tone.
It can help you even brainstorm titles or ideas.
So it can even do some of this gen AI text production.
It'll make your writing better.
It'll make things go faster.
If you use a keyboard, you need Grammarly.
So download Grammarly for free at Grammarly.com slash podcast.
That's Grammarly.com slash podcast.
I also want to talk about our friends at 1Password.
Back in the old days, password management at companies used to be simple.
You had like what, one machine that each employee logged into,
and you wanted to make sure that they had a password for their computer,
and you're like, make it a good one, and don't forget it.
That's basically how we run the IT system here at Deep Questions.
Our whole operation we run on a mainframe computer
that costs $75,000 a month in electricity bills,
and Jesse walks around in a white lab coat.
Have you seen that?
Why in old footage of old computers
do people wear lab coats?
Right?
I mean, a lab coat is for, I'm doing experiments.
But if I'm just walking in a room with like magnetic tape,
they always have lab coats on and clipboards.
I think that's just from like footage.
My mom used to program Series 7 mainframes back in the 80s.
I'm going to ask her.
I don't think they wore lab coats,
which is a little bit far afield from 1Password.
I was just thinking about this.
Anyways, the point is,
this is not the way people run businesses today.
We don't have one big computer with people in lab coats.
We have service after service that people use as part of their job.
Some of them you know about.
Some of them you don't.
They just signed up for them, and they have all these different passwords.
And it's a huge mess.
There's a lot of security vulnerabilities.
This is where Trelica by 1Password can help.
Trelica by 1Password inventories every app in use at your company.
Then it uses pre-populated app profiles to assess the security risks of the different applications that your employees are actually using.
And it allows you to easily manage access, optimize your spend, or enforce security best practices across every app your employees use.
So again, they're using tons of stuff they signed up for on the web.
They're not just logging into the mainframe that Jesse maintains.
Trelica by 1Password helps you monitor and stay on top of all of that.
You can manage shadow IT, securely onboard and offboard employees, and meet compliance goals.
Trelica by 1Password provides a complete solution for SaaS access governance.
And it's just one of the ways that extended access management helps teams strengthen compliance and security.
1Password's award-winning password manager is trusted by millions of users and over 150,000 businesses, from IBM to Slack.
And now they're securing more than just passwords with 1Password Extended Access Management.
Plus, 1Password is ISO 27001 certified, with regular third-party audits and the industry's largest bug bounty.
1Password exceeds the standards set by various authorities and is a leader in security.
So when we move away from our 1960s-era mainframe and put away our lab coats,
1Password would be exactly the type of thing we'd use here at my media company.
So take the first step to get better security for your team by securing credentials and protecting every application, even unmanaged shadow IT.
Learn more at 1Password.com slash deep.
That's 1Password, the number 1, then the word password,
dot com slash deep, and type that in all lowercase.
All right, Jesse.
Let's move on to our final segment.
So you heard my thoughts about social media.
We heard your thoughts about social media.
What is the government going to do about this?
Well, I have good news.
The House of Representatives is on it.
And we know the House of Representatives is really good at technology.
Just to remind you of how savvy they are,
I want to play you a little bit of audio here from a couple years ago
when the House of Representatives brought in the leaders of TikTok
to interview them.
Here's a clip from that past discussion.
Mr. Chew, does TikTok access the home Wi-Fi network?
Only if the user turns on the Wi-Fi.
I'm sorry, I may not understand the...
So if I have TikTok app on my phone,
and my phone is on my home Wi-Fi network,
Does TikTok access that network?
You will have to
access the network to get
connections to the internet if that's the question.
Basically what I'm saying is
we're kind of screwed.
U.S. House of Representatives does not know a lot about technology, but
they are trying because here's the news
that we just heard about. I'll put this on the screen here
for people who are watching. U.S. House panel
asked online forum CEOs to testify after
Charlie Kirk assassination.
So they are trying to do
something here. But let's think, I just want to look at this for a second.
Who do they have coming in? I'm going to read from
the press release here. A U.S. House committee on Wednesday asked the
CEOs of online platforms Discord, Steam, Twitch,
and Reddit to testify at a
hearing following the assassination of Charlie Kirk,
citing the radicalization of online forum users.
All right. I had some thoughts on this. First of all,
as indicated by that TikTok clip,
not exactly a bunch of technical wizards.
And we're sort of seeing why this is a problem already.
Let's kill this, Jesse.
There's a lot of ads on here, by the way.
That's somehow like a metaphor, that the website for this article is so covered in ads
it's kind of unusable technology.
But we're seeing here the issue.
Not just that Congress doesn't necessarily know a lot about these technologies.
When you understand things like the slope of terribleness, they're talking to the wrong CEOs.
They're talking to the CEOs of companies
whose services are being picked up
by disassociated youth at the bottom of the slope of terribleness.
I mean, yes, great, we should talk to the existing companies
and have them turn off this behavior,
but more ones are going to come.
Because these, what I think of as
loosely regulated digital conversation spaces,
are the end game of falling down the slope of terribleness:
the distraction, the demoderation, the disassociation.
And then finally when you're disassociated,
you want to be around other disassociated people and you egg each other on.
There have been plenty of sites that do this.
They often get shut down, and then other ones come along, because
this is not hard technologically: a place where people can do voice chat,
like a Discord server that anyone can spool up on their own machine, or some sort of forum.
And they shut these down.
Like remember 4chan?
Well, that got shut down, but then they started 8chan and then that got shut down.
And now that's gone.
There was, like, the Nazi website
the Daily Stormer, and that got shut down.
And often the way these get shut down eventually is they become such a cesspool.
It's actually the infrastructure providers like I'm not going to do business with you.
I mean, I think what took 8chan down was Cloudflare, which, you know, you need to prevent denial-of-service attacks.
Like you need it to keep in general your website up and running and they eventually are like,
oh, we don't want to be in business with you.
And so with no one to protect them, the site became unusable, right?
So, sure, it's good.
I mean, existing forum companies, keep the pressure on.
Do not be a cesspool for disaffected youth.
You are a specific company.
You should be responsible for what people are talking about.
If you have to be accused of being censorious, be accused of being censorious.
Do not let disaffected youth be on there.
But anything can serve this function here.
And so I think that's the wrong place to look.
The right place to look is what is tumbling people down into these below ground depths
of the loosely regulated web.
And that's curated conversation platforms.
That's where the concern is.
And it's almost not even worth bringing in CEOs of these companies,
which they do.
Let's bring Zuckerberg in.
Sure, you can bring Musk in.
I mean, the current administration won't bring him in,
but you could, right?
You can bring these people in.
But what can they say?
Again, it's like bringing in the head of Domino's and being like,
look, pizza's bad.
What can you do to help us out here?
And they're like, this is our business, is pizza.
What do you want us to do?
That's what we do.
We sell pizza.
Curated conversation platforms use algorithms
plus large groups of people in conversation
to generate lots of eyeballs that they monetize.
Like, that's the business, right?
So if you really want to step in here,
I don't even know what the exact solution is,
but there are a couple things that catch my attention.
I'm definitely interested in Australia's
model of a nationwide ban
on social media writ large,
and be expansive instead of contractive here,
for those 16 or younger.
We could do that.
If the slope of terribleness is inevitable, let's keep people off until, at the very least, their
brain has developed more. Because you know what makes that slope super steep? Having an adolescent
or prepubescent brain. Do you remember being a teenager? Did anything matter more to your brain
than social inclusion, social grouping, what group you're in or out of, what was going
on with your friends, pleasing your community, being part of a tribe? Those parts of your
brain get supercharged during those years of your life.
You combine that brain with algorithmic conversation platforms.
I mean, what was like a blue trail becomes a black diamond.
You just fall fast.
At the very least, yeah, let's keep them off.
We do this with lots of other things with kids.
Nicotine is very addictive.
It's super addictive in like a 12 year old's brain.
So we're like, you know, no, kids cannot consent to the lifetime addiction
they're going to get by using this.
They cannot make that decision.
They're way more prone to it.
So, no, you have to be older
to smoke cigarettes.
It messes with their brains.
Alcohol: let's wait till your brain develops
before we allow you
to, you know, drink alcohol.
You're not making good decisions yet,
so we don't want you behind
the wheel of a car
until you've gone through training
and you're older.
And sure, for all of these things,
you can argue, but no,
you're holding people back.
You don't let a 15-year-old
drive a car.
You're restricting the freedom of movement.
And you can't restrict
the freedom of movement of people in America, but you can for kids.
They can't fully consent to these things.
Why can't a kid smoke a cigarette?
Well, because their brain's not ready for it.
They can do it when they're older.
Why can't they go to an R-rated movie by themselves when they're 15?
Because we think kids aren't ready for this material or they're not going to make a good
decision on their own.
So no, you just can't do it.
I think we should do something similar for social media.
Maybe Section 230 reform.
It's a complicated issue.
There's not a simple answer here.
But Section 230
of the Communications Decency Act
has the effect
of basically giving legal cover to social media companies:
you are not liable for the material you publish.
And this is very different than, say, a newspaper.
The New York Times publishes something,
they're liable, right?
If they say something that defames someone,
they can be sued, right?
If they incite some sort of action
that is highly negative
for an individual or company, that party can sue them
and say, you know, you were wrong.
This happens, by the way.
This happens all the time, right?
The Washington Post had this issue a few years ago.
Remember the Covington kids on the Mall?
It was like a group of kids, you know, all these high schools visit D.C.
And so kids from some school were visiting D.C.
And they had bought MAGA hats, because street vendors or whoever were selling MAGA hats.
And they got into some incident with someone who was there.
I think it was like a Native American
doing some sort of protest.
And anyways,
a picture got captured
of like one of the Covington kids
with a MAGA hat
sort of in the face of like this Native American person.
And it just fit a pattern.
This was the Washington Post
at peak, sort of, you know,
peak woke time.
And the narrative was,
this kid is the kid from The Omen,
basically. Satan's spawn,
you know,
whatever. Just the worst possible thing. And the coverage went with whatever felt right on seeing this photo; they tracked down this kid, said the other person was marginalized, blah, blah, blah. The problem was, that's not really what happened. Actually, the kids were just taking a photo, and this other person was bothering them, coming over to them, confronting them, and the photo caught him at a bad moment. And the kid was like, well, you can't just say that stuff. You're liable. This became a huge issue, and the Post had to give them millions of dollars. So,
for most publishers, you have to actually care a little bit.
Like, I have to try to be accurate.
I'm liable.
And that's good.
Social media companies don't have to.
They say we're not responsible for what's published on here.
And that's section 230, more or less getting that protection.
So there have been arguments for reforming it, which would basically mean no one has those protections.
If you want to use free content, like, get a lot of people to create content on your platform,
have a billion users creating content that you then algorithmically sort and use to make money,
sure, we don't care where it comes from.
You publish it.
It causes harm?
It's on you.
Now, of course, the result of this
is the business model
for these massive social media
curated algorithmic platforms
disappears. They might not be able to
exist in such a world.
I don't think that's necessarily a bad thing.
Now, I know it's more complicated than that,
but that might be the closest
hammer you would have from like a regulatory
perspective to say, like, I think this is a bad
thing to have these
massive conversation platforms
that are algorithmically curated; you're going to create things like the slope of
terribleness.
It's just really bad for people and you can't control it.
And if you are responsible for everything that's published, then you can't have those
type of platforms.
Like, no, you're responsible.
If you want people to publish, it has to be more moderated and curated,
and it matters what's on there and you have to have standards.
And I don't think it's a bad thing.
People still come together.
They still spread ideas.
You're still accessible.
You can still express yourself.
We don't necessarily have to have, like, a billion people on the same platform to see the
benefit of the internet.
So maybe that's it.
I put an asterisk there because, look, I work on digital ethics in an academic context.
I know it's a very complicated topic.
So I'm oversimplifying it there.
But, you know, that's the type of thing that's worth looking at.
The most important thing we could do, though, is cultural.
This is why I'm trying to do what I'm doing.
Make social media not cool anymore.
Make it seem like, wait a second, wait a second.
You're like a grown man who's like spending a lot of time swiping this piece of glass
and typing in these things like other people.
Like, what are you doing?
Like, we should see it.
We should culturally change it.
When we see people that are, like, really engaged online on the social media platforms or whatever,
we should see it as if they wore a straight-up Captain Kirk costume to work.
And we're just like, oh, so Starfleet knows what's going on?
Like, this is weird and you're weird for doing this.
You have a Starfleet pin on at your job at, you know, Kinko's or whatever.
I'm going to kind of make fun of you a little bit.
Like, this is not something a grown man should be doing.
That's the way we probably need to think about social media.
There's a cultural fix here.
Change the zeitgeist on it.
That's why I'm out here doing things like the slope of terribleness.
This is not innocent.
This sucks.
You don't need to be on here.
I mean, unless you have either a lot of stock in one of these companies or you own some of the land left in Hawaii that Mark Zuckerberg hasn't bought yet,
and you think you're going to get a good price for it if he gets a little richer.
Like, if you're not in those situations, come on.
Why are we on these things?
seeing what we saw a couple of weeks ago on all sides,
the role of social media in creating the assassination of a 31-year-old dad,
the reaction on all sides was just terrible, right?
The instinct on the far left to be like,
I don't care about this at all, or
I'm celebrating it, or I have no empathy.
The reaction on the right, you know, not just the censorious stuff.
And the far right? Hey, surprise, surprise.
Far right social media.
Guess who they're blaming for Charlie Kirk's death.
Yeah, it rhymes with booze.
It's always the same people.
Somehow, like, the Jews and Israel are involved.
None of this is good.
Why are we, like, we need to marginalize people
who are spending a lot of time on these platforms
by getting off of them ourselves first.
So I don't know.
Maybe I'm biased because I've been there for a long time
and people yelled at me and I'm happy that people are listening now.
But sure.
Block it for kids.
Yes.
Phones out of schools.
Yes.
Section 230 reform? Maybe.
I'm tempted.
I'm interested.
I don't know the complexities fully, but I'm interested.
But mainly there's a cultural fix here.
It's time to walk away.
It's time to say enough is enough.
It's time to treat the people who are still, like, really on there and be like,
that doesn't make you some sort of fierce investigator or warrior or
protester.
It's the guy with the Captain Kirk pin who, you know, just showed up at the company
photo day.
It's a little bit embarrassing.
That's where I think we need to get.
It'll help our civil society.
And I can tell you, life outside of those services, you're not on those slopes at all.
It's nice up at the top of the hill.
It's sunny up there.
There's real people.
As the governor of Utah said, you can touch some grass.
It's a pretty good thing to do.
So there we go.
That's my rant for today.
Thank you for listening.
We'll be back next week with another episode.
And until then, as always, stay deep.
Hi, it's Cal here.
One more thing before you go.
If you like the Deep Questions podcast, you will love my email newsletter,
which you can sign up for at calnewport.com.
Each week I send out a new essay
about the theory or practice of living deeply.
I've been writing this newsletter since 2007
and over 70,000 subscribers get it sent to their inboxes each week.
So if you are serious about resisting the forces of distraction
and shallowness that afflict our world,
you've got to sign up for my newsletter at calnewport.com
and get some deep wisdom delivered to your inbox each week.
