Big Technology Podcast - Are We Actually Addicted To Our Phones? — With Nir Eyal
Episode Date: August 26, 2021. Nir Eyal is the bestselling author of Hooked and Indistractable. He joins Big Technology Podcast for a spirited debate over whether we're actually addicted to our phones, the ethics of app developers who use tricks to keep us coming back, and what to do about it. You can find Nir's books here: http://geni.us/Indistractable and http://geni.us/hooked Here's an Indistractable summary article: https://www.nirandfar.com/skill-of-the-future/ And a distraction guide: https://www.nirandfar.com/distractions/ And a schedule maker tool: https://nirandfar.com/schedule-maker/
Transcript
Hello and welcome to the Big Technology Podcast, a show for cool-headed, nuanced
conversation of the tech world and beyond.
Nir is in the house. He's the author of Hooked and Indistractable, two best-selling books
that dig into all the ways we get attached to apps and our phones and what to do about it.
Nir, welcome to the show.
Good to see you, man.
Hey, great to see you, Alex.
Thanks for having me.
It's great to have you here.
I think the last time we spoke, well, we spoke once in between just to talk about
scheduling for this.
But the last time we really talked, we were at a crowded bar in New York City.
And my book was about to come out.
Yours had recently come out.
And then nothing's happened since.
It's been pretty calm across the globe.
So anyway.
Oh, gosh.
I wish people could, everybody could see your expression right now.
Yeah, the palm-to-forehead sigh there is awesome. Says everything you need to know.
Well, I do want to, it does look like we're on the mend, hopefully, barring these variants. And it's good to be back in touch, and so I appreciate you doing
this. We're going to talk about phone addiction, the science of phone addiction, and what
to do about it. You've written these two books, Hooked and Indistractable. It's interesting because
we talk about tech all the time. We talk about the new products that these companies have made and all the problems they're having in Washington. But I would say one of the things that always looms underneath the surface that doesn't get discussed enough is our relationships with these phones and what they do for us and how they have us in their grasp, at least most of us, maybe not you because you've written the books. And I think that it's great to have you on because we get a chance to talk about the power that they have over us.
And the discussion about that power, which has really come to the fore now with movies such as The Social Dilemma.
So I want to start with a broad question just so we can set the table here.
How big of a problem is phone addiction right now from your perspective?
I think that the perception of phone addiction is a much bigger problem than the actual addiction. Meaning, what I think has not been widely heard is the counter-narrative to the very popular, and I think very misguided, "technology is hijacking your brain" narrative, the notion that the technology is controlling us.
We'll get to the narratives.
We'll talk about that.
But I'm curious from your perspective
because when I think about,
okay, we have to like set the level here, right?
What's the baseline that we're going to discuss
as we go through this conversation?
And from my perspective, this is actually an issue. Now, whether, you know,
there are these automatons that are hijacking our brains, we're going to go through like how much
agency we have over it. But don't you think it's an issue? I mean, I can't put my phone down for
more than, you know, five minutes. And, sure, I mean, I'm sort of asking a
leading question here, but it seems to be shortening our attention spans, increasing the distance
between us and our fellow human beings, and controlling our lives,
those of us who don't have, you know,
the wherewithal or the tools to deal with it.
So what do you think the magnitude of the problem is?
Look, I think that there's a wonderful quote by Paul Virilio,
who said that when you invent the ship,
you invent the shipwreck.
So there's no doubt in my mind that there are adverse consequences
to every new technology.
The question is, you know, is it net good
or net bad? And I don't think that that is a controversy. I think that by and large, these
technologies are a net good. I mean, can you imagine trying to go through this COVID crisis without
these tools? I mean, it would be a million times worse if we didn't have these technologies.
Imagine if this was not COVID-19. Imagine if it was COVID-90, right? If we tried to go through
this in the year 1990 versus the year 2019, I mean, it would be unimaginably worse. So...
Yeah, but COVID can't
be the answer to why technology is good.
Like, we can't just say that because we were able to Zoom, we don't really have an issue
with, you know.
Do you honestly think, can you look at me with a straight face and say that it's not a net
benefit?
Well, I mean, I'm like, well, first of all, I think that we can have a nuanced discussion
about it, but we have to leave the idea of COVID being this once, well, hopefully once
in a generation type of thing.
I think it's an illustration.
Yeah, it's one data point.
I think it's, yeah, I think it's one point.
I guess, like, yeah, I do think, I'm not a techno-pessimist.
I think that technology, well, I got to put this out there.
I do think technology is net good.
And the question we talk about on the show often is, how can it be better?
Right, exactly.
And how do you mitigate some of the downsides?
And that's why I'm such a big fan of your show is that you're about nuance.
And I think that should be the debate.
It's not technology good, technology bad.
It's, yeah, it's nuance.
It's how you're using it,
how much you're using it, who is using it,
and what would you be doing instead of using it?
So is distraction a problem?
Absolutely.
I wrote a whole book about how to become indistractable
because I think distraction is a big problem.
Is addiction to technology a big problem?
Yeah.
It is a problem.
Hey, look, any form of human suffering is a problem.
Are some people addicted?
Absolutely.
Look, some people are addicted to sniffing glue.
Some people are addicted to eating paint.
There's all kinds of addictions out there.
The question is, is technology addiction as big of a problem as people think it is?
No, I think far too many people believe that they are addicted and they're not.
They're just distracted.
But of course.
Yeah, there's really an eating-paint addiction?
I didn't know about that.
Oh, yeah.
Do you remember that show Intervention?
No, I never saw it before.
Oh, okay.
Check it out.
If you get a chance, go on YouTube, another interesting technology we can talk about.
Look at that show.
They profile all kinds of addictions.
But Nir, once I get in there, I'm not going to be able to get out.
Right. That's right.
So, I mean, I do think that, you know, there's also this interesting conversation about whether, you know, you can be addicted to nicotine.
We've had a podcast here. We've talked about Juul. And that's actual physical addiction.
But do you think that someone can be like physically addicted to technology? What's your perspective on that?
Well, there's, it's not a physical addiction. It's a behavioral addiction. And some people do get addicted. There's no doubt about it.
Right? People get addicted to all kinds of things. And so some people can become pathologically addicted. The difference is we need to understand what really is an addiction. An addiction is defined as a persistent, compulsive dependency on a behavior or substance that harms the user. It is not "I like it a lot." And we have substituted this term addiction for everything these days. You know, my wife got a box of shoes from DSW. And on the side of the box, it says, danger, addictive contents inside. Give me a break. Come on. They're
shoes. So everything has become an addiction. And I think that the reason we do that is that we like
to give up control. We like this narrative of, well, there's nothing I can do about it. It's an addiction,
right? It's not in my behavior. And of course, there are vested interests that want you to believe
that the only way out is with me. And in fact, I think that, you know, to your point, most of
these are the thinkfluencers that are saying, the only way I can lead you out is with me.
Who are the people that are saying that?
Look, you know, I was interviewed for the Social Dilemma movie.
I sat down with them for three hours and wanted to give them a path out of, hey, look, you know, I spent five years writing this book on how to become indistractable.
Is overuse of technology a problem?
Absolutely, it's a problem.
I think many people struggle with that, including me from time to time.
But it's not an addiction, it's a distraction.
And we can do something about it.
And here you go, here's exactly what to do.
I combed the research literature for five years.
Here's what we do to get out of it.
Nothing that I said,
and in fact,
nothing from anyone who countered the Social Dilemma narrative,
was included in the movie.
Why do you think that is?
It doesn't make for a good story.
Nobody likes the narrative.
You know, I like to say that The Social Dilemma
is a documentary in the same way
that Jaws was a documentary about sharks.
It's as if you go to the doctor
and the doctor says,
hey, you have a terrible disease, and I have the cure, but I'm not going to give it to you.
Are you kidding me?
Like, that is irresponsible.
What they did was to promote an agenda.
And by the way, the people in the movie paid for the movie, right?
Many people don't know that, but the movie was partially funded by people who are actually in the movie
to promote an agenda that the only way out is through us.
And I think that's very dangerous.
And when they say "through us," is it through, like, our speaking fees
and our events and our organizations?
The organization.
The Center for Humane Technology.
Okay.
Yeah.
Which, again, sounds great.
That's Tristan Harris's organization.
Right.
And look, I, I, let's start with the good here.
I think it's great that we have a critical eye towards the good and bad that technology causes.
I think that's a wonderful thing.
I'm very glad that he's making us take a critical
perspective. I think that's wonderful. There are many bad things that come out of technology.
Absolutely, right? The shipwrecks. But I think it's irresponsible to try and show one side of the
story, especially when, right, when you have a movie about filter bubbles, about how terrible
social media is because it promotes these filter bubbles. And yet this movie is one big filter
bubble. Oh yeah. On Netflix, I mean, how ironic is that? Using all the psychological tricks,
all the, you know, the caricatures of voodoo dolls and these algorithms,
these evil algorithms that are trying to, you know, hijack your brain,
they use the exact same psychology that they bemoan on social media,
all the filter bubbles, not offering any other perspective.
And I think that's incredibly irresponsible.
Yeah, and there was some selective editing,
which we found out on this show in the conversation with Tristan.
So that was interesting.
But I'm curious why you think that resonated so much
with so many people. I mean, The Social Dilemma as a movie has been watched by so many people,
people who don't care about tech in general. And I've heard this from friends and friends of friends.
When they talk about technology, they say, oh, have you seen the movie?
Obviously, there's a belief among people, this was sort of what we were getting at before,
you know, that this is a problem. And people want you to believe that you don't have agency over it.
But obviously, as we've been talking about, it does resonate with people.
I'm going back to the original question I asked at the beginning here because I think it's important.
There's a definite problem going on here, don't you think?
If it resonated with all these people, if these people said, yes, that makes sense to me.
Clearly, they feel like technology is controlling them in ways that are beyond their ability to get out.
Look, we love conspiracy theories.
Of course, but we also, I mean,
yes, we like the story, but we also live this stuff.
Totally.
Conspiracy theories spread because they're a manifestation of confirmation bias.
So historically, there is a moral panic about every new technology.
I mean, if you take Tristan's words that he said in front of Congress,
those are, some of them are verbatim, verbatim, what was said about comic books in the 1950s,
literally word for word.
It's the children that need to be protected, right?
It's these vulnerable populations.
It's hijacking our brains.
Like literally, word for word, what was said about comic books, what was said about the television, what was said about the radio, what was said about novels, what was said about the bicycle.
There's a beautiful scene in The Social Dilemma where Tristan says nobody freaked out about the bicycle.
It's just this technology.
In fact, it's exactly the opposite of what happened.
People did freak out about the bicycle, about how women having this freedom to ride around wherever they pleased was a terrible thing, that their minds
would turn to mush, that they would get bicycle face. I'm not making this up. This really happened.
And so every new technology, we have this moral panic about, especially when it comes to
save the women and children. This is what we hear time and time again, especially when it's a
technology that enables people who were previously unable to get their views out there. Now they have
a medium to go around centralized media. That's a huge threat. And look, the reason we hear
so much about how terrible technology is in the mainstream media is that they are a threat.
You know, the movie talked about how it was Facebook and Twitter and Instagram that are
sucking our attention and monetizing our eyeballs. But what exactly is CNN's business model?
What's the New York Times business model? What is Fox News's business model? They all monetize
your attention. This is nothing new. It's not the business model that's so dramatically different.
Well, let's pause there. Okay. You know, I've heard this argument
a lot. And it's different because, you know, do some mainstream media
organizations play for your attention? For sure. But I don't think that's the only value that they
have. Whereas I do think that's...
I didn't say that.
Yeah, but I do think that some of these apps
are very clear that their North Star is completely all about engagement, in my perspective.
Well, it's, you know, of course it is. Yeah. Right. Do you not think that Netflix, or,
okay, let's go back, HBO, or let's go back, CBS, or, I mean, any form of media? I mean, we've been here before. Look at the days of yellow journalism. Everybody who sells ads is monetizing your eyeballs, right? By the way, just to be clear, I don't have a problem with that business model. There's nothing ethically wrong with selling ads per se. I think the hypocrisy is saying, oh, only this form of advertising is wrong, without saying, wait a minute, wait a minute, what are the politics
of the crap you hear on Fox News or the crap you hear in other publications?
And so as a justification for your earlier question of, okay, why is that?
Why do we hear only the bad side?
It's because this new form of media is a direct threat to traditional forms of media.
Yeah, I have a bone to pick with that one, because I do think that, like, the traditional
press does at some points decide that it doesn't want to optimize for, you know, for views.
Otherwise, like, think about what the New York Times could do if it wanted to just be a, you know, a news site that optimizes for views on the internet.
There'd be a lot of stuff that it wouldn't cover.
Think about, I'll explain like even this podcast, right?
Like, if I wanted to, there's certain tricks that I could use to sort of play up the engagement and get more ad dollars.
But I choose not to, as opposed to some of the platforms, who need to show that growth quarter by quarter.
I'm willing to sacrifice that growth in order to do a show that aligns with the values that I have here.
But a lot of these tech platforms, I don't think they are.
So you're saying that because an organization, a media organization, is prudent in terms of not doing everything they could potentially do to maximize engagement, that that shows that they are ethical.
I think that they do not have the same motivations as some of these tech platforms.
and while they do, I don't doubt that like, for instance, some people in the news industry
think Facebook is a threat to their business and that motivates some of their anti-Facebook
coverage.
I also think that there are some real issues with phone addiction in particular that aren't
driven by...
Let's just stay on track, Alex.
So you were saying that, hey, there's all...
Let me, you've got to let me finish the sentence.
No, but you're going off track.
You're going off track.
You said there's things that I do as a podcast.
that I could do to juice engagement that I don't do.
And you're saying that's what the New York Times does as well.
Therefore, they're more ethical than the social media companies.
I was on track.
I think the question was, what we were talking about was whether that causes them to cover tech negatively, right?
Wasn't that the assertion?
I think there are, that certainly is a motivation.
Yeah.
That there is a negativity bias to look
at the downside of tech without mentioning anywhere near, I think, a balanced perspective or even
an empowering perspective on what we could do about it. And to emphasize your point earlier,
we know that the tech companies, and by the way, I'm not an apologist for the tech companies.
If people want to uninstall Facebook, go for it. I want you to. If you don't find it useful,
screw them, right? Uninstall them. But you can't say that these companies are doing everything they can
for engagement or to have that innuendo that, oh, all they care about is using engagement.
You know, Facebook hired 10,000 content moderators.
They've spent billions of dollars collectively between the social media companies doing things
that reduce engagement by taking off content that is inappropriate or is long term a bad
thing for their platform.
So we can't say they're doing nothing.
They are definitely trying.
Are they doing everything they could be doing?
Absolutely not.
Is there more they could be doing?
Absolutely.
But they are also not just looking for every buck by leaving up content that they think is going to be dangerous.
They have taken down all kinds of content that otherwise could make them money.
They have definitely – you can't say they're doing nothing.
They're definitely doing a lot to be more responsible.
No, I wouldn't say they're doing nothing.
I'd say that engagement is their North Star, whereas in other places, it's not.
And, you know, I do think that they'll take down as much as they can in order to stay in business without getting a
backlash from Congress, or with as little regulation as possible.
But in terms of engagement as the North Star, I mean, I'll just quote you from one of the talks that you gave on Hooked, which was the book about how products can get people coming back, can hook them, before you wrote Indistractable.
So you said growth hacking is important.
And if you can't retain users, if you can't keep them coming back, you've got nothing.
And engagement is more important than growth, all about modulating people's moods.
So even you said it, that it's the North Star for people.
Absolutely, because when we work with a product like a fitness app, like Fitbod, that uses
the hook model to get people to exercise, absolutely, we want them to keep coming back.
A company I invested in, Kahoot, is an education startup where we want people to, we want kids
to get hooked on education.
So the idea behind my work with Hooked is, I didn't write this book for the social
media companies.
That's where I learned them; I stole their secrets.
The idea was to democratize these techniques so that the rest of us can build products and services that are engaging and habit-forming, to build good habits in people's lives.
And so what I'm getting at here is that there are different motivations for a news site and a tech company, and to say that they're all aiming for the same thing just because they're ad-supported, I think, is a little off base.
I don't think so.
I'm pretty sure that everybody who sells ads, I mean, if you look at what is
the most important metric for Fox News or CNN,
it's the ratings. It's all about the ratings,
whether it's, you know, Seinfeld or Friends.
I don't know, like, this is nothing new.
Companies that make content that is monetized
by ads care about engagement.
There is no other metric. And frankly,
what is the alternative? We want anybody who makes
interesting content to make that content engaging. Do we really want to tell the television stations,
hey, your television shows are too good. Please make your shows more boring. Do we want to tell Apple,
hey, Apple, your phones are way too user-friendly. Please make them less engaging. That makes no sense.
And so some of the proposals we see today, you know, Senator Josh Hawley had a proposal
around, you know, banning the infinite scroll. He thinks that by taking away some of these
tactics, we're going to reduce, I don't know what the goal is. He wants
to make them essentially worse. He wants, you know, part of the proposal is a pop-up every 30
minutes that you use social media that says, hey, do you want to keep using it? You've been
using it for 30 minutes. Is that really the solution? Is that really what we want? I don't think
we can legislate away a product's ability to be good, to be engaging. That is why we engage
with these products and services. So I think that's very troublesome. The
reason this is done, by the way, is that the entire narrative of much of this regulation,
and again, I want to stay in my swim lane, so I'm not an expert in terms of monopoly status
or content moderation. There's lots of things that we have to hold these companies accountable for.
I'm not a tech apologist. But when it comes to this argument around, hey, we have to pass this
legislation, I'm your savior, my organization is going to get you out of this because you can't
stop, because you're addicted, I think that much of this argument rests on this idea that,
you know, you are a powerless consumer.
You can't do anything about it.
That's just not true.
Yeah.
No, I agree with you on that front.
But again, I'm just going to go back to what I said before, and we can move on.
But look, I don't think that, like, the choice is boring or, you know, engaging.
Like, I don't think that's the tradeoff.
Like, you can be ethical and interesting.
And you can sort of, you know, you
can take a lot of your principles and throw them out the window and be the most engaging
news site on the internet if you want. But there are tradeoffs there. So I don't know how we got
into this spiral. But let's get out of it. Why don't we take a break? We'll be back right after
this with Nir. I want to talk a little bit more about the sequence of his books. And then maybe
we can talk about some tools that people can use to unhook themselves from the products we're hooked
on. We'll be back right after this on the big technology podcast. Hey, everyone. Let me tell you about
The Hustle Daily Show, a podcast filled with business, tech news, and original stories to keep
you in the loop on what's trending. More than 2 million professionals read The Hustle's daily
email for its irreverent and informative takes on business and tech news. Now, they have a daily
podcast called The Hustle Daily Show, where their team of writers break down the biggest business
headlines in 15 minutes or less, and explain why you should care about them. So, search for
The Hustle Daily Show in your favorite podcast app, like the one you're using right now.
And we're back here on the Big Technology Podcast with Nir Eyal, the best-selling author of Hooked and Indistractable.
The first book, Hooked, is all about teaching product people how to hook us on their products.
And the second book is teaching us how to unhook ourselves from maybe some of the apps that, you know, the people who read Nir's first book hooked us on.
So, Nir, I have a question for you.
Why Hooked and then Indistractable?
Can I respond to that first of all?
Sure.
It's not the same products.
I've heard this argument before, and this is something I want to talk to you about.
So let's hear you explain it.
Yeah, so Hooked was not written for the usual suspects, Facebook, Twitter, Instagram, WhatsApp, Slack, Snapchat, the video game companies.
They know these techniques.
That's where I learned them.
That's not who Hooked was for, and that's not something that I promote.
I don't work for any of those companies; I never have.
And Hooked is written for the rest
of us. It's written for the tech entrepreneurs who want to use the very same psychology to build
healthy habits. And that's exactly what's happened. Since Hooked was published, we sold over half a
million copies, and it's been utilized by the kind of companies where I put my money where my mouth is,
where I invest. These are health tech companies, fintech companies; ed tech is a very
big category. These are companies trying to make products and services that are engaging,
that would improve people's lives if they just use the product. Whereas Indistractable, if Hooked
is about building good habits, then Indistractable is about how do we break those bad habits and
keep the technology, right? We want to use the technology in a way that benefits us, that it serves
us rather than us serving it. And I think we can have our cake and eat it too, that in fact,
we can build good habits with our technology, but we can also break the bad habits, but it's not the
same products, right? Nobody's trying to break their addiction to Duolingo, right, a product
that uses the hook model to help people learn languages. These are different products.
Don't you think this is sort of like the good guy with the gun argument where like, you know, I mean, I don't think it's in the same category at all, but like gun manufacturers are like, we're making these for the good guys using the guns and the, you know, and they never really talk about the bad uses of the weapons.
And yes, of course, you probably wrote hooked with, you know, some of these good apps in mind.
But I don't think there's any question.
You sold a half million copies of Hooked.
I don't think there's any question that people used this stuff
for bad. And you talk about fintech. I mean, I don't for a second think that folks at Robinhood
didn't start to think about some of the things that you talk about, like the variable rewards
model, which you described as very similar to a casino slot machine, as a way to hook people to do
things like day trade. And there are probably other apps that picked up the book and used it for
other stuff that people would prefer not to be addicted to. So obviously, this wasn't used entirely
for virtuous purposes. Is that fair to say? Well, look, these techniques have been around for a very
long time. What I wanted to do is to put them in a version that the rest of us can use, right? The
alternative is what exactly? This is the thing that I think is oftentimes left out of the conversation
is that these things don't exist in a vacuum.
It's a very simplistic, easy argument.
But think about it, what would have happened
if we didn't have a book like Hooked?
Right?
The bad guys who know these techniques,
and keep using these techniques,
that's where I learned these techniques,
would keep using them.
Meanwhile, the rest of us,
who could use them for good
wouldn't have access to them.
So again, what's the alternative?
The alternative is that they stayed secret and locked up
versus, hey, we can have a lot of good in
the world if we can use these for patient adherence, for helping people save money in fintech,
for helping people learn new languages, for ed tech. This is not trivial. We have touched
the lives of millions of people by helping them build healthy habits through the technology
they use. And sunlight is the best disinfectant. When I write in a book
like this that these techniques can be used for good or ill, there's a whole chapter in the
original version of Hooked called The Morality of Manipulation, right?
I didn't gloss past that. I knew that there are ethical implications. I'm not an idiot. Of course
there are ethical implications, which is why I wrote an entire chapter in the book on how to use
this ethically. So this is something that I wanted to get out there, that I wanted people to use
for good, but that they also know that there are some guardrails here on how to use this stuff
in an ethical manner. Why not just write Indistractable? Instead of saying, everyone's
going to be doing this, let's give the secrets of some of the bad guys to the good guys, and there
might be some collateral damage along the way, why not just instead, you know, figure
out what you've learned about the psychology and skip Hooked and write Indistractable?
Well, because, look, when I first wrote Hooked, the publisher wanted to call it
How to Build Addictive Products. And I said, absolutely not. Because even though it might sell
better, that is not what the book is about. The book is not about how to build addictive products;
the book is about how to build habit-forming products.
What's the difference between a habit and an addiction?
An addiction is a persistent compulsive dependency on a behavior or substance that harms the user.
You would never want to build for an addiction.
A habit, on the other hand, is an impulse to do a behavior with little or no conscious thought.
And we have good habits, many, many good habits, as well as bad habits.
So, Hooked is not how to build addictive products.
I can always tell when someone hasn't really read the book because they say,
oh, your book is about addicting people.
No, I talk about the difference between addictions and habits in the book.
Hooked is about how to build habits.
That's a big, big difference, and it's about how to build good habits.
So, again, back to that wonderful Paul Virilio quote: when you invent the ship, you invent the shipwreck.
If, for every technology, we stopped and said, well, we can't do that because there might be potential harm,
We would never progress.
Every new technology has downsides.
That's part of the process of innovation.
So what we do, Alex, is what we have always done as a species.
When there is a new technology that has goods and bads, we do two things.
We adapt and we adopt.
We adapt our behaviors, which is what Indistractable is all about.
It's about adapting our behavior to cope with a new reality.
And we adopt new technologies to fix the last generation of technologies.
So I work with many companies that use the methodologies in Hooked to get people unhooked
from the distractions in their life.
Right.
So we adopt new technology.
This is what we've always done as a species.
We adopt new technologies to fix
the bad aspects of the last generation.
No doubt.
And by the way, like, you know, even though you called Hooked Hooked and not How to Build
Addictive Products, I'm sure people built addictive products from Hooked after reading it, don't you think?
Look.
A hammer is a tool that can be used to build a home or bash someone's brain in.
Yeah.
So tools can be used, misused, abused.
But that doesn't mean that hammers aren't incredibly helpful.
So if our standard was, hey, let's never innovate, let's never write a book, let's never say
anything because it might be misconstrued, well, then we'll all be bound and gagged.
Yeah.
But then, again, like, think about, why do Hooked first and then Indistractable
second?
Because we can get the good aspects while dumping the bad aspects.
Okay.
I think we'll have a stalemate on this one.
Well, what do you recommend?
I mean, no, I can't tell if you're poking the box to
actually get an answer, or do you have an opinion here? What do you think? Personally, I think
Indistractable is a valuable book. And I think it's good. I think I agree with you on a lot of
this stuff, right? That there's been too much of a black and white discussion about what
technology does to our brains. And we have been portrayed. And I spoke about it on the podcast
before. We have been portrayed as people without any agency here. I think that's
something that you and I both have a bond over.
And so I think Indistractable is good and the message is good.
But look, I feel like you could write the manual hoping that only the good
guys are going to use it, but there's, you know, without a doubt, like, it's not just an oral
tradition among the bad guys who pass it down to one another.
People, I'm sure, read the book and, you know, thought about ways that they could, you know,
create new products using these addictive techniques and use them for bad.
And I think that the proliferation of these techniques, you know,
throughout the industry doesn't seem, you know, net constructive to me, from my perspective.
Hmm. So I want to thank you, first of all, because I think you give my work way more power
than maybe it deserves. I don't want to shoot myself in the foot here, but I'll tell you, Alex,
that, look, the techniques in Hooked are good.
They're not that good.
They're not that good.
This is not mind control.
I mean, what are we talking about here?
We're talking about variable rewards, okay?
Like, we're talking about making something user-friendly.
We're talking about improving the product with use.
I mean, the techniques I talk about in Hooked are very powerful.
And I appreciate that you think that they're so powerful.
But I wrote the book and I'll tell you they're not that good.
They're not that good.
They work.
They're effective at building habits.
they will not cause hijacking of brains.
Nobody is getting addicted to SaaS software, Alex.
Nobody is getting addicted to educational software.
You know, I don't think that there's any mind hijacking going on.
So what's the danger?
So what's the danger?
I think the danger is when it ends up going into things like fintech apps
that get people to day trade and end up getting these variable rewards,
and it makes it addictive.
And they lose money from it.
Now, I think that apps that democratize finance are good.
And it pisses me off, honestly, that the everyday person doesn't get a chance to trade in the same way that an institutional investor does.
That said, there's got to be a level of responsibility that these apps use so that they don't encourage people to just, you know, trade their way into debt.
Totally.
And the same thing goes, I think the same thing goes for gaming companies. I don't think,
I mean, I like to play video games, I think they're good, but I also think that, like, there's a line
that needs to be drawn to, you know, make sure that people don't have their lives waste away, you know,
inside a video game.
Totally.
Now, I don't think it overpowers them, you know. I think that, like, you can
walk into a casino, just to use your analogy, and decide, okay, I don't want to play the slot machine anymore.
But at the end of the day, like the casinos have these well-worn techniques because they work at scale.
And there are some people that will become addicted to gambling because of it.
So that's sort of where my worry is.
And by the way, I'm willing to change my mind.
So let's talk about it and see where it goes.
So this is what I always love whenever I listen to you on a podcast: you are on the side of nuance.
And I always hear that from you.
And one of the things you always say, it's like maybe it's your catchphrase, is we should take it on a
case-by-case basis. I always hear you saying that. When it comes to monopoly regulation,
it should be antitrust on a case-by-case basis. I always hear you saying that. And it's the same thing
when it comes to this discussion, that when we have these blanket statements of, you know,
people are getting addicted and it's harming people, well, it's just never that simple. So one thing
that I have proposed for years now, and in fact, this has probably been six or seven years now,
I've been meeting with the tech companies to try and implore them to do this. And frankly, I'm at my
wits' end. I think, you know, I'm not a big fan of government overreach and regulation,
but I am in support of regulation that would do the following, which is create a use and
abuse policy. And I'll send you this article. I wrote about it seven years ago, where what I want
companies to do is to say, look, for the people who are actually addicted, for the pathological
addicts, they deserve protection. We have special protected classes in society, children, for
example, right? There are certain things that children are not able to do. My 13-year-old
daughter can't walk into a bar and order a gin and tonic, right? She can't walk into a casino and start
making bets on the blackjack table. There are certain things that children are protected from
because the theory is that, you know, they're not of sound mind and body. They can't make
their own decisions at that age, which is true. People who are pathologically addicted also
deserve protection. So the difference between these products and other products that
can potentially addict people is that these companies know how much we use them. So an alcohol
manufacturer can't know who the alcoholics are. How could they know? Right? There's no direct
connection. Tech companies, gaming companies, they know, right? They have personal identifiable information.
So what I've been proposing for years now is to create a use and abuse policy. Give me some kind of
number that over a certain number of hours a week, we're going to reach out to you. Some kind of
circuit breaker is going to be tripped and we're going to reach out to you and say, hey,
we see that you are using our product in a way that may indicate you're struggling with
an addiction. Can we help? Right? I think that should be part of their policy in their service
agreement, right? Some number of hours, 30 hours a week, 40, whatever it is, give me some kind of
number. So those people do deserve special protection. The rest of us, leave us alone. We don't need
that type of coddling and protection. We don't need that, you know, because
the vast majority of us are not addicted.
So what we need to do is to look for those people who do deserve special protection,
protect them, do something for those folks.
But the rest of us, if you're not a child or if you're not pathologically addicted,
this is not an addiction.
It's simply a distraction.
And it is something we can do a lot about.
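Editor's note: to make the "use and abuse policy" Nir describes concrete, here is a minimal sketch of the kind of weekly-usage circuit breaker he's proposing. Everything in it is a hypothetical illustration: the 30-hour threshold, the WeeklyUsage record, and the send_message helper are assumptions made for this sketch, not anything an actual platform runs.

```python
from dataclasses import dataclass

# Hypothetical weekly-usage threshold (hours). Nir suggests "30 hours a week,
# 40, whatever it is" -- the exact number would be up to the platform.
ABUSE_THRESHOLD_HOURS = 30.0

@dataclass
class WeeklyUsage:
    user_id: str
    hours_this_week: float

def breaker_tripped(usage: WeeklyUsage) -> bool:
    """Return True if this user's weekly usage trips the circuit breaker."""
    return usage.hours_this_week >= ABUSE_THRESHOLD_HOURS

def send_message(user_id: str, text: str) -> None:
    # Placeholder for whatever outreach channel a real service would use
    # (email, push notification, in-app prompt).
    print(f"[to {user_id}] {text}")

def maybe_reach_out(usage: WeeklyUsage) -> None:
    # Per the policy described above: if the breaker trips, reach out
    # and offer help rather than simply maximizing engagement.
    if breaker_tripped(usage):
        send_message(
            usage.user_id,
            f"We noticed you've spent over {ABUSE_THRESHOLD_HOURS:.0f} hours "
            "here this week. That can be a sign of struggling with overuse. "
            "Can we help?",
        )

if __name__ == "__main__":
    maybe_reach_out(WeeklyUsage(user_id="example_user", hours_this_week=34.5))
```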
Yeah.
No, I agree with that.
I think that's smart.
And, you know, I do think that, yeah, it comes down to the developers of these products,
actually developing them in a way that treats their users well. I don't think it's been done across the board, but I like this solution.
Yeah, I think it gives the best of both worlds, right? So that for most of us who just use this
recreationally and once in a while, there's nothing wrong with it. There's nothing wrong with
Facebook or YouTube or Instagram. And by the way, another thing that you've seen over the past
few years that they've done, I've been very glad to see it, is that these companies are
putting in tools to help us moderate our use of these products, right? You know, Apple has
Screen Time,
Google has Digital Wellbeing,
Instagram now has stopping cues.
There's all kinds of things these tech companies are doing.
Why? I mean, how many products do you know
that build into the product a way to use them less?
It reminds me of what happened in the automobile industry.
Seventeen years before any law was passed
that made auto manufacturers put seatbelts in cars,
the manufacturers themselves started to do this.
Why? Because people buy safer cars.
That's a feature. They like that.
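Editor's note: the stopping cues discussed here (TikTok's take-a-break prompt comes up just below) boil down to a simple per-session timer. A minimal sketch of that pattern follows; the 45-minute threshold and the class name are illustrative assumptions for the example, not any app's real implementation.

```python
import time

# Illustrative threshold: suggest a break after 45 minutes of continuous
# viewing in one session. Real apps choose their own (undisclosed) values.
SESSION_BREAK_AFTER_SECONDS = 45 * 60

class StoppingCue:
    """Tracks one viewing session and decides when to show a 'take a break' prompt."""

    def __init__(self) -> None:
        self.session_start = time.monotonic()
        self.prompt_shown = False

    def should_prompt(self) -> bool:
        # True once the session has run long enough and we haven't interrupted yet.
        elapsed = time.monotonic() - self.session_start
        return elapsed >= SESSION_BREAK_AFTER_SECONDS and not self.prompt_shown

    def on_prompt_shown(self) -> None:
        # Interrupt only once, then restart the clock if the user keeps scrolling.
        self.prompt_shown = True
        self.session_start = time.monotonic()

# Usage sketch: a feed loop would call should_prompt() between items and
# render a break screen whenever it returns True.
```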
Yeah, TikTok has, I think at a certain point, if you watch enough TikTok, they'll tell you to take a break.
They have, yeah, exactly, a stopping cue.
And I think that's fantastic.
But you have to watch a lot of TikTok in order to get that.
I know, I've watched a lot of TikTok in my day and I've encountered that.
Yeah, I heard you uninstalled the app.
I did.
I did it again.
I did the same.
I did the same.
Yeah, I got super addicted to it.
I think that, and that one was one where I uninstalled it and didn't really feel compelled to go back.
I've tried to do it with Twitter about a thousand times, and I can't get rid of it.
And I do wonder if I meet that definition of addiction where it does become harmful.
But the good news is that you can use the whole, maybe we can talk about this next.
There's all kinds of things you can do.
You don't have to necessarily uninstall every app.
You know, for many of us, that's why I wrote Indistractable, because I was very frustrated with the tech critic perspective of, you know, just stop using the technology,
the technology is melting your brain. Well, that's not really practical, right? Many of us will get
fired if we don't check email. We'll get fired if we're not on Twitter. We'll get fired if we're
not checking social media. It's part of our job. So I don't think that's a good solution. So I wanted to
offer a more practical solution for how to become indistractable. All right. We'll do that after
this break. But let's just go back to the question that we sort of got sidetracked on in the
first half. Pardon me, I ended up with a long-winded defense of the press. We got into our
back and forth. But I'm curious why you think it is
that the discussion is so black and white around this stuff.
And, yeah, I mean, I wrote this down ahead of time.
But, like, you know, some tech critics seem to just take all of the ills of society
and blame it directly on tech.
And it isn't nuanced.
So what's going on there?
We like simple stories.
I mean, I think it's, we jump to the conclusions of black and white thinking.
And that's not real life.
Again, I don't think that tech is guilt-free.
There's lots of things that the tech industry has done wrong.
Lots and lots of things that they've done wrong.
But that doesn't mean that there isn't nuance here.
That's the answer.
It's not tech good, tech bad.
It's nuance.
Yeah.
All right.
Well, more of that coming up after the break;
maybe you can help me with my Twitter addiction.
That would be nice.
Or whatever you want to call it.
My life being hooked to Twitter.
or the habit. We'll be back here on the Big Technology Podcast right after this with
Nir Eyal. And we're back here for our third, probably final, who knows, we're having a good
time, segment with Nir Eyal, the author of Hooked and Indistractable. Nir, are you still in
Singapore? Where are you now? I am. Yeah, I'm in Singapore. Yeah. So I can, I don't know if I was
allowed to share that, but you can tell me if I should delete this afterwards. But you've had an
interesting pandemic experience. So do you want to share a little bit about that?
Yeah, which aspect? Coming out here, you mean? Yeah, I mean, yeah, what propelled you to go to
Singapore and are you going to come back? Yes. What's it been like out there? It's been,
it's been really interesting. I have to say, I didn't, I've been here before several times on
business, but never stayed for very long. And this time, we've been here for about a year and a
half. And it's really grown on us. We really like it. It's a very interesting
culture, and, yeah, we've really taken to it.
Nice.
We'll be back in the States probably next year.
Was it like a COVID hiatus? And what's the story like out there right now?
It started as a COVID hiatus, and then, you know, we could have gone back a long time ago,
but we kind of fell in love with it since we got here.
And what, oh, what have you fallen in love with?
It's, um, you know, it's interesting seeing how many of the
problems that we struggle with in America, they have figured out. Health care is very good
and much more affordable than it is in the States. I mean, I pay for health insurance here for a
year, what I paid in New York for a month of health insurance for my family. Housing, you know,
they have highly subsidized housing for citizens. There's virtually no homelessness. There's almost
no crime. There is no graffiti. Everything runs on time. There are robots
cleaning the streets. There's Racial Harmony Day. A few days ago, they had Racial Harmony Day,
where what you do as part of Racial Harmony Day is you dress up in other people's ethnic
garb. You know, it's essentially state-sanctioned cultural appropriation. It would never happen
in America, but it's just so interesting seeing.
No, no, no, that would not happen.
It's just so interesting how many problems it seems like they have overcome. Not that it's a Shangri-La. There are, of course,
problems here too, but it's very interesting. It's very tech progressive. It actually feels to me,
what Silicon Valley felt like
when I moved out there back in 2006.
A lot of optimism, a lot of opportunity.
Yeah, it's a really exciting place.
Do you think there's anything that you've seen out there?
Now we're way off track, but I'm sort of going to just keep asking those questions.
Is there anything that you've seen out there, of the things that you listed,
that you think could be implemented here in the U.S.?
Might translate well?
I think a lot of the things here.
You know, there are always kind of excuses for why
Singapore is an exception.
It's a small country.
It's a this, it's a that.
I don't know.
There's all kinds of reasons.
But I think what Singapore gives us is evidence that there are political solutions to
these problems.
Like with healthcare, for example.
You go into a doctor's office here.
It's very affordable.
Everything starts on time.
There's a pharmacy built into the doctor's office.
What a concept, right?
In America, the doctor gives you a prescription.
Then you've got to go to a different pharmacy.
Well, here they put them all together.
Yeah.
Well, putting some of that friction,
with pharmaceuticals and doctors, it's probably a good idea in this country,
given what I've gone through.
That's good, too.
That's another podcast.
That's another podcast.
Yeah, but it is very interesting seeing that, you know, these problems that we think,
oh, are just endemic to a modern civilization, don't necessarily have to be.
Not that every solution translates, but I'm sure some of them would.
Why Singapore, did you always have that on your mind?
Yeah, how did that one come up?
So, honestly, I typed in, so it was
March 11th, and we were in Midtown Manhattan. If we weren't in Midtown, we probably
wouldn't have left. We probably would have just stayed. But given that we were in Midtown and we could
see that, you know, people were not taking this seriously. And then Trump came on TV on March 11th and
said, okay, we've got this Corona thing under control. We're going to cut off trade with Europe.
Do you remember that's what he said? That was his big plan. We're going to cut off trade with
Europe. And I said, this is not going to end well. Not to mention the mayor in New York was
saying, get in your last hurrah, go to the gym.
Yeah.
We went to the gym as we were about to have this massive wave.
Oh, crazy.
And I remember.
So yeah, what did you type in?
Yeah.
I typed in best place to go during a pandemic into Google.
Really?
Yeah.
No way.
And it was Singapore, Taiwan, and I think New Zealand.
And I think it was some website that had ranked these countries on pandemic
preparedness.
And I remember at the time, too, we were wearing masks,
and people were, like, you know, walking down the streets in Manhattan and saying,
oh, you don't need a mask.
Why are you wearing a mask?
And, you know, as soon as we get to Asia, everybody is wearing a mask.
And one of the first things I did when I got here was type out a blog post that said,
you know, at the time, the CDC, WHO said, don't wear masks.
And I said, no, no, no, like look at what's going on here in Asia.
Everybody's wearing a mask.
We got off the plane in our connector flight in Japan.
And my daughter turned to me and said, look, everybody here is wearing a mask,
as opposed to when we got on the plane in Newark.
And this is March 11th of last year.
People were hugging and kissing and shaking hands.
Everything was fine.
Barely anybody was wearing a mask.
And here everybody was.
I think actually this brings us back to this conversation on technology.
You know, at the time I posted this article on my blog and I posted on social media
and I got so much blowback saying, don't tell people to wear masks.
That's not what the CDC says.
That's not the official line.
And then, of course, we all know what happened.
A few weeks later, they changed their mind and said, yeah,
you should be wearing a mask.
But I think this is an important part of the discussion around, you know, content moderation that, yeah, there's a lot of crazy stuff on social media.
There's a lot of people who are espousing silliness.
But there's also, for every crazy, there are ideas every once in a while that turn out to be correct.
You know, there are people who have a different perspective that should be aired and would not otherwise be aired
were it not for these social channels.
Oh, I agree 100%.
The question...
Well, actually, you know, it's true.
The question is, you know, what do they end up incentivizing?
But this idea that alternative points of view or that, like, folks shouldn't be able to talk about the origins of coronavirus and get, have that stuff moderated out, I think is a little bit absurd.
Yeah, I totally agree.
And I think, you know, we tend to think that, you know, we want moderation for thee, but not for me.
Like, only moderate the opinions I disagree with.
And that's not...
That's for sure.
Yeah.
Yeah, exactly.
And when everybody's pissed off at you for
posting opinions they don't like, it probably means you're doing something right.
That's probably not such a bad thing if both sides say, you're biased.
Well, I mean, you know, I think I've talked about this on the show once.
I brought that up to Ben Smith once, who was the editor of BuzzFeed, now he's the media columnist
at the New York Times, that if everyone's mad at you, you might be doing something right.
He said, yes, or you're a troll, which I think really is the case for social media.
Could be, good, but this is the process.
It's messy.
Yeah.
It's, you know, what is that Voltaire quote?
I agree.
I don't agree with
what you say, but I'll die for your right to say it. We need to remember that. You know,
it's part of the process of getting to truth. When it comes to these murky times,
these murky waters that we're swimming in right now, there's, yeah, some nonsense along
with the truth.
Well, I mean, we're going to go in a different
direction now, but let's do it a little bit. Don't you, you have to draw a line somewhere.
You do. I mean, even the founder of 8chan, or 8kun, as it's called now,
Fred Brennan, is all for shutting the site down because it's become too toxic.
Yeah, absolutely.
You can't let, like, people, you know, spread terrorist recruitment messages and child porn.
Totally.
It's a matter of what Voltaire is going to die for, you know, what you can say.
Not everything, right.
You can't say anything.
Yeah, you can't, you know, shout fire in a crowded theater.
For some of the stuff.
And that's where we do.
But we have these lines long established in jurisprudence around what you can say, right?
You can't shout fire in a crowded theater that even though it's, you know, free speech.
No, there are restrictions.
But I think, you know, the debate, the stuff that people are getting angry over, you may not agree with it.
But a lot of this stuff is not inciting violence.
A lot of the stuff is, you know, anything that does incite violence, of course, we have existing laws that protect us against, you know, child pornography and inciting violence.
Those restrictions are clear.
I think that's not what people are pissed off about right now.
Well, yeah, a large part, right?
It's a question of, I don't know, where are some of
these fringe movements going? Some of them are going to be good and tell us stuff we didn't know.
Some of them are going to be terrible and inspire, you know, death, I mean, and animus.
And so can you can you get the good and mitigate the bad, I think, is the ultimate question about these, these companies and their content moderation policies.
Yeah.
Yeah.
I think, look, you know, societies and individuals in those societies have the right to be wrong many times, right?
I think there are.
Of course.
Yeah.
I think, look, I think there's a
difference between good-faith efforts and intentional misinformation, right?
State actors, you know, Russians putting out disinformation, that's, okay, that's not done
with a clear conscience. That has a direct political motive. And I think that, you know,
I think that should be taken care of and done away with. But, you know, mom and pop posting
opinions that other people might not agree with, I don't think that's necessarily
something people need to moderate.
Yeah, of course. And, I mean, you know, the last person,
well, the most prominent person, that made that argument was Mark Zuckerberg,
right before he said that Holocaust deniers are just getting it wrong, versus using that lie,
yeah, for political purposes.
What do you think about?
I mean, this has been an issue for a very, very long time, Holocaust denial.
I mean, I'm in the category of, I do think people should be free to deny the Holocaust if they so choose.
I don't think that should be banned speech per se.
In Germany, it is, but that's not banned in America.
I wouldn't ban it.
But I also wouldn't use it as an example for people getting stuff wrong because I think oftentimes it's very intentional.
And I also would say that, you know, what's that phrase?
Freedom of speech is not freedom of reach.
I don't think you need to throttle that kind of content forward to show it to more people.
You can throttle it back as opposed to throttling it forward.
Well, that's become a hot, a hot phrase.
Personally, I think that like, you know, freedom of speech, freedom of reach, like the whole debate is somewhat misframed.
because I think that Facebook is an editor at the end of the day, and it's making choices about
what we see. And so, you know, people can post on, I mean, yeah, people can post on it all day
long, but ultimately it will decide what's in the News Feed, you know. And so people are saying,
I mean, I guess it sort of is along this line of freedom of speech and freedom of reach, but
in a more clear-headed view, it should be, like, you know, Facebook is the editor and I want
my story to run on A1.
Yeah, I think you're going to see more and more of this type of content
pushed into groups, is my guess.
I think they're going to be stricter around the news feed.
I think we already see this, right?
Like Facebook 2.0 is Instagram, right?
I mean, if you go on your Instagram feed, it's much more curated by you rather than
the Facebook algorithm because it's...
Oh, but that's about to change.
Oh, yeah?
How so?
Yeah.
What do you mean?
Well, they're looking at TikTok and are quite
scared of what TikTok is going to do to them, and are now experimenting with putting posts
from people you don't follow into your feed, and then more full-screen video.
It's working very well for TikTok to not use that follow signal as much as Instagram does.
Yeah, it's interesting.
I mean, I think this is such a great illustration of how, you know, we tend to legislatively
fight yesterday's war, right?
We're all worried about Instagram and Facebook.
And meanwhile, here comes TikTok.
Oh, I think, yeah, TikTok is going to just, I mean, crush them.
I mean, this is language that you'll appreciate:
it's a time-spent game in social media, and more and more people are spending time on TikTok,
and that's a zero-sum situation for Facebook and Instagram.
Yeah, I mean, that is a huge competitive threat.
Yeah.
So, Nir, what got you interested in this type of stuff?
I mean, what about your life made you want to start thinking about the science of
hooking people on apps, not addicting them, to use your language, and then getting them unhooked?
Let's see.
So how far back do we want to go here?
So I was...
I want you to... let's go as far back as we can.
Okay.
Let's go back to my early childhood where I was clinically obese.
And I think that had a big impact on me that I felt like at one point in my life that food controlled me.
And I mean, I was pretty big. My parents took me to fat camp, and I remember going to the doctor's office beforehand and the doctor saying, okay, here's normal weight, here's overweight, and here's you, you're over here on the chart, way in that red zone. And it was a big problem for me, it was a big struggle for me, and it was, you know, working through why food seemed to control me. And if I'm really honest, I became fascinated by how
that was the case. Like, how did food seem to have such a huge control over me? And it wasn't until
I realized that I had to stop blaming McDonald's and stop blaming Dunkin' Donuts that really I wasn't
eating because food was delicious. I wasn't eating because I was hungry. I was eating because
there was something inside of me that I was trying to escape emotionally. And it wasn't until I made
that revelation, sorry, that I could do something about it. And I think that was an incredibly
empowering turning point for me, but I was always fascinated by the marketing tricks of the trade,
how they did that to me, and then how I finally kind of broke out of that. And weight is still
something I struggle with. Thankfully, now at 43...
You look good, man.
Thanks, thanks. I'm now finally
in the best shape of my life, but it's been a long, long road to get here. And I still struggle
with that. I still have to keep on top of it. But I think that's probably where my fascination
with this started. What were we trying to get away from?
Oof, how long have you got? Um, look, you know, I think, um, whether it was, um, you know, eating because I was
lonely, eating because I was bored, eating because I felt ashamed at how much I was eating, um, all of
these things would send me down this spiral that, you know, now doing research into this field,
this is pretty common, right? You know, there's a big difference between
overweight and obesity, and where we really find the deeper endemic issues is where we
find the root cause of the problem.
And so that sounds like there's a parallel between breaking those habits and then breaking the habits online.
Absolutely. So the most important part, I think...
Now we get to how we can unaddict people from Twitter. Yeah, I might be talking about myself, I don't...
We should make you the case study here to figure out how to help you with this.
But I think the most important aspect of becoming indistractable is understanding the internal triggers.
We know the studies find that only 10% of the time that people check their phones
do they check their phone because of what we call an external trigger.
So an external trigger is a ping, a ding, a ring, anything in your outside environment
that prompts you to take an action.
But that's only 10% of the time that we get distracted, only 10
percent of the time that we check our phones because of those external triggers. So what's the other
90 percent? The other 90 percent of the time are what we call the internal triggers. Internal triggers
are uncomfortable emotional states that we seek to escape from. So boredom, loneliness,
fatigue, uncertainty, stress, anxiety, this is 90 percent of the time that we get distracted.
We are looking to escape an uncomfortable sensation. So that has to be the first step, that whether
it's too much food, too much news, too much football, too much Facebook, it doesn't
matter. We will always look for a distraction, one thing or another, unless we understand what we
are looking to escape from.
Yeah. So I guess this is a nice question to end on, or a nice sort of theme to
end on. So many people who try dieting don't succeed, and then they end up, you know, back on the
horse, you know, bag of Cheetos on the couch. Yeah. So, um, and then, oh, we all have them. Talking
from personal experience, I could take out the big tub of party snacks to attest; it's nearly empty after getting it this
weekend.
But so I do wonder whether there's really hope for us to unhook ourselves or, uh, undistract
ourselves from these devices, given that we are so... if it's the same sort of triggers as food
and doing, you know, a lot of stuff on the phone, do we have hope to get out, get away from
that?
I think we do.
And really become, yeah, skinny users of the telephone, or whatever you want to call it.
Yeah, I think we absolutely do.
And I think you're a test case for that.
You know, when you said, hey, you know, I found that I was using TikTok and it was taking
up too much time in my life and it wasn't serving me, I uninstalled it.
And this is a very common truth that I uncover many times with tech critics.
You'll hear them saying something like, oh, the average person can't do that.
Well, what about you?
How is it going with you?
Oh, yeah, yeah, I can get control of it, but everybody else can't, right?
Like the masses can't.
And I don't think that's true.
I think it's belittling. In fact, if we believe we can't, right, if we tell people
they're addicted, if we tell people it's hijacking their brain, if we tell them there's
nothing they can do about it. Guess what they do about it? Nothing. And so this is called learned
helplessness, right? That when we teach people such lies, it becomes true. And so that's why I
think this narrative is so dangerous to tell people that they're powerless, that they're addicted
and their brains are being hijacked. That's just not true. So it starts with taking action,
with believing, first of all, that you can do something about it. I mean, lots of people do
lose weight. Lots of people do moderate their use of technology. And so it starts by looking at
these four key strategies. I spent five years researching why we get distracted.
Why do we do things against our better interest?
And by the way, not a new problem.
Plato was talking about this 2,500 years ago, right?
2,500 years before the Internet and the iPhone, people were struggling with distraction.
It's something that has been part of the human condition.
And so we start with, number one, mastering those internal triggers.
That's step one.
Having tools that we can use to know what to do when we feel discomfort, when we feel
hunger, loneliness, uncertainty, fatigue, stress, anxiety, do we look to escape these uncomfortable
sensations with a distraction, with another drink, with flipping on the TV, with scrolling
something, with eating something? Or do we use that discomfort to help move us forward in our
goals, to move us towards traction rather than distraction? So that's step number one, mastering
the internal triggers. Step number two is making time for traction. The vast majority of people,
when they say, oh, I got distracted, I didn't do what I said I was going to do, look at my to-do list,
it's a mile long, everything's so distracting,
I turned on the news, Donald Trump, this, that...
but when I say, okay, what did you plan to do with your time?
Show me your calendar.
It's blank.
You can't call something a distraction unless you know what it distracted you from.
So there's this simple technique that psychologists have studied in thousands of peer-reviewed articles now.
We know that it's called setting an implementation intention, which is just a fancy way of saying,
planning out what you're going to do and when you're going to do it.
So simple, very few people do it, but you cannot say you got distracted unless you know what you got
distracted from. So that's a big technique. And I show you how to do that according to your values,
synchronizing your schedule with stakeholders in your life, your boss, your kids, your family,
very, very important. The third step is the nuts and bolts of hacking back the external
triggers. So this is where we go step by step through email, social media, meetings,
our kids, all these things can be sources of external distractions. Good news is we can hack
back. And I use that term hack back because, you know, we all know that these companies are hacking
our attention, right? To hack means to gain unauthorized access to something. No doubt that these
tech companies, as well as media companies, do this.
I thought they're not hooking into our brain stem.
Well, they are trying to gain unauthorized access. We know that. Every advertiser,
every media company, they're all trying to do this, right? That's their business model. I never said
they weren't. Of course they are. But that doesn't mean we can't hack back. And in fact,
there are all kinds of technologies we can use to change their technology. So like, for example,
when I use YouTube, I have a Chrome extension called DF YouTube,
totally free, doesn't cost a dime,
DF stands for distraction-free,
that scrubs out all those videos you see on the right-hand side
that are there to get you to keep watching and watching and watching.
It disables autoplay, for example.
So there are all these ways that we can hack their technology,
and they can't do anything about it,
all free tools that we can use.
We just have to use them.
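To make that concrete, here is a minimal sketch of what a "hack back" browser tool in the spirit of DF YouTube does, written as a TypeScript content script. This is not the actual extension, and the element selectors (the #related recommendation sidebar, the player's autoplay toggle) are assumptions about YouTube's current markup that may change.

// Hypothetical content-script sketch, not the real DF YouTube extension.
// The selectors below are assumptions about YouTube's markup and may break.
function hideDistractions(): void {
  // Hide the "Up next" sidebar of recommended videos.
  const sidebar = document.querySelector<HTMLElement>("#related");
  if (sidebar) sidebar.style.display = "none";

  // If the player's autoplay toggle is switched on, click it off.
  const autoplay = document.querySelector<HTMLElement>(
    ".ytp-autonav-toggle-button[aria-checked='true']"
  );
  if (autoplay) autoplay.click();
}

// YouTube is a single-page app, so re-apply whenever the page re-renders.
new MutationObserver(hideDistractions).observe(document.body, {
  childList: true,
  subtree: true,
});
hideDistractions();

The pattern is the same across most of these free distraction-blocking extensions: find the element that pulls you back in and remove or disable it.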
And then finally, the last step is to prevent distraction with pacts,
where we make some kind of pre-commitment as a firewall against distraction.
So it's really about these four techniques
that anyone can use to become indistractable.
You know, oftentimes we have discussions about the system and its responsibilities.
And then sometimes we talk about personal responsibility and our responsibilities.
And it's never going to be entirely one or the other.
You know, the system will do what it does, and we'll do what we can.
And, you know, our society is not going to be able to get anywhere with just one.
It takes both.
What I like about you is that you wrote the book that
taught the system, but also the book that teaches our responsibility. And so, you know, at least,
yeah, anyway, like I said, I appreciate the fact that you wrote Indistractable. It's a good book.
And I do think, you know, your core message is so important, which is that we can criticize
tech all we want, but if we don't think about ways that we can take this stuff into our own hands,
we're going to be in trouble. Yeah. I appreciate you doing that. Yeah. I think, you know,
just to build on that, you know, if you hold your breath and you wait for
Washington to fix the problem, you wait for the tech companies to fix the problem, you're
going to suffocate, right? Why would we wait? Why is the only discussion we have
about big bad tech and what they're doing to us? Why don't we also talk about, wait a minute,
how about turning off notifications, right? Like, what simple things can any of us do?
But we never talk about that stuff because it's just no fun.
I want it to be, of course. And...
But it's also that when you talk about what the tech companies can do,
they can have a greater impact at scale by making some changes.
Same thing with the food companies, by the way, since we're on that topic.
Yeah.
You know, they engineered foods to a bliss point where you pack as much sugar and salt and fat into it
to make people feel good, but not feel kind of gross inside.
And that bliss point will get you, you know, eating more and more and more.
And so we should definitely focus on, like, the personal weight loss journey and, like, the secrets for the individual to be able to fight back against that.
But if we start to, you know, if we spend time on criticizing, you know, the food companies or the tech companies, that's not a waste of time either because you can end up helping lots of people at scale if you, you know, try to bring some light to the negative things that they're doing to try to get us hooked to their products or their food.
And ultimately, both discussions are worth having, individual and the system.
I would agree.
And there's some, you know, there's lots of problems with tech, whether it's, you know,
their monopoly status, perhaps, or content moderation, lots of stuff.
When it comes to this particular challenge, I mean, where my area of expertise is, on
whether products are addictive, whether they're too engaging, I think I can
pretty conclusively say that this is not something that the companies are going to solve for
us, or that we want them to solve for us, because, again, they are designed to be engaging. That's
why we buy them. We want them to be interesting, right? Netflix is never going to start making
crappy shows that are not entertaining, right?
Okay, but you did produce a solution that they can solve for.
Sure, I don't... for the addicted people, yeah.
Right. I know, but it's also, like,
let's... I think that there should be a greater realization of these techniques that they're using
to hook people.
And personally, you know, I don't know, stand on the merits.
Don't do these tricks to, you know, play with people's psychology in order to...
I don't think it's tricks.
You know, variable rewards.
Yeah.
That's not a trick.
You know, what makes romance romantic is variable rewards.
What makes a book fun to read is variable rewards.
Sports, spectator sports.
You want to talk about addiction.
You want to talk about going to a bar during a big game and try and turn off the TV.
You will not leave with
your life.
Okay, right. But the example that you give when you try to talk to people about why they
should put this in their apps isn't about, you know, sports games. Maybe you do talk about sports a little
bit. It's about a pigeon that can't stop hitting this bar in order to get food. And I don't really
like the idea. I mean, if you want to do that to a lab animal, fine, but I don't really like the
idea of productizing that inside an app. Otherwise, I don't know, maybe you wouldn't...
maybe the book wouldn't be such a necessary read, this new book wouldn't be such a
necessary read, if we had designers that actually focused on making stuff interesting versus
turning us into those pigeons going after the food.
I think that's what makes them interesting.
I mean, that is why we read the books, that is why we watch the TV shows, that is why we watch
the sports game. It's all about variable rewards. And this isn't, you know, it's not
a trick.
It is, you know... when something is variable,
mysterious, that's what makes it interesting.
And that can be a very good thing too.
That doesn't mean it's nefarious per se.
You don't see any difference, though,
between, like, a book with a short chapter in the middle,
or a game, a baseball game,
where, like, your team gets, like, nine runs
in an inning, and an app developer,
you know, tracking all of the different data
on how you use it and optimizing, through this variable reward model, to get
you, you know, using the app at
the, you know, maximum amount of engagement and maximum amount of time?
Do I see a difference?
Isn't there a bit of a difference where, like, you know, the
techniques of baseball, the rules of baseball, for instance, are very different from a
casino, very different from an app using these variable rewards as incentives in order to get people,
you know, coming back, and different from what the person does to the pigeon?
You think that when you watch a sports game on TV, that's not hyper-optimized for all kinds
of crazy psychology around identity, around tribal mechanics, around, you know, all the
investment that people make in their teams. I mean, if an alien came down to Earth and watched
this crazy behavior we have of rooting for people we've never met just because they wear our team
colors, and the money we spend, the obsession people have... I mean, I think this is part
of my beef, that we only see the new stuff as dangerous.
The stuff that's been around before we were born, oh, that's commonplace, nothing to
be worried about.
You think people don't spend way too much money on sports memorabilia and all the mumbo
jumbo and crap that is associated with that kind of stuff?
Of course they do.
And look, here's the crazy part.
Somehow when it's my obsession, that's okay, right?
Telling my family to go to hell because I have to turn on my television so that I can
watch a sports game live, and I don't care because I got to watch my team, that's okay.
But, oh, I want to play a video game. Oh, that's evil. That's rotting your brain. Come on.
It's a pastime. And there's nothing wrong with either of them. It's about to what degree.
It's as you say, it's about case by case. If you ignore your family and don't have your
responsibilities to them because you're obsessed with watching sports, hey, maybe that's a problem.
Same would go for social media or a video game. It's not necessarily about the behavior itself.
It's about the consequences of that behavior.
Yeah, of course.
But I also, like, I don't know, I think that the "everyone does it" defense is a little bit wanting, you know, from my perspective.
And I think there is a difference between, you know, watching a baseball game and, you know, being drawn back to Robin Hood day after day to day trade when it's clear that, you know, the vast majority of day traders end up losing.
Yeah, I agree.
I mean, day trading wasn't invented by Robin Hood.
day trading has been around for a very, very long time.
It's a wonderful way for most people to lose money.
This is the whole point: when you use these techniques to make it more addicting,
that's when I see the issue.
So yeah, we'll go case by case.
It's exactly case by case.
So when Robin Hood, so Robin Hood actually approached me to work with them.
And I backed off because I think some of their tactics were not ethical.
So they did read your book.
They did read my book.
That's how they found you.
Absolutely.
In fact, one of the founders approached me in a bar back when it wasn't called Robin Hood.
It was called Cash Cat,
or Fat Cats, something like that, before it was called Robin Hood.
And he read my blog even before my book was published.
And I met with those guys for sure.
And I was always kind of a little uncomfortable with whether they would do the right thing.
And look, they've pushed the boundaries and they've done too much.
And then regulators have come in and said, yeah, you can't do these things because it's, it's unethical.
There was a user who killed himself after he, you know, thought that he had lost hundreds of thousands of dollars.
So obviously, I mean, like I said before with the COVID thing and why tech is good,
you can't take one example and say that's why Robin Hood is bad.
But it does seem like these techniques can be used for evil and/or for, you know, nefarious purposes.
Let's not go all the way to evil.
Totally.
Every tool can be abused, misused as well.
For sure.
Well, Nir, thank you so much for joining.
It was a great conversation.
As always, great talking with you.
I hope you'll come back.
What's the next book going to be?
Oh, good question.
I don't know quite yet.
It'll be a few years probably.
All right.
Sounds good.
Well, looking forward to having you on when that comes out and hopefully when you're back in New York,
you and I can get together and keep up the discussion.
That sounds great.
Thanks so much, Alex.
All right.
Thanks for coming on.
The books are indistractable and hooked.
If you haven't read them yet, go ahead and grab them and see what we've been talking about
and see whose side you fall on.
Although I think there's a lot more common ground between you and me
than maybe I let on over the course of this conversation.
But exciting nonetheless.
Well, thanks so much to Nate Gwattany, our editor, and to Red Circle for hosting and selling the ads.
And most importantly, to you, the listener; the show would be nothing without you.
And I appreciate it.
We'll include some variable incentives so that you'll be coming back more and more.
All right.
We'll see you next time on Big Technology Podcast.
That'll do it for us here on the show this week.
Take care.
Thank you.