The Tim Ferriss Show - #387: Tristan Harris — Fighting Skynet and Firewalling Attention
Episode Date: September 19, 2019

"Big Brother isn't watching. He's singing and dancing. He's pulling rabbits out of a hat. Big Brother's busy holding your attention every moment you're awake. He's making sure you're always... distracted. He's making sure you're fully absorbed. He's making sure your imagination withers. Until it's as useful as your appendix." — Chuck Palahniuk

Tristan Harris (@tristanharris) was named by Rolling Stone as one of the "25 People Shaping the World." He was featured in Fortune's 2018 "40 under 40" list for his work on reforming technology, and the Atlantic has called him the "closest thing Silicon Valley has to a conscience." Formerly Design Ethicist at Google, he is a world-renowned expert on how technology steers our decisions. Tristan has spent nearly his entire life studying subtle psychological forces, from early beginnings as a childhood magician, to working with the Stanford Persuasive Technology Lab, to his role as CEO of Apture, which was acquired by Google.

Tristan has briefed heads of state, technology company CEOs, and members of the US Congress about the attention economy, and he's been featured in media worldwide, including 60 Minutes, PBS NewsHour, and many more. He is the co-founder of the Center for Humane Technology, which can be found at Humanetech.com, and cohost (with Aza Raskin) of the Your Undivided Attention podcast, which exposes the hidden designs that have the power to hijack our attention, manipulate our choices, and destabilize our real-world communities.

This episode is brought to you by MeUndies, which brings you the softest undies known to man. Whether you like crazy prints or classic black, MeUndies gives you the freedom to express yourself comfortably. They're made with soft, sustainable fabric, and they're available in sizes from XS to 4XL. MeUndies has plenty of options for those looking to up their undie game. You can join its monthly membership. You can build a pack of your favorite undies or socks and save up to 30 percent. You can match with your better half if you're into that sort of thing. New prints drop every Tuesday, and members get access to exclusive prints every month. MeUndies has a great offer for listeners of this podcast: Get 15 percent off your first order with free shipping and a 100 percent satisfaction guarantee at MeUndies.com/Tim.

This episode is also brought to you by LinkedIn Jobs. The right hire can move your business quantum leaps forward, while the wrong hire can crater it. Luckily, you can rely on LinkedIn Jobs to find you the most relevant, qualified candidates so you can focus on making a hire you're excited about. With five hundred million active members, LinkedIn attracts people every day who want to make connections, grow their careers, and discover new job opportunities. Note that 90 percent of LinkedIn users are open to new opportunities, but not actively scanning job boards. Post a job today at LinkedIn.com/Tim and get $50 off your first job post!

***

If you enjoy the podcast, would you please consider leaving a short review on Apple Podcasts/iTunes? It takes less than 60 seconds, and it really makes a difference in helping to convince hard-to-get guests.
I also love reading the reviews!

For show notes and past guests, please visit tim.blog/podcast.
Sign up for Tim's email newsletter ("5-Bullet Friday") at tim.blog/friday.
For transcripts of episodes, go to tim.blog/transcripts.
Discover Tim's books: tim.blog/books.

Follow Tim:
Twitter: twitter.com/tferriss
Instagram: instagram.com/timferriss
Facebook: facebook.com/timferriss
YouTube: youtube.com/timferriss

Past guests on The Tim Ferriss Show include Jerry Seinfeld, Hugh Jackman, Dr. Jane Goodall, LeBron James, Kevin Hart, Doris Kearns Goodwin, Jamie Foxx, Matthew McConaughey, Esther Perel, Elizabeth Gilbert, Terry Crews, Sia, Yuval Noah Harari, Malcolm Gladwell, Madeleine Albright, Cheryl Strayed, Jim Collins, Mary Karr, Maria Popova, Sam Harris, Michael Phelps, Bob Iger, Edward Norton, Arnold Schwarzenegger, Neil Strauss, Ken Burns, Maria Sharapova, Marc Andreessen, Neil Gaiman, Neil de Grasse Tyson, Jocko Willink, Daniel Ek, Kelly Slater, Dr. Peter Attia, Seth Godin, Howard Marks, Dr. Brené Brown, Eric Schmidt, Michael Lewis, Joe Gebbia, Michael Pollan, Dr. Jordan Peterson, Vince Vaughn, Brian Koppelman, Ramit Sethi, Dax Shepard, Tony Robbins, Jim Dethmer, Dan Harris, Ray Dalio, Naval Ravikant, Vitalik Buterin, Elizabeth Lesser, Amanda Palmer, Katie Haun, Sir Richard Branson, Chuck Palahniuk, Arianna Huffington, Reid Hoffman, Bill Burr, Whitney Cummings, Rick Rubin, Dr. Vivek Murthy, Darren Aronofsky, and many more.

See Privacy Policy at https://art19.com/privacy and California Privacy Notice at https://art19.com/privacy#do-not-sell-my-info.
Transcript
At this altitude, I can run flat out for a half mile before my hands start shaking.
Can I ask you a personal question?
Now would have seemed the perfect time.
What if I did the opposite?
I'm a cybernetic organism, living tissue over a metal endoskeleton.
The Tim Ferriss Show.
This episode is brought to you by MeUndies. MeUndies makes the softest undies known to man.
That's what the copy says. And they are soft. They're really soft. Whether you like
crazy prints or opt for classic black, MeUndies gives you the freedom to express yourself
comfortably. I wonder what expressing yourself, of course, within legal bounds means in this case,
but I do like to express myself. I'm wearing some tie-dye MeUndies right now as I record this.
In the room next to me, I've got some pizza and video game prints. Those are not on the same pair
of underwear, but two separate ones. And I'll be packing for a trip, and I have a nice stack
of MeUndies going with me. Why? Because they're comfortable. Very, very comfortable.
MeUndies has plenty of options for those looking to up their undie game. You can join the monthly membership. You can do build a pack, that is building a 3, 6, or 10 pack of your favorite
undies or socks and saving up to 30%. You can select a matching pair to match with your better
half or just pick out one pair that strikes your fancy. And there are some pretty ridiculous ones that I specialize in personally.
MeUndies are made with soft, sustainable fabric
and available in sizes from extra small to 4XL.
Fun new prints drop every Tuesday,
and members get access to exclusive prints every month.
MeUndies has a great offer for listeners of this podcast.
For any first-time purchase,
you get 15% off and free shipping. They also give you a 100% satisfaction guarantee. And I
like to be satisfied with my underwear. To get your 15% off your first order, free shipping,
and a 100% satisfaction guarantee, go to MeUndies.com slash Tim. That's MeUndies.com slash Tim.
This episode is brought to you by LinkedIn Jobs. Hiring can be hard, really hard. And it can also
be super, super expensive and painful if you get it wrong. I certainly have had that experience
firsthand multiple times, and I am not eager to repeat it. So I try to do as much vetting as possible on the front end. And today,
with more qualified candidates than ever, you need a solution. You need a platform that helps
you to find the right people for your business. LinkedIn Jobs does exactly that. More than 600
million users visit LinkedIn to learn, make connections, grow as professionals,
and more than ever, discover new job opportunities. In fact, overall, LinkedIn members add 15 new
skills to their profiles and apply to 35 job posts every two seconds. That's a crazy stat.
LinkedIn does the legwork to match you to your most qualified candidates so that you can focus
on the hiring process, getting the person into your company who will transform your business.
They make sure your job post gets in front of the people with the right hard skills
and soft skills to meet your requirements. They've made it as easy as possible.
So check it out. To get $50 off of your first job post, go to linkedin.com slash Tim. Again, that's linkedin.com slash Tim
to get $50 off of your first job post. Terms and conditions apply. But check it out, linkedin.com
slash Tim. Well, hello, boys and girls. This is Tim Ferriss, and welcome to another episode of The Tim Ferriss Show, where it is my job each and every episode to interview world-class performers or people who are exceptionally good at what they do, domain experts who can speak to the subtleties of their craft. His name is Tristan Harris, @tristanharris,
T-R-I-S-T-A-N, H-A-R-R-I-S, on Twitter. Rolling Stone has named him one of the 25 people shaping
the world. Tristan was featured in Fortune's 2018 40 Under 40 list for his work on reforming
technology, and The Atlantic has called him the closest thing Silicon
Valley has to a conscience. Formerly design ethicist at Google, he is a world-renowned
expert on how technology steers our decisions. Tristan has spent nearly his entire life studying
subtle psychological forces from early beginnings as a childhood magician. We talked quite a bit
about this and also his study of pickpocketing
and other fascinating domains that I'm very, very interested in, to working with the Stanford
Persuasive Technology Lab, and to his role as CEO of Apture, which was acquired by Google.
Tristan has briefed heads of state, technology company CEOs, and members of US Congress about
the attention economy, and he's been featured in media worldwide, including 60 Minutes, PBS NewsHour, and many more. He is the co-founder
of the Center for Humane Technology, which can be found at humanetech.com. Please enjoy,
without further ado, my wide-ranging conversation with Tristan Harris.
Tristan, welcome to the show.
Thanks for having me, Tim.
I am thrilled to finally have you on the line to have a wide-ranging conversation because we have
many mutual friends and many of my listeners have requested you on the show. And I thought that perhaps a good place to start
would be beautiful Bali, Bali, Indonesia. So, I have in my notes here a bullet that references
a retreat in Bali, that's in Indonesia for folks who are curious about Bali, on hypnosis, pickpocketing, and magic. So let's, let's dig into that. Why,
why go to such a thing? And what did you learn? And did you accidentally sign any powers of
attorney or walk out empty-pocketed?
It was actually one of the best life choices that I think I've
ever made. You know, as a kid, I was a magician early on
and got interested in reading, just siphoned up all of this information from books and things
like that. But then it wasn't really a thing that was going active in my life as an adult.
But then about, I don't know, sometime in 2016, I saw this, I was part of this newsletter,
I think his name is James Brown.
He's a hypnotist based in the UK.
And he said he was going to run a workshop on hypnosis, pickpocketing, and magic in Bali.
And I just thought, this is too good to pass up.
It was about the one week of vacation I had in the year.
And I ended up going out there.
And it was something like me and eight or nine magicians.
I was probably the most amateur.
And it was so much fun because every night you just have these magicians going out on the town.
Like we go to a bar somewhere in Bali.
And they would just clean up, not clean up in the sense of their money in their wallets,
but in the sense of just having fun with people.
And you would just watch these guys play with people's attention. You know, they didn't know what they were up against. And it wasn't pickpocketing in an adversarial sense, like, let me get all your
money. It was done in the... Putting money into pockets.
Yeah, yeah. It was done in the guise of, hey, I'm a magician, this is what I do. But, you know,
you want to have some fun. But it's just really
fascinating, especially pickpocketing. I don't know if you know Apollo Robbins.
I don't, but I'm definitely going to look this person up to learn more.
He's also one of the world's most famous pickpockets. He's done a TED Talk. He actually helped
me with my TED Talk when I was there. He actually worked and collaborated with a bunch of neuroscientists on essentially the limits of attention and stuff that he had picked up just by doing it, but then is later now being confirmed by neuroscience.
And that's what I find fascinating about magic and pickpocketing: they were kind of the first applied psychologists.
I mean, they've been doing this for hundreds of years, right?
And I just love that our, you know, our science is catching up to what the practitioners have known how to do for a long time.
And I'd love for you to perhaps talk to some of the techniques or principles behind good magic or pickpocketing.
And I'm sure there are and we'll have a chance to explore parallels
in other places. But for instance, speaking from my own personal experience, I, about a month ago,
had a chance to go to the Magic Castle in Los Angeles for the first time. And the recommendation
from the member who brought us in was to go to the closeup room,
the closeup magic room.
And it seats somewhere between 12 and 20 people.
It's a very small room.
And there's a table right in front of you.
It was about five feet from me.
And after the performance that we saw,
which was truly staggering. I mean, it was just world class in terms of sleight of hand. A number of friends who were with me, one a very high-end musician, the other a very successful entrepreneur, and then a number of other folks walked out and said, I have to question everything in my reality because of what could be done right in front of you. I mean, literally right in front of you.
Right.
Are there any particular sort of techniques or principles
that stand out for you in the realm of magic,
pickpocketing, hypnosis, or other
in terms of these practitioner arts?
I mean, the punchline is it's really about the limits of attention in all of the cases, right?
I mean, I think the other thing you're also getting at is you had some pretty successful people by your side, it sounds like.
I mean, business people, entrepreneurs.
The thing about magic that I always found most interesting is that it has nothing to do with the level.
Like your level of being inoculated from the effect has nothing to do with your intelligence.
Right.
Right.
It's like, which is so fascinating, right?
Because you could have the most successful business person or, you know, off the charts prodigy in mathematics or something like that.
But it has nothing to do with the extent to which they can be fooled in a close-up, you know, experience or pickpocketing. And the fact that they are living in separate, different domains, that those are two different areas of skill or inoculation, I found fascinating.
Because I think it says so much about what magic is doing. It's not about intelligence. It's about
something more subtle and about the weaknesses or the limits or the blind spots or the biases
that we're all trapped inside of.
You know, I always say it's like we're trapped in a mind-body meat suit that has a set of, you know, bindings and bendings to how we see the world
that you don't know that you're living inside of that corrective tissue that happens to bend attention in that way.
I mean, so misdirection is the core principle. You look over here and you think you're going to catch the magician doing it, because you're looking where you would think he doesn't want you to look, but it probably happened three steps ago. And usually there's a setup, so sometimes the actual trick has happened, you know, at the very beginning, and then there are layers upon layers being built, and the magician's usually working two or three steps ahead. I wish I could give more, you know, concrete examples, but the magician's code is you don't give this stuff up to the public. The funny thing about magic, of course, is it actually is all public. You just have to buy a book, but it turns out people don't read books. And so it remains a
secret. Um, but you know, I think that in pickpocketing, what was fascinating is it's,
you know, people think, Oh, do you just, you know, you grab it when I'm not looking. And it's not
like that at all. I mean, as a pickpocket, the person will stand right up next to you. They'll
look at you. They'll talk to you. You're just having a conversation. They will look down at your left pocket
and they'll tap it and they'll say, oh, so what's in that pocket over there? And then you'll pull
out, you know, keys and a wallet and you look at it and say, oh, okay, that's interesting. What's
in that wallet? And they'll, you know, you're right there with them as they're doing this.
And then it's in this other moment where they say, well, look at what's happening in the other pocket. And they'll turn around and walk around you, and there's all this mischief that starts to happen in those moments in between. But what's interesting about pickpocketing is the way people on the outside, the public, tend to think about it, is they just kind of grab it when you're not looking. But what's fascinating is it actually happens right underneath your nose. So I just love it. It's so amazing to watch these guys work.
Now, you were very recently...
Is testifying the right verb here, or presenting?
Yeah, I was the lead witness in a Senate hearing on persuasive technology.
And if I'm remembering correctly, and feel free to fact-check this,
but you talked about the magician's pick a card, pick a card, any card, or alluded to that. It might have been in your TED Talk.
It was in one of the two.
But we'll come back to Washington, DC. When you mentioned that, and I'm going to try to tie two things together here, it made me think of the, you know, if you control the menu, you control the choices, which is one of the hijacks you talk about. Can you describe this?
Can you elaborate on that? Yeah, well, I mean, we tend to live in, you know, we're in the United
States, and we tend to live in a libertarian culture that's all about celebrating and
protecting the freedom to make our own choices.
But at a very, very, very deep level, we're also not taught to question who controlled the menu of choices that we are picking from. This also occurs, I think, at a deep spiritual or identity level. You can make any choice you want, but you don't see the invisible constraints on how you are seeing the world, in such a way that you're only picking from the five habitual things that show up in your mind on a daily basis.
But in magic, the principle is just – and it's actually more nuanced than this.
It's funny, Derren Brown, the famous mentalist, I was emailing with him the other day, and he was saying, I could probably teach you some things to update your view that this is the most important principle in magic. But
if you control the menu and the order of options as they're presented and the emphasis as they're
presented, you can, I mean, I wish I could do a demo here, but I'm not a good enough magician to
do it live. You can make it seem as if someone has whittled down from the entire deck of cards down to one,
from 52 cards down to one.
And it seems as if they've made their own free choice along the way in four distinct choice moments.
But in fact, you know exactly what card you wanted them to get to all along.
And the kinds of questions you can ask people shape the outcome, the sequencing of the questions, the meaning-making of it.
It's hard to do this without actually giving people the whole techniques,
but I think this is something that is really important to understand,
whether it's in the way that technology presents menus to us
or the way that society or culture does.
Any way you choose, you're still choosing within a menu that has
other people's interests behind it.
You mentioned invisible constraints. So the assumptions that we may not be aware that
we're making or the box that we've created for ourselves in some fashion or adopted from our environment or our parents or other places.
Are there any particular sort of tools or mental models or anything at all that you use to try to identify the invisible constraints in your life?
Yeah, I mean, it's a great question. I mean, fundamentally, I feel like the process of
waking up or awakening is to try to see assumptions that we're making or, you know,
guiding principles in our choices that, you know, are we even asking the right questions? Like,
so let's say right after this interview, you know, you get outside and you could go in any direction.
Like, what is the, just think about that, that moment. So I'll leave
this, this podcast studio and you'll leave your, your house. And then what comes up into your mind
about where to go? Right? I mean, it's usually a set of habits. Maybe it's like, oh, what do I need
to do? Let me refer to my to-do list. What is it? You know, which cafe do I want to go to for an iced
coffee to run away from that anxiety that I was feeling because I don't know what to do with
myself. There's this limited, we're kind of creatures of habits, right? And so,
especially when we're inside of embedded environments that we've been in for long
periods of time, we tend to play out the same patterns over and over again. This is where the,
you know, this is both a kind of a new age throwaway statement and also a real one,
which is, you know, wherever you are, what is it called again? I think it's wherever you go, there you are.
Yes, something like that.
The Jon Kabat-Zinn book title.
Jon Kabat-Zinn.
Yeah.
Yeah, exactly.
Jon is also a friend.
And, you know, the point is that we repeat our same mental habits everywhere we go.
So, you know, in so many ways that are often invisible to us, we don't notice the consistency of a structure to the way our minds happen to process information or the way that we think about what to do with our time or the way that we value things or we sort things.
All those processes that are sitting inside of us, happening all the time, are often invisible and not available for introspection.
They basically run our whole life,
which is why they say,
like, oh, maybe I'll go on a meditation retreat,
or maybe I'll go find myself in Bali.
But then, you know, you find that, you know,
and as I know from your meditation experience
with, you know, the monkey mind,
it's like we just have these recurring processes
that follow us everywhere.
And I think if you can't see them,
then they run your life.
And then we're kind of like automatons. We're robots that are living according to the previous set of constraints.
And the extent to which we have choice is the extent to which we see those patterns. And as
far as techniques to see them, I mean, I think that's tricky. Have you ever done the work of Byron Katie? I have. I find a number of her one-sheets,
sort of these one-page worksheet prompts
to be very helpful.
It takes a little getting used to.
It can seem very strange and nonsensical at first,
but I think if you're willing to force yourself to
do the thought exercise of contorting the beliefs, you know, these statements that you take as true, it's super valuable.
Could you describe, if you've done it, how you've done it?
It sounds super abstract for those people
who haven't seen it, but I mean, she's basically just come up with this set of four questions you can ask of
any moment in your life that causes stress.
Because usually what's happening is you are creating that stress for your own mind and
you just can't see it yet.
So I kind of think about it to link it to the magic metaphor that our brains are living
inside of this 24-7 magic trick, which is that whatever thought pops into our mind, we believe it.
We don't not believe it.
We automatically step into it and we see the world through that thought, through the assumptions of that thought.
And essentially what her four questions do is they let you see the exact opposite of that belief, which in turn makes you not take your beliefs and your thoughts so seriously. And it's a great parallel with meditation,
but essentially something like, um, I don't know, for example, you're driving and there you are.
And then some guy in a red Corvette like cuts you off and you're like, I don't know, something like
that guy is an asshole or something like that. Right. And you are convinced of it.
Every bone in your body, every bit of your nervous system,
just you know for sure this guy is impatient, he's inconsiderate.
All of these thoughts just rush into your mind,
and you have utter certainty about your experience
and who this other person is, right?
Let alone the fact you don't know if this person
is rushing to go get their wife who's at the hospital because something's wrong. I mean,
you don't know, right? So the four questions are, okay, that person, you know, that guy is
inconsiderate. The first question is, is that true? That guy is inconsiderate. And you have to like
pause and sit there, you know, there you are in the car,
looking at this person and say, that guy's inconsiderate. Is that true? Okay. Second
question is usually to reinforce that and loosen up maybe the beliefs a bit, which is, can you be
absolutely sure that it's true, that that guy is inconsiderate? And you realize, no, I can't.
In fact, I just thought that the moment that he, you know, stepped in or ran in
front of me. Okay, so then we get to the third question, which is, okay, what happens? What do I do? How do I react? What images come to mind? How do I feel? How do I relate to the world? How do I relate to him when I believe the thought that guy is inconsiderate? What happens? And
the answer would be something like, I see him as, you know, naive. I see him as thoughtless. I don't care about him. I want him, you know, removed off the face of the earth. I want that car out of my way. I get angry. My body, I feel, you know, all these things, right? You're trying to basically list the ecology of just what that one belief, in that one moment, of that guy is inconsiderate does to your whole nervous system. So it's like
a full body scan, kind of full belief scan of what that does. And you sort of see, oh my God,
just by believing that one thought, it's totally transformed my entire experience in that moment
with reality. I am now seeing reality in a totally different way, and usually in a more distorted, disconnected, not centered, not calm, not connected way. And the fourth question is,
once you realize the kind of absurdity of that ecology of beliefs, is who would I be in that
moment without the thought that that guy is inconsiderate? And so there he is, he crosses, you know, he cuts over right in front of me, but without the thought that guy is inconsiderate. Maybe it's something like I have curiosity about what happened. Why did he do that? You know, whatever, you let yourself get that ecology. And then the last step is to list the opposites of the belief. So instead of that guy is inconsiderate, one opposite is that guy is very considerate,
or he is considerate. And you try to find evidence: is there any way in which that could be true? And you know, in that moment prior to doing this process, you were convinced that this guy just was absolutely inconsiderate. But after you've done those four questions, you think, is there any evidence for him being considerate? Well, what if he's on the way to the hospital to meet his wife, who's in labor? And you realize that he
could be the most considerate person in that way. Or another opposite to he's inconsiderate could be
I am inconsiderate. And the evidence there would be that I'm inconsiderate of the fact that I don't know the ecology of this other person's life.
And I rapidly jump to conclusions.
So what this process does, and I feel like I didn't mean to go through it for so long, but it shows you something fundamental about the ways that our minds trap us in almost like a permanent, fixed set of glasses that temporarily occupy the
way that we see the world and make meaning. And when you see that, you just stop taking your
thoughts and your beliefs quite so seriously. And you realize that even in those moments when you're
stressed and you're convinced it's because the world really is, you know, doing that thing that
pisses you off, it lets you see maybe I'm actually doing this
for myself. And that also gains and increases responsibility, because that means that now
we're responsible for our own experience, as opposed to, you know, the world is constantly
terrorizing us with situations. Thank you very much for that overview.
That was really good. That was really good. I spent two days with Byron Katie in a small group. And for people who are listening, I will confess something that someone listening might also experience, which is when I was first given this exercise and did it as related to a few different situations, I had a lot of resistance.
Yeah, I did too.
Yeah, it struck me as this sort of semantic tail chasing or highly abstract. And when you dig into
it, if you give it a chance as a thought exercise, it can be incredibly valuable. I mean, some of the transformations
that I witnessed in the room with people who had longstanding beliefs about, say, a family member,
which were completely crippling, like had paralyzed a family situation,
was really remarkable. And you mentioned the four questions. Is it true? Can you absolutely
know it's true? How do you react? What happens when you believe that thought? And who would you
be without that thought? A couple of points that were really valuable to me or questions to ask,
as a subsection under how do you react, what happens when you believe that thought: one of the subsets of that, which Byron Katie has on the website (it's just thework.com, and you can find all this stuff for free), is, you know, do any obsessions or addictions begin to appear when you believe that thought? I think this is a really, really good one and really important.
You mentioned leading into this, you know,
do you go to the coffee shop to drink a coffee because you're overwhelmed or
worried about not knowing what to do, right?
And then that likely triggers a whole new set of physical sensations,
which trigger a whole set of sort of emotional and thought responses, which you might
blame on the circumstances of two hours before. But in fact, you just took down 200 milligrams
of caffeine in four minutes, right? So, it's like fractal levels of running away from anxiety. You
know, it's like running away from anxiety creates an experience that that's an addiction that then
creates more anxiety that we then run away from.
And we spend our whole day clicking between Facebook and email, Facebook and email.
And then you're like, where did my day go?
Exactly.
And the last thing for now that I'd like to say about this, because I'm really glad you brought it up, is that the portion of creating the opposites is where I had the most resistance. And then, for instance, that person is very considerate, or I am inconsiderate, and I had to come up with three hypothetical cases where
this could be true, whatever the permutation is, what would they be? And it's really powerful.
And I don't want to belabor the point, but I do encourage everybody to check it out and try it
out. I'm really glad you brought it up.
Yeah. I mean, I totally appreciate what you're saying, not to dwell so much on her work, even though it actually has been impactful for, I think, probably other things we may talk about. You just realize the way that the mind so quickly steps into some new belief with utter certainty, you know, and
just to your point, like, you know, when you find these opposites, like, well, maybe I'm not
considerate. Maybe that person is so considerate. It's like, no, sometimes that guy just is
inconsiderate. He actually just wasn't looking and he's not trying to rush to save his wife and
whatever else. I mean, there's definitely an argument you could make that he was being
inconsiderate. And it's not meant to deny facts about reality, about someone else's objective state. But I think what it does overall
is makes you realize that we live in utter certainty about a world that's highly uncertain,
and that whenever stress comes about through that process, we might be able to, you know, down-regulate a lot of that stress by just not taking our thoughts and beliefs quite so seriously. I mean, it's an amazing tool. And it relates to technology in a way, because I think technology is this sort of false belief factory. It just generates, you know, all of these false beliefs, moment by moment by moment. And I mean, the premise of her work and doing this process is so that you don't identify with your thoughts.
I mean, the fourth question she asks, which is, who would you be without the thought?
It's not what would happen instead if you didn't think he's inconsiderate.
It's who would you be?
So it's an identity level question.
And this actually is really important because when you're doing belief transformation work,
when you do identity level work, it's much more persuasive. I mean, if you want to link this to
the stuff I know about Russia's influence campaign in the 2016 elections, I mean, a lot of it was
identity level work. Like we are African Americans and Hillary doesn't care about us. That was the
message that Russia went after. It's because identity-level propaganda and, you know, identity politics, it's the deepest level of psychological influence work. Now, in the Byron Katie sense, she's doing it to try to empower people to overcome the ways that their brain lies to them and deceives them. In the other sense, it can be used obviously to manipulate people. But, you know, in studying, I don't know, have you done neuro-linguistic programming?
You know, I first read, I have not done any training, at least not directly, but beginning in high school, which is, I think, when Tony Robbins really put NLP on the main stage in some
respects, became fascinated by the prospect or the implications as described by Tony Robbins in
his first book of NLP. Could you
describe that for people who don't know what it is? Yeah, I mean, I'm not an expert, but I have
taken some workshops in it. I mean, neurolinguistic programming is essentially a study of how
language and thought and meaning are, you know, basically each of us have a map in our own brain
of how we see reality. We're not
actually directly in touch with the reality in front of us. We're living through this mediated
map that, you know, and based on word choices we use, it shapes the reality that we have.
It's used in hypnosis. It's actually the basis for Ericksonian hypnosis and, you know, how you,
what kind of language choices to make and how you can
deepen people's experience or alleviate people's experience. Like a simple example, just to make
it concrete is something like, you know, think of a person that you love and see their, see their
face in your, in your mind's eye and then turn up the colors. So like just make the colors more
vivid. Do you feel more of the love or do you feel less of the love when you turn up the colors?
How about if you bring the image even closer?
So bring it up way close right in front of you and turn up the colors.
And then just noticing that even as you do this, you get different kinds of feelings and experiences versus, for example, if you turn down the colors, you make it grayscale.
What if you make it small and move it very far away?
These are all ways of playing with human cognition and experience.
Anyway, when you do this kind of work, it's used in counseling, psychological counseling as well.
And when you work with people on a counseling level, if you can do identity level transformation work where, for example,
if you ask someone the phrase, are you an athlete? I mean, if I ask you, would you say you're an
athlete? When I'm not eating donuts and sitting all day, I would like to think of myself as an
athlete. I used to be an athlete would be my real answer. I'd say I used to be an athlete.
So is that like, when you sort of query your nervous system, if you say the phrase, I used to be an athlete, does that feel like the most accurate thing for you?
It does. Yeah. Because I think athlete, competitive athlete. So that's, that's,
I would say I used to be an athlete. I see. Right. So there you go. So that's your map,
right? It's like athlete for you means competitive athlete in some kind of professional sense,
which is interesting. I mean, a lot of people would probably answer that question. No. Right.
And yet a lot of people, I mean, I might answer that question, no. But, you know, do you exercise? You know, do you go to the gym? Do you, you know, I do boxing and some kickboxing stuff for fun,
just fitness classes. And, you know, I wouldn't put myself in the category of athlete, but just
notice that that's just, you know, whether I fall on the side of yes or no to that question has a
really big implication for how I see myself, right?
Definitely.
And it's totally arbitrary whether or not I call myself as part of the category of I am an athlete versus I'm not.
And what would make me an athlete?
What are the criteria?
Well, there we go.
Now I'm inspecting the map inside my brain that I've invisibly constructed some set of rules about when you officially qualify for being an athlete and when you don't.
And it's all artificial.
It's all arbitrary.
It's just coming from our own mind happening to organize these rules and obligations, which are self-constructed.
And it's through the NLP-type stuff or Byron Katie stuff that you can see that we're living in this fractal kind of hall of mirrors in our minds that makes us think or believe all these things that are just kind of distortions, self-constructed
out of invisible parts of our brain.
And waking up is the process by which we can shatter some of the glasses and see more clearly.
Yeah, and waking up, I mean, feel free to offer a counterpoint. It seems to me that waking up here is,
at least in part,
simply becoming aware of your habitual processes, right?
It's kind of like stepping out of the movie itself
in which you're the lead actor or actress
and stepping back into the audience
and watching,
becoming the observer of your own behavior.
And what you were saying
earlier about thoughts and beliefs and how much conviction we can have about a snap judgment
reminds me of something that B.J. Miller, who's a doctor and hospice care physician who's been
on the podcast who's helped something like a thousand people to die. His answer to the question I often ask, which is, what would you put
on a billboard, is, he actually got from a bumper sticker. So, I don't know the original attribution,
but it's, don't believe everything that you think. Which I liked a lot. And I think about language a lot, because when we're talking about language, to some extent, we're talking about labels.
And if we're talking about labels, we're talking about conceptual overlays that we're putting on top of our sensory input.
So it's really like how you're constructing reality. And when you look at something, we don't have to go into it in depth right now, but if people search for the 21-day no-complaint experiment, there's a, I want to say pastor, might be a reverend, Will Bowen, might be Bowen, B-O-W-E-N, who began doing an experiment with his congregation in which they would wear a rubber band or a wristband that was elastic. And they would,
they attempted to go 21 days without complaining.
And there are parameters which were mostly language-based for what
constituted a complaint.
And if you complained,
you had to switch wrists and start your 21-day clock over again.
And the effects on people who completed the 21 days, or even made it halfway, on quality of life, on their thinking, and the lens through which they looked at reality were so profound.
And if you really look at the nitty gritty of it, it's training and awareness of the statements in your mind and the statements that you use, just like Byron Katie's The Work, in a sense.
I mean, in essence, I mean, it's like, this is why, I don't want to switch into technology stuff, at least not yet.
But, you know, we say the attention economy is beneath all other economies, like the psychology.
Like, if you had a, you know, an amplifier or a voice, like an output for all of the thoughts running
through our heads, I mean, this is what constitutes our inner lives. This is the soundtrack. This is
the things that we're repeating invisibly. We don't even notice that we're repeating it because
it almost doesn't have audio. But immediately, I mean, I've done, I know you have done lots of
meditation. And on a seven-day meditation retreat I once did,
that's what I was most surprised by, was just how quickly these next thoughts would come up,
and how quickly I was tempted to believe them. And, you know, the whole, like, wherever you go,
there you are, like, the same patterns of thoughts would come up, like the same self-doubt or the
same self-criticism. You know, I don't want this to sound dull for
listeners because I know that when people describe these things from a distance, it doesn't
sound as interesting as profound. But to your point about language, you're just making me think,
I remember where I first encountered your work, Tim, which was, or at least it was one of the
early recommendations you made, I think in The 4-Hour Workweek, about The 22 Immutable Laws of Marketing.
Yes. And which also was a profound book for me.
And, you know, the example of marketing
is all about using language to manipulate perception
and the fact that your mind organizes information
in particular ways.
And I remember one core thing in that book
is just the way that our minds create
kind of ladders of, you know, in competition, like invisible categories, like safety. You know, what's the number one safest car in the world? And everyone says, Volvo. Great. What's the second
safest car in the world? And you realize your mind draws a blank. It's because your mind doesn't
even organize information past slot number one. And it's all based on the slots. You know, what's
the fastest car in the world? What's the safest car in the world?
And, you know, I think it's the same thing in our own lives that invisibly the way we construct, you know, am I an athlete or not?
I mean, these are the, it's just this, again, this sort of structure of identity, of belief, of meaning that makes up and constitutes, you know, our well-being, what choices we make, whether we dare to take those risks, whether we dare to jump off a cliff, whether we look at the world's problems in the face.
I think the psychology is everything and it doesn't seem important if you haven't looked
inside, which is also fascinating that people can spend their whole lives, not even, you know,
looking, looking in and hearing what the words that keep showing up in our brains are.
I didn't do my first meditation retreat, I think, until I was 32.
Well, you beat me.
20 years.
I did mine just a few years ago, so I was probably 39 or 40.
And for those people who want a little comedic relief,
one of the terms that one of the coordinators used, I don't know if it was Jack
Kornfield himself, he was there at Spirit Rock. It may have been one of the other teachers,
but they joked about Vipassana Vendettas, where people in the room would become so preoccupied
with the person 10 feet away who's coughing too often or who's like shuffling too often or
has like the noisy jacket with the zipper and it becomes this sort of obsessive focus,
which happens all the time in daily life. It's just not as obvious.
It's totally true. It's funny you mentioned this because, you know, when you're on a meditation
retreat, you're in silence for days. And what I find fascinating is the way that for whatever
reason, you just kind of, your mind locks onto people and you start making judgments about
them. Like you think like that person over there, oh, they just think this, or like, look at the
way that they, you know, serve themselves food quietly. Like they're just a slob or, you know,
like whatever the thing that comes up. And then what's funny is like, I don't know if you
experienced this, but in the last day of my meditation retreat, obviously we had this little,
we started finally talk to each other.
And you get to know who people are, and you realize just how off base you were.
And these invisible, how quickly our minds jump to conclusions about people we have literally never talked to.
We've never inspected the contents of their mind.
We're just, we get obsessed with it.
It reminds me of another attention exercise I did at Burning Man once that was really
powerful, actually.
Like if you're ever in a group setting, this is a super meta mind kind of podcast interview.
So hopefully listeners don't find this too conceptual and abstract, but it's actually
really fun stuff.
I mean, our attention is so profoundly happening without us really realizing it.
But this exercise I did, you're in a room of
people, like 30 people, and you're walking around in silence. And then you kind of stand on the edge
and you're led by a facilitator to first look around the room. So there you are looking at all
the 30 people and looking around the room and it says, they'll say like, so first just look around
the room and notice who you have noticed. Like, notice that there are certain people, certain faces, that draw a lot more of your attention than other people. Right, like in a room of 30 people, you would think like,
oh yeah, we just, we pay attention to all 30.
But actually, if you look closely,
your mind is actually paying attention to a subset. For whatever reason,
there's a subset of people who you find more interesting.
Second question was, or second prompt was,
look around the room and now notice the people that for whatever reason you don't like. Like, you don't even know why you
don't like them. You just, or you're just not interested to connect with them or you would not
want to be with them or talk to them. Just notice that there's some people you've already selected
that you don't want to talk to. And isn't that interesting? Like, what about them
has you feeling like, I don't even want to talk to them. And then the third prompt was,
look around the room and notice all the people you didn't even notice. They're like the people
in between, the faces your mind completely skips over. And you don't even notice that you're doing
that.
And it's a really profound exercise. There's some other steps to it, but it really shows you that your mind is living inside of this selection filter that is pre-selecting certain bits of
information to reach your conscious awareness, and then hiding lots of others, and also polarizing
you against other people or sources of information. And you don't even know why.
You're just living inside of that hammer that's wanting to treat everything like a nail. But you
don't even know the direction of the hammer and that there are lots of nails.
Yeah, definitely. And I was also, as you're talking about this, these selection filters,
right? And The 22 Immutable Laws of Marketing, for people who want to look at the power of words through a different lens. And this came up for me, actually, I should say this person, Frank Luntz, came up for me.
Oh, I know him.
Yeah, yeah. So he's come up for me in a few different scenarios. One, a friend of mine, actually a mutual friend of ours, but I won't name him by name, certainly a very socially liberal guy, recommended, I think it's Words That Work.
I think that's the title.
Words That Work, yeah. It's Not What You Say, It's What People Hear is the subtitle.
Right, by Frank Luntz. And he came up recently because I was watching Vice,
the movie about Dick Cheney. And so Frank Luntz, for those people who don't know,
and I'm reading directly from Wikipedia here,
he's an American political consultant, pollster, and public opinion guru
best known for developing talking points and other messaging
for various Republican causes.
And I'll skip a bunch of it just to give a few examples.
He advocated use of vocabulary crafted to produce a desired effect, including use of the term death tax instead of estate tax, and climate change instead of global warming.
Those are really powerful vocabulary reframes. Really, really powerful, if you think of the implications of those reframes.
Totally.
And this is, I mean, we can certainly chat about Frank and the power of words, but the sort of meta, so feel free to jump in with anything you'd like to say. Yeah, I mean, I love you bringing this up.
I mean, I hope this, again, isn't too meta,
you know, trippy for people listening.
There's so much focus on language, but it does shape everything.
I mean, again, if people think climate change
versus global warming, the whole point is,
well, climate's always changing, right?
There's nothing to worry about
because it's always changing.
It's a neutral statement.
Another one. So Frank is, you know, he's often thought to be on the right, and there's this guy George Lakoff, who's on the left, who wrote a book called Metaphors We Live By. And he's like an academic linguist who, you know, has talked about the power of grounding metaphors. So a grounding metaphor is, if you think about something like the nation as a family. So invisibly, when we think about the nation,
it's structured, at least in English, as part of the family.
So we don't send our sons and daughters to war.
We don't want those missiles in our backyard.
There's a third one, too, I forgot.
Oh, shoot, our founding fathers. Yeah, our founding fathers, they're our fathers, really? Are they really our fathers? So invisibly,
we have this baked into our language at a structural level that organizes almost like a geometry of meaning about how we see the nation. Those are our sons and daughters. Those
are our founding fathers. This is our backyard, our property, you know. And it conjures up a whole bunch of assumptions
about how we see the world that then structure, you know, entire political beliefs about whether
to go to war and all this kind of stuff. And so, as you've said, it's like language is profoundly
shaping not just, like, our own, you know, consequences, and what you see on a
meditation retreat, but, you know, world history, and whether or not we tackle something like
climate change, or we go to war with Iraq. These are really, really big deals. And I think that we
have to gain literacy for our minds. I mean, I actually think, I mean, this is kind of the essence
of our work now is that, you know, fundamentally, we're at this point where if we can't see our own psychological, you know, what's the words? I mean, if we can't see the way
our minds are structuring information, and we are just simply, as you said before, you know,
like run by them, like they're the automatic process that runs ahead of our choices,
then it's already done. Like it's already checkmate because we're already, you know, being led by things that don't produce, you know,
choice making that averts the kind of catastrophes that I think that we all want to avert. And I
think this is, you know, my co-founder of the Center for Humane Technology, Aza Raskin, says,
the way to win the final war is to make peace with ourselves, that this is the architecture,
like this is how we work.
And the only way we're going to,
you know, either get over ourselves
and, you know, take those risks
to make the choices
we want to make in our own lives
is by understanding ourselves better.
And the only way we're going to
solve civilization's problems
is by, you know,
gaining an understanding
for the things that would stand in our way.
I agree.
And we're going to, we are going to segue to technology very, very briefly.
I want to, again, encourage people to, as a way to become more familiar with the words
that you are using and the language you're using, which is basically this, you could
think of it as the software that you're running in a sense, which is really important.
Like you might want to inspect that code.
Yeah.
Is to take a look at Byron Katie's The Work. And the 21-day no-complaint experiment is also a great way, by focusing on one particular category of language, to become meta-aware more broadly of what the voice in your head is actually telling you all day long.
And technology. Let's talk about how you first came to know BJ Fogg. Maybe this is a place to start, and then we can leapfrog all over the place from there.
Who is BJ Fogg?
Sure.
And do you know BJ, by the way?
Just curious.
I do. I haven't spent time with him in years, but there was a period of time when I was living in Mountain View that we had a chance to spend a decent amount of time together.
And we have spent some time together, and we just have actually recently started emailing again.
Oh, cool. Yeah.
Yeah, so BJ is a psychology professor at Stanford,
and he ran something, I think continues to run something,
called the Stanford Persuasive Technology Lab
that basically applies everything we know about the psychology of persuasion to technology.
And basically you're asking the question in the lab, how can technology persuade people's attitudes, beliefs, and behaviors?
And a lot of alumni have come out of this lab. I mean, I was project partners with the co-founder
of Instagram, Mike Krieger. A lot of people went on to work at LinkedIn and Facebook and
the early social media companies because this was the perfect set of tools to apply to the way that we design technology. But in the lab, you
know, you study everything from clicker training for dogs, like, how do you, you know, train a dog to do the behaviors you want and not the ones you don't want. We read a book called
Don't Shoot the Dog by Karen Pryor. Amazing book. Amazing. Yeah. Oh, you know this one? Oh, yes,
I do. Yeah, I recommend it to everyone.
Yeah, it's funny. I mean, it's like I programmed myself to enjoy boxing and kickboxing because I just get a smoothie right afterwards. It's sort of Pavlovian conditioning, a clicker train in the form of a smoothie. And you learn social psychological dynamics. I remember Cialdini, I mean, a lot of the marketing stuff that you have already pointed out to many of your listeners, I'm sure. But it's really just a study,
again, of the code. This is like delving into the code of the human mind. And this is what we find
persuasive. And this is in 2006. So it's the year before the iPhone. So the iPhone hadn't even come
out yet. And we had a class on persuasion through
video and through mobile apps. And the founder of Instagram and I, before he had anything close to
the idea for Instagram, we worked on applying these principles for good. That's the thing
people get wrong about the lab. They think it was this sort of diabolical training ground,
evil psychological manipulation, tech leaders or something like that.
And it wasn't that way at all. It was actually a really powerful, you know, three hours once a week
deep dive into this, this world and asking the question, how would you use it for good? So the
founder of Instagram and I worked on this thing called send the sunshine, where, you know, we
thought, well, what if we could persuade people in a way that alleviated depression, but using our social psychology. And so, and this is again, before the iPhone. So
imagine the kind of thoughts you'd had to be thinking back then. But, you know, the idea was,
imagine there's some server that knows that there's two friends who are friends, and they
have both their phone numbers. And it tracks the zip code of one phone number and realizes that
you've been in a place, you know, with bad weather for six days in a row. And because we know from seasonal affective disorder, that's a big deal. Just having bad weather for a while can kind of put someone down. And so what if, upon hitting that condition, it then sends a text message through something like a Twilio, you know, this is before Twilio too, and sends it to your friend Mike and says, hey, would you take a photo of the sunshine and send it to your friend Tristan, who's had bad weather?
And the idea is we'd just be sending each other the sunshine.
And this was a really nice idea behind alleviating depression.
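To make the trigger logic concrete, here is a minimal sketch of the kind of check-and-nudge flow being described. It is purely illustrative: the friend records, the weather lookup, and the SMS helper are hypothetical placeholders (Twilio did not even exist yet in 2006), not anything the class actually built.

```python
# Hypothetical sketch of the "Send the Sunshine" flow described above.
# The Friend records, weather lookup, and SMS helper are illustrative
# placeholders, not a real implementation or API.
from dataclasses import dataclass

BAD_WEATHER_DAYS = 6  # days of bad weather before nudging a friend

@dataclass
class Friend:
    name: str
    phone: str
    zip_code: str

def consecutive_bad_weather_days(zip_code: str) -> int:
    """Stand-in for a weather-history lookup keyed by zip code."""
    raise NotImplementedError

def send_sms(to_phone: str, body: str) -> None:
    """Stand-in for an SMS gateway call (e.g., something like Twilio)."""
    raise NotImplementedError

def maybe_send_sunshine(gloomy: Friend, sunny: Friend) -> None:
    # If one friend has been under bad weather for about a week, ask the
    # other friend to snap a photo of the sunshine and text it over.
    if consecutive_bad_weather_days(gloomy.zip_code) >= BAD_WEATHER_DAYS:
        send_sms(
            sunny.phone,
            f"Hey {sunny.name}, would you take a photo of the sunshine and "
            f"send it to your friend {gloomy.name}? They've had bad weather "
            f"for almost a week.",
        )
```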
There are all sorts of positive applications like that we thought of, around helping people go to the gym and meet their goals. BJ has this nice model for behavior, B = MAT, which is behavior equals motivation times ability times trigger.
So whether or not someone does a desired behavior, like going to the gym, involves them being first motivated, then having the ability.
Like do they have, you know, if they're trying to go to a boxing class, do they have a pair of boxing gloves and the clothes and the shoes?
Or are they staying with a friend where they don't have those things?
So they have to have the second ability.
And then the third is, is there a trigger?
Is there an opportunity?
Is there a moment?
Is there a snap of the fingers?
Is there a ding on your smartphone?
Is there a reason why right now you should consider doing that behavior?
And if you have all three of those things aligned, then people will do it.
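As a rough illustration of the B = MAT idea (behavior happens only when motivation, ability, and a trigger line up), here is a tiny sketch. The 0-to-1 scales and the threshold below are invented for illustration and are not part of Fogg's actual formulation.

```python
# Toy illustration of BJ Fogg's B = MAT model: a behavior occurs when
# Motivation x Ability coincides with a Trigger. The 0-1 scales and the
# activation threshold are invented purely for illustration.
def behavior_occurs(motivation: float, ability: float, trigger_present: bool,
                    activation_threshold: float = 0.25) -> bool:
    if not trigger_present:
        return False  # no prompt, no behavior, regardless of motivation/ability
    return motivation * ability >= activation_threshold

# Motivated to box, but staying with a friend and missing the gloves and shoes.
print(behavior_occurs(motivation=0.9, ability=0.2, trigger_present=True))  # False
# Same motivation, gear at hand, and a ding on the smartphone as the trigger.
print(behavior_occurs(motivation=0.9, ability=0.8, trigger_present=True))  # True
```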
So we learned all these kinds of things. But, you know, this also became relevant in the story about Cambridge Analytica, because I remember back in that class there was one segment on the future of persuasive technology and ethical persuasive technology. And there was one student group that came up with the idea, well, what if in the future you had a profile on every single person on earth? And the profile was specifically, what does their mind respond to that is persuasive? What's their unique map? And what is their set of
psychological biases? If you said, well, Harvard School of Medicine said that this thing is true,
you know, that would be persuasive for them because appeals to authority work with them. Or
if for you, Tim, I said, hey, you know, Eric Weinstein said X, Y, and Z, you know,
we both know Eric Weinstein, he's a really smart guy. You know, each of us are responsive
to different stimuli. And what if in the future you had this
map of what is perfectly persuasive to each person? And then we built technology that would
automate persuasive messages based on your unique characteristics. And this is actually exactly what
Cambridge Analytica later was, right? It uses your Big Five personality traits. If you don't know the Big Five framework, it's OCEAN: openness, conscientiousness, extraversion, agreeableness, and neuroticism. And so, yeah, I won't go into the details. But basically, based on your personality
traits, you would deliver different political messages. And that's what happened in the 2016
election. You know, so this all relates to the conversation
we just had about language and about Byron Katie and beliefs. It's because once you understand the
code and you can dip into the code, it's incredibly dangerous what you can do with that. Because if
you think about what you do when you wake up in the morning, it's the product of what's running, the software that's running between your ears. And this is the kind of stuff that we studied in BJ's lab.
So I have so many things to ask.
Yeah.
It's a super, super helpful background.
BJ's a good guy. I just want to reiterate something you said, which is, this is not Dr. Evil's lair for, you know, malevolent 20-year-old code wizards. And BJ actually, in other classes, focused on things like
peace and world peace, and it was difficult to get people – this is a great example of language.
It was very difficult to get people to agree on what that actually meant.
So he would focus on defining antecedents.
What are some components, antecedents,
that would be necessary to lead to what anyone in the class
would consider world peace?
And then he was able to get people to agree on some of the smaller antecedents, and that ended up being the focus of the class, right? So it's a very smart way to approach it. He's a good guy, so I want to underscore that.
And to just add onto what you're saying, I mean, do you know, the full story of the peace thing was awesome. He actually had, for a while back in, I think it was 2006 or seven, multiple tech companies start a peace-dot domain. So it was like peace.facebook.com, peace.linkedin.com, peace.couchsurfing.com.
Yeah, he petitioned a few of the tech companies. And the idea was, could they
each do something that would be the way that they're contributing to world peace?
And so with Facebook, they had a running wall of new friendships and connections formed between
Israelis and Palestinians. It was like a live feed of how many new relationships had formed in the last, you know, whatever day or something like that.
And Couchsurfing, the CTO of Couchsurfing, actually was my collaborator on this Time Well Spent initiative, which maybe we'll talk about later, which took over Facebook and Apple and Google in terms of some recent changes that they've been making to their products. You know, he had started or worked on Couchsurfing, which was a website before Airbnb for finding a free place to crash when you were trying to stay with a friend. And they also were part of this peace-dot initiative that BJ started,
and they showed, I think, the number of people who had stayed on each other's couches
that were also from different ethnic backgrounds
that would have been otherwise at war or something like that.
And so to BJ's credit, and so people really understand and get this, it was not a diabolical Dr. Evil lab for training psychological manipulation. It was explaining the techniques, and he also petitioned the FTC in the late 1990s about the need for ethics in persuasive technology. But I just wanted to make sure people got that before we go deeper.
Yeah. And on top of that, or just to add to that, technologies are tools.
And tools of almost any type can be used, misused, abused.
They can be applied in many different ways. And one of the questions I've been
kind of dying to ask you is focused on incentives. I mean, we have so many different directions we
could go with this conversation, but ultimately, when I read a lot of what you've written, or listen to you speak, it becomes clear, at least to me, that much like the quote you sometimes use from sociobiologist E.O. Wilson, the real problem of humanity is the following: we have paleolithic emotions,
medieval institutions, and godlike technology, right? And so, this hypothetical situation
that was more of a thought experiment or a question from students in BJ's class,
then manifested in a political campaign, can really paint a foreboding picture of the future,
right? This very dystopian picture. And what I'd love to
hear from you as we look at some of the risks involved, where companies who are fueled and
driven by advertising-based models have cognitive neuroscientists, PhDs, I mean, armies of highly intelligent,
trained people developing
highly intelligent, trainable technology
to predict us better than we can predict ourselves.
Yeah.
How do you incentivize companies,
engineers, et cetera, to do the right thing?
And I mean, it's presumptuous to say that I know what the right thing is,
but let's just say that for the sake of argument that we agree that,
as you've noted, or at least the data reflects, let me find it here,
because I have a note here that just is horrifying
when I look at it. Here we go. So a few examples, right? And feel free to fact check any of this
stuff. But with over a billion hours of YouTube watched daily, 70% are from the recommendation
system. The most recommended keywords in recommended videos were get schooled, shreds,
debunks, dismantles, debates, rips,
confronts, destroys, hates, demolishes, right? So we have this extremism reflected in technology,
which we could talk about, whether that's a reflection of or is informing mass behavior. But here are the ones that really paint a terrifying picture for me. I'll only give two examples. In 2018, if you were a teen girl
starting on a dieting video,
YouTube's algorithm recommended anorexia videos next
because those were better at keeping attention.
And then one more,
this was from a New York Times article,
adults watching sexual content were recommended videos
that increasingly featured young women,
then girls, then children playing in bathing suits. I mean, it's just like, oh. It can paint a horrifying, terrifying picture.
At the same time, I know people who work at all these big companies,
as you do, and on a one-on-one basis, these are good people.
But the business models, sort of the incentives to shareholders and so on, are such that these seem like almost predictable side effects, perverse side effects of the incentives that are in place. So how do you incentivize the people who are in the driver's seat, putting these things together, to change this?
Yeah, well, I'm so glad you laid all that out.
Because, I mean, that last thing you said there, which is that we shouldn't even be surprised by these consequences. I mean, they are the direct consequences. You know, we always say these harms are not by accident, they're by design. They're not by design by the people, like you said, the good people. There's no one at Facebook or YouTube who's like, hey, how do we make this recommend as many pedophilia-style rabbit hole videos as we can, or let's recommend white nationalism, or let's recommend, you know, the most extreme sort of hate-inducing speech. That is not what anyone at these companies wakes up and does. But we have to recognize this race to the bottom of the brainstem, a race to the deepest paleolithic instincts toward tribal warfare, tied to survival: we're under attack, the other side is going to come get us, we've got to get those immigrants. This is our nature. And a race for attention is a race into those consequences.
And you have to resonate at a deeper level than the other guys.
And so the game theory progresses so that you have to go deeper into social validation.
You have to go deeper into self-worth.
You have to go deeper into tribal warfare language.
And so, just to first lay out that these consequences are predictable and a direct consequence of that business model. When you say the business model, we should also
be clear, it's not like the advertising business model causes this. It's not the rectangle that is
the ad, the Nike shoes that are causing outrage and polarization. It's more the engagement business
model. The fact that I am not, as YouTube or Facebook,
a neutral tool waiting here like a hammer,
waiting to be used just when you want me.
I actually have a necessity. I'm like a hammer sitting here with a stock price that depends on you using me in particular ways, toward particular nails that cause other hammers to be activated so that other people keep using it. And I have $500 billion at stake in keeping people using these hammers in particular ways. And that is the disincentive.
That is the subversion of autonomy that is directly coupled with the success of the product,
the success of the business model, and the subversion of the social fabric, unfortunately.
And so in terms of your question, the first thing I
wanted to do is to make sure that we're all clear that this consequence, you know, falls directly out of the business model. Because, you know, I've been working in this field for a long
time, and it's taken a while for the world to accept that that is the case. I mean, at the
beginning, I had conversations with people at some of these big attention engagement seeking companies, you know, five, six years ago saying,
hey, I think, you know, the business model here is addiction. The business model here is
whatever works at getting attention. They're like, yeah, you might be right, but
maybe culture will wake up and see that on its own. There was never a sense of responsibility on the part of some of those people. And I think that's part of what we've had to do is just make
it utterly clear that this business model does cause predictable harms at scales that are really
hard to fathom. But now comes the question of like, okay, so now we recognize that,
what do we actually want to do about it? And, you know, I think anybody, you know, like you,
you were here in Silicon Valley 20 years ago. And I think it was, how long were you? It was like 15 years ago you were here?
I was in Silicon Valley from 2000 onward, up until about a year and a half ago.
Okay, right.
But I just mean the 2000 period to 2010-ish, you were in the thick of it.
I was.
Yeah.
And I think the point being that all the people I know and the founders of Instagram and my friend,
who was early at Mozilla and started the Center for Humane Technology with me, we all got into the industry not because we wanted to create something big, I don't know.
Maybe this is unusual, but we actually wanted just to help people.
We wanted to build really empowering tools, technology that's more like a cello.
Go back to the days of the Macintosh, where it's a bicycle for the mind. The whole point of what a computer was and Steve Jobs' idea was,
if you take a human being and they've got their own locomotive capacity to expend some energy and
then move a certain distance, and they're not very efficient compared to the condor, but if you give
them a bicycle, suddenly a human can use a little bit of energy with their legs and the pedals,
and they're going further than a condor in terms of the locomotive efficiency.
And so his metaphor was technology could be a bicycle for the mind.
And I'm all for that.
And that's what so many of us got into this industry to do.
But then somewhere along the way, the set of incentives that were at play forced the thing we would monetize to be human behavior. And that's where the first problem comes in, that success with the Macintosh was not directly tied to how many of your friends I could sign up to using it,
and then getting them clicking on things
and sending you notifications about when they click this desktop icon
versus that desktop icon.
There was no problem with Adobe Photoshop.
There was no problem with Microsoft Word.
Microsoft Word wasn't tilting the world towards conspiracy theories or algorithmic extremism, and it wasn't sending you notifications about when your friends did or didn't check the Word document that you sent them. I mean, you didn't have any of this stuff. So the fundamental place that we went wrong is when we attached financial success directly to the capturing of human behavior, the controlling and shaping of human behavior. Because that's where the persuasive technology stuff comes in, because those principles
became applied to how do I keep you engaged. And so if you take an example like the follow button,
you know, if you remember, you know, Twitter and Instagram were two of the first services that did
this, where instead of just
adding someone as a friend, which is the Facebook model, a bi-directional connection model,
followers, that follow button and model created a reason why you would always get new email.
Every day you get new email being like, you've got two new followers, you've got five new
followers, you've got six new followers.
And you always want to say, oh, I wonder who followed me today.
And so that was this beautiful invention that got people coming back and ultimately to become
addicted to getting attention from other people. And the same thing with the like button. So,
you know, instead of working to capture your attention directly, it was much cheaper to get people hooked on seeing how much attention they got from other people. Because now, I don't have to do anything to you, you are autonomously going back to see how many views
did I get on that YouTube video? How many views did I get when I played that video game and I
posted it on Twitch? How many views did I get, likes did I get when I put that post up?
And so I think that's where we went wrong, is when we tied business success and billions of dollars
to the amount that we captured attention. And we have to go through
a mass decoupling between business success and capturing human beings. And that's going to be an uncomfortable transition. It's a big transition. I think that's of the scale of, you know, going
from an extractive energy economy of fossil fuels to a regenerative energy economy. You know,
the metaphor we make is,
there are only so many environmental resources when you're drilling for oil. And that worked great at
generating a whole energy economy that gave us all this prosperity. But now, unless we want to
deal with climate catastrophe, we got to switch to a regenerative energy economy that doesn't
directly couple profit with extraction. And the same thing is here, except the finite substrate that
we're extracting from is our own brains, like we're scooping out the attention from ourselves,
because it takes nine months to put a new human being into the attention economy.
And we have to decouple this relationship where profit is directly coupled with the extraction
and move it to a more regenerative model where we are not the cow or the product, but we're the customer.
What might motivate or force, say, a Facebook to change their model? In the sense that if you look at Wall Street, as a metaphor for investment, and I'm not going to say all investors are immoral, that's not true at all, but a lot of them are somewhat morally agnostic, in the sense that if Facebook can better and better monetize the capturing of attention, this non-renewable resource of the mind. Yeah. Money will flow into Facebook.
And then Facebook will be positively reinforced
and rewarded for doing what we're describing, right?
Yeah.
So how do you, is it possible to divert the flow of that river?
I mean, what would, is it going to take high level policy change?
What levers could be pulled that would catalyze a change?
Yeah, well, I think just to name very concretely, what you're pointing out is that
this, you know, all the incentives point to continuing this sort of self-extraction, right? So why would we stop
scooping the attention out of ourselves, destroying democracy and debasing our mental health
when that's the thing that makes the most money and Wall Street's not going to stop funding it?
So to deepen that example you're giving, back last year in August, when Facebook, sorry, when Twitter shut down 73 million fake accounts. So these were, you know, what are called sock puppet accounts or fake accounts.
It could have been Russian bots.
They could have been whatever.
They should have been rewarded for taking down these 73 million accounts.
But of course, what happened when Wall Street saw this
was that their stock price had previously been tied directly
to how many users they have.
So when they take down 73 million accounts,
they're like, oh, well, your company's worth a lot less than before.
But we actually have to do the opposite,
which is that we need to reward the companies
for basically having a high integrity public square.
And there's so many different facets to this, Tim,
but to answer your question,
we're going to need policy
that basically helps this decoupling process happen.
We're going to need shareholder activism that puts board resolutions on the companies to make this change.
We're going to need internal employees advocating for this change, saying, hey, I want to move to a more regenerative model that's like the equivalent of people last year advocating for time well spent, which ultimately became part of the design goal for Mark Zuckerberg
and Facebook in 2018. So it's a transition. It's just like moving from fossil fuels. Exxon does
not have an incentive to not be Exxon. And sometimes we wake up in these uncomfortable
circumstances where our business model is based on a thing we didn't know was bad at the time,
but we're starting to realize it was bad.
And an uncomfortable metaphor I've used for this in the past,
it's like, let's say you run the NFL,
National Football League, like great sport.
We've been doing it for decades and decades and decades.
And lucky you, you're CEO of NFL.
And one day, your sports scientist health guy
comes up to you and says,
hey, I think that when we smash people's heads together like this, it's causing concussions.
And you wake up and realize that your business model is smashing people's heads together
and selling it against advertising on TV.
And it's kind of the essence of the sport.
And no one wanted it to be this way.
But that's where we landed.
And now what do we do?
And it's really hard. I mean, everyone's going to try to put in the padding and we're going to try to increase
the safety standards and you do whatever you can, but at some point, the essence, the existential
essence of what football is about is that, you know, is this sort of process that does endanger
people's heads. And I think that's a situation that we're near now,
which is that we can't ask for internal change
from companies whose entire incentives are otherwise.
But with policy that decouples success,
we can talk more about that,
but there's some ways to do it from the outside.
I'd love to talk more about this.
And this is relatively new territory for me. I mean, not as a user, but certainly at a policy level, or from a replacing-the-business-model perspective with some of these gigantic companies. You have far more time in the trenches than I do. What are any of the kind of Archimedes levers or proof points that could cause a shift, if any such thing could exist? For example, is there a company that is pursuing a different model, though they could use the attention extraction model, that, if they succeed on a large scale,
could beget a trail of similar companies
or provoke a change in business model
at some of these other companies?
Are there any particular models to mimic
or companies that are doing something that reflects a viable alternative?
Or is it really just blank canvas at this point?
Yeah, I mean, we're in uncharted territory because we have this situation where there's a monopoly on attention
among a handful of major technology companies. Facebook, Twitter, YouTube, Snapchat, Instagram, WhatsApp own the attentional environment. There isn't an alternative place to reach
10,000 people when you want to upload a video. You can't just get that same level of audience
when you push it to Vimeo. These are attention monopolies, which is why one of the issues
and one of the fundamental things we've got to deal with is competition. One reason we're not getting different business models is you can't
compete and get access to that same attention monopoly. So we need, this is where Chris Hughes,
the Facebook co-founder, writing that op-ed in the New York Times saying we have to break up Facebook,
is there needs to be more diverse ways of people competing to produce products
that are of different business models that support society's well-being,
that better protect the public square.
But then the response from the tech companies is going to be,
I think Zuckerberg said that they spend more money on protection and trust and abuse
and Russian misinformation protection and trust and safety and all that stuff
than all of Twitter's revenue combined.
So take all of Twitter's revenue in a year, and Facebook spends more on trust and safety than Twitter gains in a year in revenue.
So that puts us in this uncomfortable position where it's going to cost us something. We can't
just do it, you know, and this is kind of like Anand, um, I forget his last name, but the book Winners Take All, you know, we keep looking for these win-win solutions, but sometimes we have
to lose a little bit so that everybody wins. And that's not a good message for, you know,
capitalists, because that's not how we like to roll. But, you know, sometimes it works that way
with organic food, right? Like you realize that maybe regular food isn't so good, and we want to
get the clean food that's better, that's organic, it doesn't have the same pollutants,
even though there's some marketing and narrative that's baked into that assumption. And we can sell
it for a higher price. So the thing that's good for people, we can actually make money off of in
a premium product. But in this world, these are the products that run the public square, that run
the world belief system.
So talking about beliefs, like we were for the first however long of this conversation, just consider that YouTube shapes more than a billion hours of watch time daily. And there are 2 billion people who use it, which is more than the size of Christianity in terms of a psychological footprint.
Facebook is 2.3 billion people.
YouTube is 2 billion people. If you add up
Instagram and WhatsApp, it's another billion or so. So you're talking about a couple of
Christianities of psychological influence total. This is an insane level of psychological influence.
So we better be really thoughtful. This is why I think, from my background, I mean, where I look
at these things from is let's get really nuanced and hold up a microscope to what these things are doing to the psychological timelines of people.
You know, what happens in your nervous system, whether it's with the word climate change or the word death tax or the word, you know, send our sons and daughters to war.
You know, between, you know, two billion people going down a railway where, if you pull the lever, they experience this set of consequences on YouTube. And if you don't pull the lever, they experience that set of consequences.
That's like the trolley problem in philosophy.
That's kind of what I was thinking about when I was at Google as a design ethicist is how do you ethically shape 2 billion people's thoughts where you don't even really get to make that ethical decision because your business model and your incentives are making that decision for you.
And this is where we have to decouple it. And we can talk about some concrete solutions. I mean, Apple, by the way, is kind of the government of the attention economy. They're
like the central bank. And people don't look at them that way because they're just making this
product called the iPhone. But they control the dials on basically what it means to get attention from people.
And with the App Store policies on business models, and things like Screen Time that help you limit how much time you're spending, there are ways in which, from the top down, you can change
the incentives or do some quantitative easing on how people navigate through an incentive system
that's fundamentally about manipulating their attention.
But then there's some deeper changes that we can talk about, too.
Yeah, let's get into it.
What are some deeper changes?
I like the sound of it.
We'll see.
I know.
I'm sort of giving you an opening here. I'll swing at the soft pitch.
I'll take it.
Feel free to jump in. One simple example is what happened with energy companies and utilities in the United States.
So it used to be, if you think about it, energy companies make more money the more energy you use.
So technically, you know, if they want to rake in more profit, they're incentivized to have you leave the lights on, leave the faucet on, leave the shower on, just waste as much energy as possible, because that's how they make the money. Right. Okay. And clearly that's not right. You don't want a world where we basically profit from our own self-destruction, except that's kind of what we're dealing with here in all these circumstances. So what happened with energy is, I think at least half of US states went through this decoupling regulation. Energy companies profit linearly the more energy you use. So you use a little bit more energy, that makes them more money, more energy, that makes them more money. And then at some point you hit a tier where they want to disincentivize you from using more. So they, say, double charge you. So now you use the same amount of energy, but they're charging you twice as much. So that disincentivizes you from using it.
Except they don't hold on to all of the profits from that 2x cost.
They instead reinvest that extra cost into a renewable fund, a fund that basically invests
in renewable energy infrastructure.
So in other words, the disincentive to use more energy is used to fund
the transition to renewable energy. And now you can imagine something similar happening with
technology where you can have an attention or advertising-based business model. I'm not saying
this is the solution I believe in, by the way, but I think this is a piece in the toolkit,
is you can have a situation where you make money the more attention you get from someone, but only up until a certain point, because beyond that point, you're basically
incentivized to create mindless consumption and zombification and teen mental health problems and
loneliness and the whole thing, right? So you can imagine a world where we decouple attention
success from business success, decouple the capturing of human behavior and manipulation
of human behavior from business
success, and then most importantly, to reinvest that money into the equivalent of what renewable
attention, renewable human life things would look like.
And that could happen.
That's something that you could help regulate with laws.
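To make the utility analogy concrete, here is a toy sketch of that decoupling scheme: linear billing up to a tier, a doubled rate above it, and the surcharge routed into a renewable fund rather than kept as profit. All numbers are made up; this shows only the shape of the incentive, not any actual regulation or proposal.

```python
# Toy sketch of the decoupling idea described above: usage below a tier is
# billed normally; above the tier the rate doubles, and the extra charge is
# routed to a renewable fund instead of being kept as profit. Numbers invented.
def decoupled_bill(units_used: float, base_rate: float = 0.10,
                   tier: float = 500.0) -> dict:
    normal_units = min(units_used, tier)
    excess_units = max(units_used - tier, 0.0)

    revenue_kept = (normal_units + excess_units) * base_rate
    renewable_fund = excess_units * base_rate   # the doubled portion goes to the fund
    total_charged = revenue_kept + renewable_fund

    return {"total_charged": total_charged,
            "revenue_kept": revenue_kept,
            "renewable_fund": renewable_fund}

# 800 units: 500 billed normally, 300 billed at double rate, with the surcharge
# (300 * 0.10 = 30.0) earmarked for renewable infrastructure.
print(decoupled_bill(800))
# {'total_charged': 110.0, 'revenue_kept': 80.0, 'renewable_fund': 30.0}
```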
Let me ask a quick— I'm sorry. Go ahead. Don't lose track. Yeah, Paul Romer, he's a Nobel Prize winning economist, had proposed something
recently called an attention data tax that has some similar characteristics that you want to
progressively price the attention companies because they have this bad incentive.
Where would you, if it were up to you,
where would you apply those funds?
So, I mean, in the long run,
I think that you can't have...
You know, I said this on some other things.
I mean, I know you've had Mark Andreessen on the podcast,
and he has this line that's very famous from 2011,
that software is eating the world.
Because fundamentally, it's like, okay, you could have taxis and our whole transportation infrastructure run without software, not done with any intelligence, no demand-supply matching, et cetera. Or you do it with technology, and you get all that efficiency. Of course software's going to eat the world. It's going to eat up everything.
It's going to eat up media.
It's going to eat up advertising.
It's going to eat up taxis and transportation.
It's going to eat up every domain of life because it can always do it more efficiently.
But if you think about it, what that means is, take an area like Saturday morning cartoons.
So that used to be run not by software.
It used to be run by some human beings and some laws and editors curating what happens for children. But then you let YouTube just gobble up Saturday
morning, and it also gobbles up with it all of the Saturday morning protections.
And so as software eats the world, like, for example, Facebook: we used to have equal-priced campaign ads on TV, as regulated.
So Tuesday night, 7 p.m., it should cost the same amount for Hillary Clinton and Donald
Trump to reach the same audience.
Otherwise, it'd be unfair.
It wouldn't be a democracy.
But you suddenly let Facebook gobble up election advertising, and now the price has no assurances
that it's going to be the same equal price.
So as software starts eating the world, what happens is private incentives eat the world. We lose the public protections. So to answer your question
about where it goes for renewable funds, we have to have some notion of things that are built to
serve the public interest and not just private interests. I know this is happening in some
discussions around AI where past a certain
amount of wealth creation, because these AI things, once you really let them go, can generate
so much wealth by continuing to produce innovations and efficiencies and revenue and all this stuff
that after some certain point, shouldn't we just give that money back to the people? Give that
money back to, instead of extracting from us, shouldn't it be ultimately for improving the
greater lot for all of society? And I think that's something that we might feel uncomfortable
with, but we have to do with these large technology companies. Because if they're running a
constant for-profit shareholder maximizing extraction racket, and they've got to keep
maximizing and they've got to keep extracting, there's never a point to the end of their growth.
It's no wonder that they're over-extracting from democracy and mental health and kids and all this other stuff if they have to keep growing their footprint of attention.
I suppose also, I mean, this is just me kind of talking out my ass for a second, but it's
a bad habit I have, so here we go. Even if one can't settle on a plausible alternative, there could be a reasonable consensus that past a certain point, which would be tricky to define, some of that money is applied to, say, some mechanism for trying to alleviate teen mental health issues, let's just say, or fill in the blank, right? To try to offset the damage that is being done, at the very least.
And that could be, you know, at least a
possible discussion for a plausible stopgap until a viable supplemental model or alternative model
is found towards which things get steered through some type of, I suppose it would have to be
policy or regulation or something along those
lines. Yeah, well, what I hear you saying fundamentally is about, you know, this is a
classic externalizing harm model, right? Like, you know, so oil is the most profitable, you know,
form of creating energy and moving it around the world and, you know, portable and all these great
things. But so it makes the most economic sense to go with oil,
except if you account for the externalities,
if you account for the balance sheet of society,
the balance sheet of the commons,
the balance sheet of nature,
which get hurt by this seemingly most efficient,
cheapest form of energy.
And the same thing is true of advertising.
It feels like, well, why in the world would we,
you and I, Tim, pay for Facebook when it's free?
I mean, why in the world are we doing it?
The problem is the harm shows up on the balance sheet of our sleep, of our collective democracy, of our public sphere, of the quality of our sensemaking, the information ecology, mental health.
It shows up everywhere.
And so what I hear you saying is, hey, well, let's at least put a fund aside to pay for some of those externalities,
almost like carbon offsets or mental health offsets or democracy offsets.
But the challenge is, you know, wouldn't it be better? There's this joke about capitalism: capitalism would prefer to give you diabetes and then get you subscribed to a profit-maximizing diabetes cure, where I keep you on a subscription and make money as I sell you the solution, versus just not
creating the diabetes at all in the first place. And I think the question is, how do we create
systems that don't create the diabetes, the informational diabetes, the democracy diabetes,
the mental health diabetes with technology in us? How do we not do it in the first place?
And by the way, it's totally possible. Like,
you know, Instagram at the very beginning, I remember when those guys first started,
and I was one of the early users, because we actually used, what was it called? Bourbon
was the first predecessor to Instagram. And, you know, it used to be just about friends keeping up
with each other's lives. And it had some of the addictive qualities and, you know, it had some
of the infinite scroll and all that stuff. But it didn't have this focus on celebrities and girls who basically competed on who would wear, you know, less clothing and then be most recommended in the Discover tab, you know, to get maximum audience.
And then kids basically realizing they could make money and selling their Instagram page for the million followers to brands and then everyone wanting to compete and being a bigger influencer.
Like all that culture of, we're all addicted to being influencers and addicted to getting attention. That is an externality of cultural values that are not real, but that actually came from Instagram going down this over-extractive, kind of growth-oriented path that they needed to, not because, you know, they were evil people or anything like that, but because of the business model. Once they were acquired by Facebook, they had to keep growing.
They had to get a bigger and bigger attentional footprint.
And what you really want is you want it back to the early days.
I mean, let's take it back to the Instagram guys and just following 10 friends and seeing where they are around the world and keeping in touch with our friends.
That's great.
You know, and there's people who use Instagram that way now, and that's also awesome.
But we also have to account for the fact that the interface is not tuned towards keeping it just for
that use case. Like Instagram could be, if it was truly humane, just trying to, you know, help us
pick those 10 friends that we really want to keep in touch with, as opposed to let's maximize
discovery and influencers and, you know, millions of followers and get lots of people looking at stuff, that's
an incentive of a for-profit public company that now has to run that incentive.
And the same thing was true with Facebook, by the way.
If you go back to early interviews with Zuckerberg in 2005 at Stanford, he gives a speech at
Stanford with Jim Breyer at the Entrepreneurial Thought Leaders Seminar.
And he said, well, what is Facebook?
And he said, it's like an address book.
It's like a utility for your social life.
It's a social utility is what he called it.
And that was closer to a model where it's more of a tool,
back to what you were saying about technology being just a tool.
I'm all for that.
I mean, technology being an empowerment tool.
And I think there's beautiful things that can come from these things
when they're operating as tools.
But the business model of advertising and engagement is the anti-tool. It does not want
to be a tool. It wants something from you. And that's where we have to draw the line and decouple business success from not being a tool. It might sound aggressive. I mean, this is the thing that's hard for people to hear. No, I get it. It's tough. I mean, we're talking about highly entrenched systems with extreme financial rewards associated with the problems that are manifesting and compounding, right?
And it's a very thorny problem. It's just like climate change, though, right? Because it's like, you know, we're all addicted to the growth, but growth towards what? Growth towards our own self-terminating, you know, catastrophe. It's like, well, we can't get off of oil, because that's the only way we're going to get the thing we need. And it's like, yeah, but the alternative is that we have a self-terminating endpoint. So we have to recognize
that, you know, it's like Paul Hawken,
if you know him and his work on Drawdown, it's like the top 100 ways to address climate change.
And people tell him, oh, but solving climate change is so expensive, it's going to cost us so much money. And he says, no, it's actually the opposite way around.
If we don't do it, it's going to cost us way more. We have to make the transition
towards something renewable because it's actually going
to be completely self-terminating if we don't. Because the information ecology, the thing that
fuels how we make sense of the world in our democracy, like democracy only out-competes
the Chinese authoritarian model if we have really good bottom-up information sources,
like diverse, rich ideas, marketplace-type things.
And this business model of engagement, the race to the bottom of the brainstem towards the salacious,
the outrageous, the hateful speech, the extremism stuff, the pedophilia, is not fueling our democracy with the best sources. It's like, we talk about personal life optimization and keto diets and nutrients, but we're feeding ourselves the opposite of a democracy keto diet. Right. And we have to flip this around. And it's not a matter of
this being my opinion or something like that, or this being, you know, just being a motivated
activist. This is like, I'm actually concerned about this because if we don't, the alternative
is a thousand or billion times worse. Yeah, for sure. It reminds me of a quote by, and I never know how
to pronounce this guy's name, but Chuck Palahniuk, I think I'm getting it right. And the partial quote is: Big Brother isn't watching, he's singing and dancing, he's pulling rabbits out of a hat, Big Brother's busy holding your attention every moment you're awake, he's making sure you're always distracted, he's making sure you're fully absorbed. And it just goes on to say that by doing so, you're no threat.
And I don't want to turn this into some, you know, vive la resistance type of...
Well, I'll tell you, I mean, this represents the... Have you ever read Amusing Ourselves
to Death by Neil Postman? I have not, but I have heard of it.
There's this quote.
I'm going to pull it up because it's just worth reading really quick.
We were all keeping an eye out for 1984, and we thought the dystopia that we would get was the Big Brother one.
But alongside Orwell's dark vision,
there was this other slightly older and less well-known,
but equally chilling vision of Huxley's Brave New World. It's Aldous Huxley. And he summarizes it this way. It says beautifully,
it says, what Orwell feared were those who would ban books. What Huxley feared is that there would
be no reason to ban a book because there would be no one who wanted to read one. Orwell feared
those who would deprive us of information. Huxley feared
those who would give us so much that we would be reduced to passivity and egoism. Orwell feared
that the truth would be concealed from us. Huxley feared the truth would be drowned in a sea of
irrelevance. Orwell feared we would become a captive culture. Huxley feared we would become
a trivial culture, preoccupied with some equivalent of the feelies, the orgy-porgy, and the centrifugal bumble puppy. This is the 1930s.
As Huxley remarked in Brave New World, the civil libertarians who were ever on the alert to oppose
tyranny, they failed to take into account man's almost infinite appetite for distractions.
He ends by saying, Orwell feared that what we fear will ruin us. Huxley feared that what we desire will ruin us. And that's essentially the premise
of the work. It's like there's two ways to kind of fail here. As in most systems, there's almost always two ways to fail. One way to fail is the authoritarian Big Brother censorship sort of mode, with so little information that we don't have any, and we're all restricted and top-down controlled, et cetera. But then the bottom-up way to fail is just being overwhelmed in irrelevance, in distraction, in overstimulating our magic-trick sort of brain with
paleolithic social validation and tribal warfare and moral outrage and all that stuff that isn't
actually adding up to anything. And human agency, which is unique to the world, like choice,
is that thing that's sitting in between those two worlds.
You know, informed, effective choice, good choice.
And that's where we, right now, as a human civilization, have got to be,
because those other two models are really bad
and self-terminating in some cases.
If we cannot, I mean, my biggest fear about these issues is we have to be able to agree on a common reality, a common truth. Because that's the only way. If we don't agree on what's real, or if we don't believe there is truth, then we can't construct shared agendas to solve problems like inequality or climate change or whatever. Like, we have real problems, and we have to find ways that we actually can see those agreements and then construct actions together to change it.
And I think that right now technology is kind of taking us away from that.
But the reason that we work on these topics is
I want to live in a world where technology is giving us the superpowers to do that,
like superpowers for common ground,
superpowers for constructing shared agendas,
superpowers for instead of getting learned helplessness by seeing climate change news pounded into our
nervous systems, dosed to two billion people a day, to instead have mass empowerment, like
mass coordinated action that we can all take and feel optimistic about all the progress
we're making and all the things we can do next.
So that's kind of the project here. It's like, we are trapped in this one paleolithic meat suit that's got these kinds of bends and contortions that bend reality in a way that can be hacked, and we can also use those bends and contortions in a way that gives us the most empowerment. And if we ever needed those superpowers, it's right now.
this is a perfect segue
I have
a question for you that is
personalized and I'm going to start by
finishing the quote that I ended up only reading partially. It's a Big Brother. This is from Chuck
Palahniuk. Big Brother isn't watching. This is very close to what you were just saying with Huxley.
He's singing and dancing. He's pulling rabbits out of a hat. Big Brother's busy holding your
attention every moment you're awake. He's making sure you're always distracted. He's making sure
you're fully absorbed. He's making sure your imagination withers
until it's as useful as your appendix. And that would be a problem, both on an individual level and certainly on a collective level. And there's a quote of yours that was in, I guess, a TEDx Brussels presentation, if I'm getting the location right: I spend a lot of my time thinking about how to spend my time. And I'd love for you to talk about what you do on a personal level, whether it's to firewall your attention or to sort of mitigate some of the damage slash distraction that every economic force seems to want to impose on you. And so there's, you know, on one hand there's defeating Skynet, and then there's the day-to-day life of John Connor, right? So if you're John Connor, what are some of the things that you do on a day-to-day or week-to-week basis to defend against some of these, you know, some of these forces, some of these
some of these technologies?
Yeah. It's funny, when you mentioned this sort of John Connor thing, and both living a personal life and defeating Skynet, I just realized, in you saying that, that my life is basically both of those things right now. Like, every single day of my life is how do we defeat Skynet, whether it's on Capitol Hill, having just come back from that last week, or, you know, the personal level of just being effective, so I'm well rested so I can do that. You know, it's really hard. And part of why
I worked on these topics for so long, I mean, that first TED Talk in 2014 was about time well spent,
and about the power of persuasive technology to make us distracted, which is kind of how this
all started, was I found myself so easily distracted. Like, I hated seeing this happen over and over again. You get one of these
emails saying you've been tagged in a photo or someone commented and mentioned you in a comment.
This is appealing to really deep instincts. You're the protagonist of the show called your life.
And when someone tags you in a photo, it's like, oh, something about me? Social approvals online.
What did they say? Is it good? Is it bad? I have to see right now. And it's really powerful stuff. And the reason that I work on this is because I actually feel
more sensitive than other people. I feel certainly very sensitive to these forces. So it's why I
think it's so important to protect ourselves and protect against them. And I think of it like we have to build these exoskeletons for our paleolithic brains. I mean, the military kind of takes this stuff seriously, right? With, you know, in military combat and the kind of flight instrumentation you see in a military aircraft or something like
that. It's all about managing attention, like with crazy levels of discipline and science and
research about how do you build that exoskeleton that gives us that level of focus, thinking through the right questions and not the wrong questions, and being well rested and, you know, being able to stay up for many hours focused on one single task and all that.
So now to concretely answer your question, what are the things that we do?
You know, I, first of all, like I said, I struggle. It's hard, especially now because defeating Skynet comes with
a lot of email and communications. And it's like being part of and running a social movement for
how to fix these things. I mean, we have a nonprofit for those who don't know called the
Center for Humane Technology where we focus on this and we get emailed by every major world
government and people who are dying to fix these problems. And we're trying to be of assistance in catalyzing that change.
So it comes with a ton of work and social obligation to get back to people.
But some things that I've found have been helpful.
I mean, one thing I've been doing since I was in college is,
since we're mentioning these tips like the grayscale tip,
which just to make sure your audience knows what that is.
The idea there is when your phone has colorful rewards, it's invisibly addictive. It's like
showing the chimpanzee part of your brain a banana every single time you look at the color of the
icons and all that stuff. And so one thing that you can do is you can go into, and I think that
you're probably going to list this in the comments, but if you go to the Settings app on your iPhone, then General, then Accessibility, and scroll to the bottom, there's this thing that lets you set a triple-click to turn your phone grayscale.
And so you say, why would I set my phone grayscale?
Well, it just strips out those color rewards.
So now when you look at your phone,
it doesn't have that,
just a little bit less luster
and psychological animation of your nervous system. And we help popularize that. And it's
mostly also for the social effect. Because when you do that, people say, oh, your phone's grayscale.
Why is that happening? And it lets you tell the story about why you would do this and the
attention economy. And if you heard about time well spent, that's kind of why we did that.
Quick, quick addendum on that. So the triple-click can toggle grayscale on and back to color. And I'll put that in the show notes so people can find it at tim.blog/podcast.
Another benefit of that, which is one way to sell it, or an additional way to sell it, is that it increases battery life also.
Yes.
Quite substantially increases battery life.
And it makes it harder to find your icons,
which some might view as a bug,
but it's a feature if you're trying to use social media less.
Yeah, well, so that speaks to a secondary thing
that I've recommended for a while,
which is, if you think of it, your phone is, you know, like a filter, or rather it's unfiltered. So it accepts both unconscious, mindless uses of it and conscious, mindful uses of it. And it can't tell the difference between when you're a zombie, reaching for it out of anxiety just to check again the thing you already checked 10 seconds ago, and when you're actually saying, no, no, no, I really need to find directions to that
party I'm going to, I need to find those directions right now. It can't tell.
You don't want to put up these arbitrary speed bumps or roadblocks between you and what you want generically because then you can't distinguish between those two uses.
Another thing you can do is basically take all of your apps off your home screen, almost all of them except for what we call the tools. Tools are your quick in-and-out utilities: things like Calendar, things like Lyft or Uber, things like Messages, that just let you quickly do something and then you're done. So those are fine to have on your home screen. But if you move everything else off the home screen, and instead train yourself to pull down from the top on an iPhone
and type, like, I want to launch Mail,
or I want to launch Instagram, or I want to launch Twitter.
Because if you type, you have to be making more of a conscious choice.
I like that. That's great.
So think of it as like you're putting a bandpass filter
between you and your phone that's only accepting conscious uses
and rejecting mindless uses. So that's like another thing you can do. Another thing that I do, if you want to be really
militant about it, is if you think about one of the problems with the way that phones vibrate,
it's gotten so bad that we now experience this thing called phantom vibrations, where
we believe that our phone has vibrated even when it hasn't. And we're stimulated so often that we're just constantly reaching into our pockets just to feel if it actually did vibrate, and we check it again just in case.
And it's just a mess.
And one of the things
that would help alleviate this
is if you have a custom vibration signature
for different kinds of notifications.
So for example,
when I get a message through iMessage
from a contact,
it actually buzzes three times in quick succession,
like really fast. And I can tell, therefore, when I'm getting a message from someone
versus when I'm getting a calendar notification, like you're 10 minutes late for Tim's interview.
And that is a helpful thing, because if you think about it, your phone is like a slot machine. It's
buzzing in the same ambiguous way every time, which forces you to say, oh, I wonder
if that could be that thing I was looking for.
And then that's the excuse to get sucked in and then you get sucked into the rest of the
thing.
So in general, you want to minimize, you know, your needing to even check the thing in the first place.
And that's what that helps do.
And so you can do that by going to your notifications. And unfortunately, Apple doesn't let you split up all of your
major categories of notifications. I mean, this is why when we push on technology companies,
and this is one way Apple could be like a better government, a better central bank,
is if they enable in the next version of the phone, a thing that showed you basically,
here are the top three kinds of notifications that you're getting. Like here's like a continent map of the major five
categories of notifications. Do you want to set up a unique buzz signature for each of these five
to distinguish them? Yeah. Or disable them. Or disable them. Right. Exactly. So I mean,
but both. And the whole point is, this is like the environmental movement, right? This is the thing we're trying to catalyze: if everybody treated human attention as something sacred, we would be trying to minimize our footprint on it, as opposed to maximizing how much we manipulate, take, extract, scoop out of your nervous system. That's the fundamental change. If everything was treating your attention as something sacred, then we want to move and change the minimal number of pixels on your
screen, we want the minimal number of vibrations to ever occur, we want the minimal number of
psychological anxiety concerns. I mean, this is another category people will talk about,
is even when you're not looking at the phone, the anxiety loops of concerns that are looping
in your mind, as a result of the 10 minutes ago when
you were using your phone. Like, oh, did that person get back to me? Oh, I wonder if I got
new likes on that thing. Oh, I wonder if I'm going to get the address for that event, you know,
if they sent that yet. There's ways in which the phones could silence those concerns by, for
example, letting us set up a, like, let's say when you go on Do Not Disturb for two hours,
it said, is there any,
it gave you the option to say, is there anyone who, if you heard from them in the next two to
three hours, you would want it to make a special noise for, uh, and you could mark that out. And
that way you could now, you can now not use the phone and have complete like a separation from it
because you have the certainty that you won't miss something important because that fear that we can
miss something important is a really powerful. so that even when you go on do not
disturb or airplane mode, people still go back to their phones and they check. So I think people
just don't really realize the extent to which their deeper-level nervous system and habits for reaching for this thing have been hijacked. And this is about un-hijacking your whole nervous system, not just the way that the phone works, but alleviating and releasing your whole nervous system from its deep connection to these expectations.

Yeah, totally. And it's the effect on the nervous system, right? The actual biological cost is something that is hard to fully take stock
of until it's removed. And it's huge. And I at least once every six months try to go a few weeks
without any use of social media. And I find it useful. I find it fun. I enjoy connecting with people through Twitter and polls, and there are some fantastic uses of social media. And I enjoy looking at pretty pictures on Instagram of cabins that I'm sure I will never visit and things like this. But there is a neurobiological cost. One thing that I do that people might also consider is, if you feel like you absolutely can't survive without social media, you can use Edgar or one of these other scheduling tools to queue posts for several weeks. So I'll batch my taking of photographs or whatever it might be, have those scheduled out for a few weeks, and I give myself then a vacation from any type of active monitoring or responding to social media. And the feeling at the tail end of that, let's call it a week or two weeks, most pronounced after a week, is, and this is going to sound maybe ridiculous, not that dissimilar from a... it sounds unbelievable until you actually try it.

Totally. I mean, I think what you're speaking to in general is something that
we would call a humane technology design pattern. There are going to be moments when we think of a thing we need to do, and the inability to do it at that moment leads us to have to open up Twitter and write that thing, or send that email to ourselves, or whatever. And if we can't do it at that moment, we have to leave it on our nervous system as a looping concern. So now for the rest of your day, until you get to a computer, it's just looping and you're like, don't forget this thing, don't forget this thing.
And there's a way in which
if technology were truly respecting,
you know, the fact that we're better off
offloading these things into somewhere else
where it's not taxing our nervous system,
it could be a universal design pattern
that you could enter something you want to do
and schedule when you want it to happen
and not do it immediately. Whether it's sending an email to someone or
sending a text message when you're... Think about the way the iMessage thing works on an iPhone: you send a message to someone while you're on airplane mode, but it won't just say, oh, I'll send this when you get back. It just won't send it, and it forces it to be on you to go back and send it again. And imagine if it said, hey, when do you want this thing to send, and that was baked into the way iMessage works, right?
Or baked into Slack.
Like Gmail offline, right?
It would automatically send when you're connected
as opposed to forcing you to go in,
click on this exclamation mark,
and confirm that you want to resend it.
It's like, yeah, I do want to resend it
because clearly it didn't get delivered the first time.
This should be pretty easy to logically deduce. And you have to have the certainty that it's going to work.
Because if you don't have the certainty, even if it does work 90% of the time, it's going to generate that extra layer of an anxious timeline. Just imagine this anxious timeline plopped down into your nervous system, so that for the next two hours there's this extra 3% by which your nervous system is taxed by the fact that you're not sure whether the thing sent. You know Gmail is supposed to send it because you were in offline mode and they promised that they will, but you don't have that certainty, and we have to have that kind of confidence. I think this is actually one of the simplest things that technology could do, because there's a lot of uncertainty: stuff just doesn't work consistently. A lot of the stress and the background radiation of anxiety would go down if we just had more consistency in the way that we believed these things would work, as opposed to the ways that they are periodically broken.
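For developers, here is a minimal sketch of the "offline outbox" pattern being described: queued messages flush automatically once connectivity returns, using Apple's Network framework to watch the connection. The QueuedMessage type and the send closure are hypothetical stand-ins, not any real messaging API.

```swift
import Foundation
import Network

// Illustrative "offline outbox" sketch: hold outgoing messages while the device
// is offline and flush them automatically when the network comes back, so the
// user never has to remember to resend. QueuedMessage and the `send` closure
// are hypothetical stand-ins for a real messaging layer.
struct QueuedMessage {
    let recipient: String
    let body: String
}

final class Outbox {
    private var pending: [QueuedMessage] = []
    private let monitor = NWPathMonitor()
    private let send: (QueuedMessage) -> Void

    init(send: @escaping (QueuedMessage) -> Void) {
        self.send = send
        // Flush whatever is queued as soon as connectivity is restored.
        monitor.pathUpdateHandler = { [weak self] path in
            if path.status == .satisfied { self?.flush() }
        }
        monitor.start(queue: DispatchQueue(label: "outbox"))
    }

    func enqueue(_ message: QueuedMessage) {
        pending.append(message)
        flush() // sends immediately if we happen to be online already
    }

    private func flush() {
        guard monitor.currentPath.status == .satisfied else { return }
        while !pending.isEmpty {
            send(pending.removeFirst())
        }
    }
}
```

The particular implementation matters less than the default it encodes: the software, rather than the person's nervous system, carries the "remember to resend this" concern.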
I mean, another one I wanted to mention that I do in terms of creating a fortress or firewall of attention, and I actually haven't talked about this one before: if you turn on the Zoom feature in the accessibility settings on a Mac, I don't know if you ever use this, you can zoom in to a certain part of the screen. I do it where you hold down the Control key
and then you just use two fingers
to sort of zoom in and zoom out.
But what I do is when I'm trying to write, for example,
I easily get distracted by any other pixels
that happen to pop into the screen.
Like it really affects me.
I'm hypersensitive.
And so when I'm doing any writing,
I'll just zoom into that text field
so it actually occupies the full 15 inches of my
MacBook Pro screen, and it helps me really focus. Using things like that, just imagine that you're literally trying to conserve the number of pixels that change in an unexpected way, because that change will hijack you and make it easier to forget or otherwise detour you from something that you're doing. And all of this, again, is currently on us to do, right? This is this extra cost that we all have to pay, to know these tricks and listen to these podcasts and fiddle with these settings a hundred times. But the whole premise of this kind of work is, imagine a humane and regenerative world where
this is how it works by default, where everything is trying to minimize its footprint on our
attention. And all the defaults are set to make it as seamless as possible
and to do it the way that you would want it to work
and to not make you second-guess and think, oh, maybe it didn't send, I've got to send that again. Just that certainty that I can actually have peace of mind, that I can actually go on Do Not Disturb for a day, because I know that an out-of-office message, or an "I'm not going to respond to email for two days" notice, is built into the native functioning of how email works on every email app or messaging app, right? We don't get that chance to do it.
WhatsApp doesn't have a mode that says, hey, I want to go on vacation for a week, and this is
the message I want to send to the people that are in this class of contacts. That could be baked
into the way that messaging works, the ability to disconnect without missing something important.
And that's the premise of what has to happen: a deeper redesign that treats human attention as sacred, and that treats our cognition as something we need to conserve for the areas where we most need it, the big decisions we have to make in our lives. That's what I'd love to see.
Yeah, me too. And I suppose a part of that is people developing the awareness of the value
of their attention so that they are perhaps willing to pay for things
that preserve that attention and treat it as sacred by design.
Exactly.
Attention is a scarce resource. I mean, it is certainly a limited resource. I know we only have perhaps a handful of minutes left, and I'd love to ask you, as someone who I would imagine has read quite a few books in your day, and you've mentioned a few: you mentioned Metaphors We Live By, you mentioned Amusing Ourselves to Death. Are there any particular books that you have gifted often to other people or tend to recommend most often? Do any come to mind?
It's a great question. You know, Neil Postman in general, as a media thinker,
about some of the topics we've discussed today, is just excellent. I mean, he foresaw so many of these problems in his book Amusing Ourselves to Death. Another one by him is called Technopoly, which is about what happens when culture surrenders to technology, and especially to the quantification of everything: metrics, SAT scores, time spent, GDP, these kinds of things. He covers that in that book. Highly recommend.
There's another book called Finite and Infinite Games.
Do you know this one?
Yes, I do, by Carse.
Yes, James Carse, the religious studies professor. Did you interview him on your podcast?
No, I haven't.
I would certainly be open to it.
It's a fascinating, fascinating book.
Yeah, yeah.
And that's just a general philosophy one about life and how to, I don't know, navigate in a more improvisational way and ask like, what game am I really playing in an interaction? Am I playing for a finite game outcome to win the game? Or am I
playing to keep playing, which has a lot of overlaps with improvisation and things like that?
Yeah, that's a fantastic book. People can get a very good taste of it by going to Goodreads and looking at the highlighted portions for Finite and Infinite Games. It's also highly recommended by Stewart Brand and a lot of other folks I respect a whole lot.
I'm sorry.
I was going to recommend one more, if you're into podcasts: someone who I've learned a lot from in terms of the civilization-level dynamics around infinite growth games operating on a finite playing field, and the fundamental problems of capitalism, is Daniel Schmachtenberger. There are Future Thinkers podcast episodes with him, and his thinking has been hugely informative to my own. So I recommend that for listeners.

Thank you for taking the time. These are important topics. They're timely, but only going to become more relevant
and more important. Is there anything else you would like to say? Anything else you would like
to point people to? Suggestions you'd like to make? Anything at all that you'd like to
share as closing comments before we wrap up?
Well, no, first, just thank you for having me.
I've enjoyed it as well.
It's nice to finally connect.
I think we've had many friends
who've been trying to connect us for a while.
And I think if you're interested in how we reform
the attention economy and how technology's been working,
I just recommend people check out our work at the Center for Humane Technology. You can find me on Twitter
at @tristanharris, or at the Center for Humane Technology website, humanetech.com. But, you know,
this is going to take a village to make these changes. And I think it might seem really hard.
But what I would encourage people to do is recognize that our Paleolithic brains are not built for this. If you ask, are our Paleolithic instincts designed to look at a massive problem like climate change and just say, great, let's get to work? Or are they more designed to look at a huge problem like that and say, oh my God, I have no idea what to do, let me put my head in the sand?
And it's definitely the latter. And I think the thing we have to recognize is that when you see big problems, notice the way our instincts would bias us to put our heads in the sand, and ask instead, well, what if there's no one else who's going to solve these problems but us?
Because my last big lesson that I'll share with people, because I've had a crazy couple
of years. I've been in the rooms with heads of state, the highest rooms possible considering these problems; there are no higher rooms. And I used to think that there was this magic room of adults somewhere who were actually thinking about all these problems and had it all figured out: don't you worry, Tim, they'd pat you on the head, we've got this one, son, we really have this one figured out. And my lesson this year is that no such room exists around some of these big problems.
Like with climate change, there isn't some master plan that everyone's working on.
And with this one, there isn't some just group of people at Facebook who are like,
that's nice, Tristan, but we're going to fix this whole thing. It really is this emerging thing that I think people need to get used to: each of us who can, especially those of us who have the bandwidth, taking responsibility for the world that we live in and asking, what can we do? Because it was frightening and terrifying to realize at first that there
wasn't a bunch of other adults, or at least not
that many adults in these rooms who knew the answers to these questions, and that suddenly
I was one of them. And then the second part is realizing, wow, okay, here we go. What can we
now do to navigate? What levers can we pull? And I think if everybody saw that they really were an
active agent in the system and not just a passive participant, we'd get there a lot faster.
So I really encourage people to do that.
We are all John Connor.
We are all John Connor.
That's a great episode title.
Well, this is a very important message
and I look forward to hopefully spending some time together in person. Perhaps we can rope in Eric and some others.
Yeah, let's do that. Let's do that. I miss Eric. This has been a lot of fun for me and very, very enlightening, very insightful.
And I have a whole sheet of notes that I've taken on things that I want to follow up on.
I will link for everyone listening to all of the social links, humanetech.com, and so on in the show notes. Also, all the books we've mentioned and everything else will be linked at tim.blog. If you just search for Tristan or Harris, although then Sam will pop up a couple of times as well.
So, I'll have to parse that.
And until next time.
Yeah.
Thank you so much, Tristan.
Thank you so much for having me, Tim.
And to everybody out there, thank you so much for listening.
Hey, guys.
This is Tim again,
just a few more things before you take off. Number one, this is 5-Bullet Friday. Would you enjoy getting a short email from me every Friday that provides a little morsel of fun before the weekend? 5-Bullet Friday is a very short email where I share the coolest things I've found or that I've been pondering over the week.
That could include favorite new albums that I've discovered.
It could include gizmos and gadgets and all sorts of weird shit that I've somehow dug up in the world of the esoteric as I do.
It could include favorite articles that I've read and that I've shared with my close friends, for instance.
And it's very short. It's just a little tiny bite of goodness before you head off for the weekend.
So if you want to receive that, check it out. Just go to 4hourworkweek.com. That's 4hourworkweek.com
all spelled out and just drop in your email and you will get the very next one. And if you sign
up, I hope you enjoy it.
This episode is brought to you by LinkedIn Jobs. Hiring can be hard, really hard, and it can also be super, super expensive and painful if you get it wrong. I certainly have had that experience
firsthand multiple times, and I'm not eager to repeat it. So I try to do as much vetting as
possible on the front end. And today with more
qualified candidates than ever, you need a solution. You need a platform that helps you to
find the right people for your business. LinkedIn Jobs does exactly that. More than 600 million
users visit LinkedIn to learn, make connections, grow as professionals, and more than ever,
discover new job opportunities. In fact, overall LinkedIn members
add 15 new skills to their profiles and apply to 35 job posts every two seconds. That's a crazy
stat. LinkedIn does the legwork to match you to your most qualified candidates so that you can
focus on the hiring process, getting the person into your company who will transform your business.
They make sure your
job post gets in front of the people with the right hard skills and soft skills to meet your
requirements. They've made it as easy as possible. So check it out. To get $50 off of your first job
post, go to linkedin.com slash Tim. Again, that's linkedin.com slash Tim to get $50 off of your first job post.
Terms and conditions apply.
Check it out.
LinkedIn.com slash Tim.
This episode is brought to you by MeUndies.
MeUndies makes the softest undies known to man.
That's what the copy says.
And they are soft.
They're really soft.
Whether you like crazy prints or opt for classic black,
MeUndies gives you the freedom to express yourself comfortably. I wonder what expressing yourself, within legal bounds of course, means in this case, but I do like to express myself. I'm wearing some tie-dye
MeUndies right now as I record this. In the room next to me, I've got some pizza and video game
prints. Those are not on the same pair of underwear, but two separate ones.
And I'll be packing for a trip, and I have a nice stack of MeUndies going with me. Why? Because
they're comfortable. Very, very comfortable. MeUndies has plenty of options for those looking
to up their undie game. You can join the monthly membership. You can do Build a Pack, that is, building a 3, 6, or 10 pack of your favorite undies or socks and saving up to 30%. You can select a matching pair to match with your better half, or just pick out one pair that
strikes your fancy. And there's some pretty ridiculous ones that I specialize in personally.
MeUndies are made with soft, sustainable fabric and available in sizes from extra small to 4XL.
Fun new prints drop every Tuesday and members get access to
exclusive prints every month. MeUndies has a great offer for listeners of this podcast. For any first
time purchase, you get 15% off and free shipping. They also give you a 100% satisfaction guarantee
and I like to be satisfied with my underwear. To get your 15% off your first order,
free shipping, and a 100% satisfaction guarantee, go to meundies.com slash Tim.
That's meundies.com slash Tim.