The Joe Rogan Experience - #1558 - Tristan Harris
Episode Date: October 30, 2020
Called the "closest thing Silicon Valley has to a conscience" by The Atlantic magazine, Tristan Harris spent three years as a Google Design Ethicist developing a framework for how technology should "ethically" steer the thoughts and actions of billions of people from screens. He is now co-founder & president of the Center for Humane Technology, whose mission is to reverse 'human downgrading' and re-align technology with humanity. Additionally, he is co-host of the Center for Humane Technology's Your Undivided Attention podcast with co-founder Aza Raskin.
Transcript
The Joe Rogan Experience. Train by day, Joe Rogan podcast by night, all day.
Tristan, how are you?
Good, good to be here.
Good to have you here, man. You were just telling me before we went on air the numbers on The Social Dilemma, and they're bonkers.
Yeah. The Social Dilemma was seen by 38 million households in the first 28 days on Netflix, which I think has broken records. And if you assume a lot of people are seeing it with their family, because parents are seeing it with their kids, given the issues that are around teen mental health, so if you assume one out of ten families saw it with a few family members, we're in the 40 to 50 million people range, which is just broken records, I think, for Netflix.
I think it was the second most popular documentary, or film, throughout the month of September.
It is a really well done documentary.
But I think it's one of those documentaries that affirmed a lot of people's worst suspicions
about the dangers of social media.
And then on top of that,
it sort of alerted them
to what they were already experiencing
in their own personal life
and like highlighted it.
Yeah, I think that's right.
I mean, most people were aware.
I think it's a thing everyone's been feeling: the feeling you have when you use social media isn't that this thing is just a tool, or that it's on my side. It is an environment based on manipulation, as we say in the film. And that's really what's changed. I've been working on these issues for something like eight years now.
Please tell people who didn't see the documentary what your background is and how you got into it.
Yeah, so the film features a set of technology insiders.
My background was as a design ethicist at Google.
So I first had a startup company that we sold to Google.
I landed there through a talent acquisition, and then, about a year into being at Google, made a presentation that was about how essentially technology was holding the human collective psyche in its hands,
that we were really controlling the world psychology. Because every single time people
look at their phone, they are basically experiencing thoughts and scrolling through
feeds and believing things about the world. This has become the primary meaning-making machine for the world.
And that we as Google had a moral responsibility to, you know, hold the collective psyche in a thoughtful, ethical way and not create this sort of race to the bottom of the brainstem attention economy that we now have.
So my background was as a kid, I was a magician.
We can get into that. I studied in a class at Stanford called the Stanford Persuasive Technology class that taught a lot of the engineers in Silicon Valley kind of how the mind works; the co-founders of Instagram were there. And then I later studied behavioral economics and how the mind is sort of influenced.
I went into cults and started studying how cults work,
and then arrived at Google through this lens of, you know,
technology isn't really just this thing that's in our hands.
It's more like this manipulative environment that is tapping into our weaknesses,
everything from the slot machine rewards to, you know, the way you get tagged in a photo,
and it sort of manipulates your social validation and approval, these kinds of things.
When you were at Google, did they still have the don't be evil sign up?
I don't know if there's actually a physical sign.
There was never a physical sign? I thought there was something that they actually had.
I think it was, there was this guy, was it Paul? Not Paul. What was his last name?
He was the inventor, one of the inventors of Gmail and they had a meeting and they came up with this mantra
because they realized the power that they had, and they realized that
there was going to be a conflict of interest between advertising on the search results and
regular search results. They knew that they could abuse that power, and they
came up with this mantra, I think in that meeting in the early days: don't be evil.
There was a time where they took that mantra down, and I remember reading about it online.
They took it off their page, I think.
That's what it was.
Yeah.
And when I read that, I was like, that should be big news.
Like, there's no reason to take that down.
Why would you take that down?
Yeah.
Why would you say, well, maybe you can be a little evil.
Let's not get crazy.
It's a good question.
I mean, I wonder what logic would have you remove a statement like that.
That seems like a standard statement.
It's a great statement.
Okay, here it is.
Google removes don't be evil clause from its code of conduct.
In 2018?
Yeah.
Yeah.
I wonder why.
Did they have an explanation?
Did it say anything?
Anything?
Don't be evil has been a part of the company's corporate code of conduct since 2000.
When Google was reorganized under a new parent company, Alphabet, in 2015,
Alphabet assumed a slightly adjusted version of the motto:
Do the right thing.
Do the right thing.
Oh, that's a Spike Lee movie, bitch.
However, Google retained its original don't be evil language until the past several weeks.
The phrase has been deeply incorporated into Google's company culture,
so much so that a version of the phrase has served as the Wi-Fi password on the shuttles
that Google uses to ferry its employees to its Mountain View headquarters.
I think I remember that, yeah.
Wow.
You get on the bus and you type in don't be evil.
I wonder why they decided...
Well, I mean, they did change it to "do the right thing." We always used to say that, just to friends, not within Google: instead of saying "don't be evil," just say "let's do some good here."
Right, that's nice. "Let's do some good here." Think positive. Think doing good instead of don't do bad.
Yeah, but the problem is, when you say "do good," the question is whose good? Because you live in a morally plural society, and there's this question of who are you to say what's good for people. It's much easier to say "let's reduce harms" than it is to say "let's actually do good."
Like this. It says the updated version of Google's code of conduct still retains one reference to the company's unofficial motto. The final line of the document is still: "And remember... don't be evil, and if you see something that you think isn't right, speak up."
Okay, well, they still have "don't be evil," though, so maybe it's much ado about nothing. But having that kind of power... Just before the podcast, we were watching Jack Dorsey speak to members of the Senate in regards to Twitter censoring the Hunter Biden story and censorship of conservatives, while allowing dictators from other countries to spread propaganda, and why, and what this is all about.
One of the things that Jack Dorsey has been pretty adamant about is that they really
never saw this coming when they started Twitter. And they didn't think that they were ever going
to be in this position where they were going to be really the arbiters of free speech for the world,
which is essentially in some ways what they are. I think it's important to roll back the clock for
people because it's easy to think, you know,
that we just sort of landed here
and that they would know
that they're going to be influencing
the global psychology.
But I think we should really reverse engineer
for the audience.
How did these products work the way that they did?
So like, let's go back to the beginning days of Twitter.
I think his first tweet was something like
checking out the buffaloes in Golden Gate Park
in San Francisco.
You know, Jack was fascinated by the taxi cab dispatch system that you could send a message
and then all the taxis get it. And the idea is could we create a dispatch system so that I post
a tweet and then suddenly all these other people can see it. And the real genius of these things
was that they weren't just offering this thing you could do.
They found ways of keeping people engaged.
I think this is important for people to get, that they're not competing for your data or for, you know, money.
They're competing to keep people using the product.
And so when Twitter, for example, invented this persuasive feature of the number of followers that you have,
if you remember, like, that was a new thing at the time, right? You log in and you see your profile.
Here's the people who you can follow. And then here's the number of followers you have. That
created a reason for you to come back every day to see how many followers do I have. So that was
part of this race to keep people engaged. As we talk about in the film, like these things are
competing for your attention, that if you're not paying for the product, you are the product,
but the thing that is the product
is your predictable behavior.
You're using the product in predictable ways.
And I remember a conversation I had
with someone at Facebook who was a friend of mine
who said in a coffee shop one day,
people think that we, Facebook,
are competing with something like
Twitter, that one social network is competing with another social network. But really, he said,
our biggest competitor is YouTube, because they're not competing for social networks,
they're competing for attention. And YouTube is the biggest competitor in the digital space for
attention. And that was a real light bulb moment for me, because you realize that as they're
designing these products, they're finding new clever ways to get your attention.
That's the real thing that I think is different in the film The Social Dilemma, rather than talking about censorship and data and privacy and these themes.
It's really what is the core influence or impact that the shape of these products have on how we're making meaning of the world when they're steering our psychology.
Do you think that it was inevitable that someone manipulates the way people use these
things to gather more attention? And do you think that any of this could have been avoided if there
was laws against that? If instead of having these algorithms that specifically target things that
you're interested in or things that you click
on, or things that are going to make you engage more, they just allowed these things to exist? If someone said, listen, you can have these things, you can allow people to communicate with each other, but you can't manipulate their attention span?
Yeah. I mean, we've always had an attention economy, right? And you're competing for it right now.
And politicians compete for it.
Can you vote for someone you've never paid attention to,
never heard about, never heard them say something outrageous?
No.
So there's always been an attention economy.
And so it's hard to say we should regulate who gets attention or how.
But it's organic in some ways.
Right.
Like this podcast is an organic... I mean, if we're in competition, it's organic. I just put it out there.
And you watch it or you don't.
I don't, you know, I don't have any say over it.
And I'm not manipulating it in any way.
Sort of.
So, I mean, let's imagine that the podcast apps were different.
And they actually, while you're watching, they had like the hearts and the stars and the kind of voting up in numbers.
And you could like send messages back and forth.
And Apple Podcasts worked in a way that didn't just reward, you know, the things that you clicked follow on.
It actually sort of promoted the stuff that someone said the most outrageous thing.
Then you as a podcast creator have an incentive to say the most outrageous thing.
And then you arrive at the top of the Apple Podcasts or Spotify app.
And that's the thing: we actually are competing for attention. It felt like it was neutral, and it was relatively neutral. To progress that story back in time with Twitter competing for attention, let's look at some other things that they did.
They also added the retweet, this instant resharing feature, right? And that made it more addictive, because suddenly we're all playing the fame lottery, right? Like, I could retweet your stuff, and then you get a bunch of hits, and then you could go viral, and you could get a lot of attention. So then instead of the companies competing for attention, now each of us suddenly wins the fame lottery over and over and over again, and we're getting attention. And then, oh, I had another example I was going to think about, and I forgot it. What was it?
You can jump in if you want. Apple has an interesting way of handling sort of the way they have their algorithm for their podcast app.
It's secret.
It's weird.
But one of the things it favors is new shows, and it favors engagement and new subscribers. So comments, engagement, and new shows.
There you go. And that's the same as competing for attention, because engagement must mean people like it. And there's going to be a fallacy as we go down that road, but go on.
Well, it's interesting, because you could say, if you have a podcast and your podcast gets, let's say, a hundred thousand downloads, a new podcast can come along and it can get 10,000 downloads, and it'll be ahead of you in the rankings.
And so you could be number three and it could be number two.
And you're like, well, how is that number two?
And it's got 10 times less, but they don't do it that way.
And their logic is they don't want the podcast world to be dominated by, you know, New York
Times.
The big ones.
Yeah.
And whatever, whatever's number one and number two and number three forever.
We actually just experienced this.
We have a podcast called Your Undivided Attention.
And since the film came out in that first month, we went from being, you know, in the
lower 100 or something like that to we shot to the top five.
I think we were the number one tech podcast for a while.
And so we just experienced this through the fact,
not that we had the most listeners,
but because the trend was so rapid
that we sort of jumped to the top.
I think it's wise that they do that
because eventually it evens out over time.
You know, you see some people rock it to the top,
like, oh my God, we're number three.
And you're like, hang on there, fella.
Just give it a couple of weeks. And three weeks later, four weeks later, now they're number 48, right? They get depressed.
Right. Well, that was really where you should have been. But the thing that Apple does that I really like in that is it gives an opportunity for these new shows to be seen, where they might have gotten just stuck, because the rankings and the ratings for a lot of these shows are so consistent, and they have such a following already.
Yeah, it's very difficult for these new shows to gather attention.
Right. And the problem was that there were some people that gamed the system, and there were companies that could literally move you up, like Earl Skakel. Remember Earl became the number one podcast,
and no one was listening to it?
Earl has money, and he hired some people to game the system,
and he was kind of open about it and laughing about it.
Now isn't he banned from iTunes now or something?
I think he got banned because of that,
because it was so obvious he
gamed the system. He had like a thousand downloads and he was number one. I mean, the thing is that
you can think of Apple Podcasts as like the Federal Reserve or the government of the
attention economy, because they're setting the rules by which you win, right? They could have
set the rules, as you said, to be, you know, who has the most listeners and then you just keep
rewarding the kings that already exist versus who is the most trending.
There's actually a story a friend of mine told me.
I don't know if it's true,
although it was a fairly credible source
who said he was in a meeting with Steve Jobs
when they were making the first podcast app
and that they had made a demo of something
where you could see all the things
your friends were listening to.
So just like making a news feed like we do with Facebook and Twitter, right?
And what he said was, well, why would we do that?
If something is important enough, your friend will actually just send you a link and say,
you should listen to this.
Like, why would we automatically just promote random things that your friends are listening to?
And again, this is kind of how
you get back to social media. How is social media so successful? Because it's so, it's much more
addictive to see what your friends are doing in a feed, but it doesn't reward what's true or what's
meaningful. And this is the thing that people need to get about social media: it's really just rewarding the things that tend to keep people coming back addictively. The business model is addiction, in this race to the bottom of the brainstem for attention.
Well, it seems like, in hindsight, if hindsight is 20/20, what should
have been done, or what could have been done had we known where this would all end up, is that they
could have said, you can't do that. You can't manipulate these algorithms to make sure that
people pay more attention and manipulate them to ensure
that people become deeply addicted to these platforms. What you can do is just let them openly communicate.
Right, but it has to be organic. And then the problem is, so this is the thing I was going to say about Twitter: when one company does, call it the engagement feed, meaning showing you the things that the most people are clicking on and retweeting, trending, things like that.
Let's imagine there's two feeds.
So there's the feed that's called the reverse chronological feed, meaning showing in order in time, you know, Joe Rogan posted this two hours ago.
But that's, you know, after that, you have the thing that people posted an hour and a half ago, all the way up to 10 seconds ago.
That's the reverse chronological.
They have a mode like that on Twitter.
If you click the sparkle icon, I don't know if you know this, it'll show you just in time, here's what people said, you know, sorted by recency.
But then they have this other feed called what people click on, retweet, et cetera, the most, the people you follow.
And it sorts it by what it thinks you'll click on and want the most.
Which one of those is more successful at getting your attention?
The sort of recency, what they posted recently,
versus what they know people are clicking on retweeting on the most.
Certainly what they know people are clicking on and retweeting the most.
Correct.
And so once Twitter does that,
let's say Facebook was sitting there with the recency feed,
like just showing you here's the people who posted in this time order sequence.
They have to also switch to who is like the most relevant stuff, right?
The most clicked, retweeted the most.
So this is part of this race for attention that once one actor does something like that
and they algorithmically, you know, figure out what people, what's most popular, the
other companies have to follow because otherwise they won't get the attention.
So it's the same thing if, you know, Netflix adds the autoplay 5, 4, 3, 2, 1 countdown to get people
to watch the next episode.
If that works at, say, increasing Netflix's watch time by 5%, YouTube sits there, says
we just shrunk how much time people were watching YouTube because now they're watching
more Netflix.
So we're going to add 5, 4, 3, 2, 1 autoplay countdown.
And it becomes, again, this game-theoretic race
of who's going to do more. Now, if you open up TikTok, TikTok doesn't even wait. I don't know
if you know, or your kids use TikTok, but when you open up the app, it doesn't even wait for you to
click on something. It just actually plays the first video the second you open it, which none
of the other apps do, right? And the point of that is that causes you to enter into this engagement
stream even faster. So, again, this race for attention produces things that are not good for society.
And even if you took the whack-a-mole stick, you took the antitrust case, and you whack
Facebook, and you got rid of Facebook, or you whack Google, or you whack YouTube, you're
just going to have more actors flooding in doing the same thing.
And one other example of this is the time it takes to reach, let's say, 10 million followers.
So if you remember back in the, wasn't it Ashton Kutcher who raced for the first million
followers?
Raced with CNN.
Raced with CNN, right?
Yeah.
So now if you think of it, the companies are competing for our attention.
If they find out that each of us becoming a celebrity and having a million people we
get to reach, if that's the currency of the thing that gets us to come back to get more attention,
then they're competing at who can give us that bigger fame lottery hit faster.
So let's say 2009 or 2010 when Ashton Kutcher did that,
it took him, I don't know how long it took, months for him to get a million?
I don't remember.
It was a little bit though, right?
And then TikTok comes along and says,
hey, we want to give kids the ability
to hit the fame lottery and make it big,
hit the jackpot even faster.
We want you to be able to go from zero
to a million followers in 10 days, right?
And so they're competing to make that shorter
and shorter and shorter.
And I know about this because, you know,
speaking from a Silicon Valley perspective,
venture capitalists fund these new social platforms based on how fast
they can get to like 100 million users. There was this famous line that like, I forgot what it was,
but I think Facebook took like 10 years to get to 100 million users. Instagram took, you know,
I don't know, four years, three years or something like that. TikTok can get there even faster. And
so it's shortening, shortening, shortening. And that's what people are, that's what we're
competing for. It's like who can win the fame lottery faster?
But is a world where everyone broadcasts
to millions of people
without the responsibilities of publishers,
journalists, et cetera,
does that produce an information environment
that's healthy?
And obviously the film, The Social Dilemma,
is really about how it makes the worst of us
rise to the top, right?
So our hate, our outrage, our polarization,
what we disagree about, black and white thinking,
more conspiracy-oriented views of the world,
QAnon, Facebook groups, things like that.
And we can definitely go into it.
There's a lot of legitimate conspiracy theories,
so I want to make sure I'm not categorically dismissing stuff.
But that's really the point, is that we have landed in a world
where the things that we are paying attention to are not necessarily the agenda of topics that we would say in a
reflective world are the most important. So there's a lot of
conversation about free will and about letting people choose whatever they enjoy viewing and watching and paying attention to.
But when you're talking about these incredibly potent algorithms
and the incredibly potent addictions that people develop to these things,
and we're pretending that people should have the ability
to just ignore it and put it away.
Use your willpower, Joe.
Yeah, that seems...
Have your kids use your willpower.
I have a folder on my phone called Addict,
and it's all caps,
and it's at the end of my...
You have to scroll through all my other apps to get to it,
and so if I want to get to Twitter or Instagram,
I've got to go there.
The problem is that the app switcher
will put it in the most recent. So once you switch apps and you have Twitter in your recents, it'll be right there.
So that's if I want to go left. Yeah, if I want to see that.
Yeah, you can do that. It's insanely addictive. And if you can control yourself, it's not that big a deal,
but how many people can control themselves?
Well, I think the thing we have to hone in on is the asymmetry of power.
As I say in the film, it's like we're bringing this ancient brain hardware, the prefrontal cortex, which is like what you use to do goal-directed action, self-control, willpower, holding back, you know, marshmallow test,
don't take the marshmallow now, wait for the two marshmallows later.
All of that is through our prefrontal cortex. And when you're sitting there and you think,
okay, I'm going to go watch, I'm going to look at this one thing on Facebook because my friend
invited me to this event, or it's this one post I have to look at. And the next thing you know,
you find yourself scrolling through the thing for like an hour. And you say, man, that was on me. I should have had more self-control. But there
behind the screen, behind that glass slab is like a supercomputer pointed at your brain
that is predicting the perfect thing to show you next. And you can feel it. Like it's,
this is really important. So like if I'm Facebook and when you flick your finger, you think when you're using Facebook,
it's just going to show me the next thing that my friend said.
But it's not doing that.
When you flick your finger, it actually literally wakes up this sort of supercomputer avatar
voodoo doll version of Joe.
And the voodoo doll of Joe is, you know, the more clicks you ever made on Facebook is like
adding the little hair to the voodoo doll.
And the more likes you've ever made adds little clothing to the voodoo doll. And the more watch time on videos
you've ever had adds little shoes to the voodoo doll. So the voodoo doll is getting more and more
accurate the more things you click on. This is in the film The Social Dilemma. If you notice,
the character, as he's using this thing, it builds a more and more accurate model that the AIs,
the three AIs
behind the screen are kind of manipulating. And the idea is it can actually predict and prick
the voodoo doll with this video or that post from your friends or this other thing, and it'll figure
out the right thing to show you that it knows will keep you there because it's already seen how that
same video or that same post has kept 200 million other voodoo dolls there, because you just look like another voodoo doll.
So here's an example, and this works the same on all the platforms. Say you're a teen girl and you opened a dieting video on YouTube. 70% of YouTube's watch time comes from the recommendations on the right-hand side, right? So the things that are showing recommended videos next. And what did it show the girls who watched the teen dieting video?
It showed anorexia videos because those were better
at keeping the teen girls' attention.
Not because it said these are good for them,
these are helpful for them.
It just says these tend to work at keeping their attention.
So, again, these tend to work if you are already watching diet videos?
Yeah. So if you're a 13-year-old girl and you watch a diet video, YouTube wakes up its voodoo doll version of that girl and says, hey, I've got like a hundred million other voodoo dolls of 13-year-old girls, right? And they all tend to watch these other videos. I don't know what; I just know that they have this word Thinspo. Thinspiration is the name for it, to be inspired toward anorexia.
Yeah, it's a real thing. YouTube addressed this problem a couple of years ago.
But when you let the machine run blind, all it's doing is picking stuff that's engaging.
Why did they choose to not let the machine run blind with one thing, like anorexia?
Well, so now we're getting into the Twitter censorship conversation and the moderation conversation.
So the real—this is why I don't focus on censorship and moderation,
because the real issue is if you blur your eyes
and zoom way out and say,
how does the whole machine tend to operate?
Like no matter what I start with,
what is it going to recommend next?
So, you know, if you started with,
you know, a World War II video,
YouTube would recommend
a bunch of Holocaust denial videos, right?
If you started teen girls with a dieting video, it would recommend these anorexia videos.
In Facebook's case, there's so many different examples here, because Facebook recommends groups to people based on what it thinks is most engaging for you. So if you were a new mom: you had Renée DiResta, my friend, on this podcast. We've done a bunch of work together, and she has this great example. As a new mom,
she joined one Facebook group for mothers who do do-it-yourself baby food, like organic baby food.
And then Facebook has this sidebar.
It says, here's some other groups you might recommend, you might want to join.
And what do you think was the most engaging of those?
Because Facebook, again, is picking on which group, if I got you to join it,
would cause you to spend the most time here, right?
So for some do-it-yourself baby food groups, which group do you think it selected?
Probably something about vaccines.
Exactly.
Really?
So anti-vaccines for moms, yeah.
Oh.
Okay, so then if you join that group, now it does the same, runs the process again.
So now look at Facebook.
So it says, hey, I've got these voodoo dolls.
I've got like 100 million voodoo dolls, and they just joined this anti-vaccine moms group. And then what do they
tend to engage with for a very long time if I get them to join these other groups? Which of those
other groups would show up? I don't know. Chemtrails. Oh, okay. The Pizzagate. Flat Earth?
Flat Earth. Absolutely. Yep.
So I'm interchangeably going from YouTube to Facebook because it's the same dynamic: they're competing for attention. And YouTube recommended Flat Earth conspiracy theories hundreds of millions of times. And so when you're a parent during COVID and you sit your kids in front of YouTube, because this is the digital pacifier, I've got to let them do their thing, I've got to do work, right? And then you come back to the dinner table,
and your kid says, you know, the Holocaust didn't happen,
and the Earth is flat.
And people are wondering why.
It's because of this.
And now, to your point about this sort of moderation thing,
we can take the whack-a-mole stick after the public yells,
and Renee and I make a bunch of noise or something,
and a large community, by the way,
of people making noise about this.
And they'll say, okay, shoot, you're right, Flat Earth, we've got to deal with that. And so they'll tweak the algorithm. And then people make a bunch of noise about the Thinspiration videos for anorexia for kids, and they'll deal with that problem. But they're doing it reactively. Again, if you zoom out, it's just still recommending stuff that's kind of from the crazy town section of...
Is the problem the recommendation? Because I don't mind that people have ridiculous ideas about Hollow Earth, because I think it's humorous. But I'm also a 53-year-old man, right? I'm not a 12-year-old boy with a limited education that is like, oh my God, the government's lying to us, there's lizard people that live under the Earth.
But I guess the real argument about these conspiracy theories is that they can influence young people, or the easily impressionable, or people that maybe don't have a sophisticated sense of vetting out bullshit.
Right.
Well, and the algorithms aren't making a distinction
between who is just laughing at it
and who is deeply vulnerable to it.
And generally, it just finds who's vulnerable to it. Because another example, the way I think about this is
if you're driving down the highway and there's Facebook and Google trying to figure out like,
what should I give you based on what tends to keep your attention? If you look at a car crash
and everybody driving on the highway, they look at the car crash. According to Facebook and Google,
it's like the whole world wants car crashes, so we just feed them car crash after car crash after car crash.
And what the algorithms do, as Guillaume Chaslot says in the film, who's the whistleblower from the YouTube recommendation system, is they find the perfect little rabbit hole for you that they know will keep you there for five hours.
And the conspiracy theory, like, dark corners of YouTube were the corners that tend to keep people there for five hours.
And so you have to realize that we're now something like 10 years into this vast psychology experiment, where it's been, you know, in hundreds of countries and in hundreds of languages, steering people towards Crazy Town. When I say Crazy Town, imagine there's a spectrum on YouTube. On one side, you have, like, the calm Walter Cronkite, Carl Sagan, you know, slow, kind of boring, but educational material or something. And on the other side of the spectrum, you have, you know, the
craziest stuff you can find. Crazy Town. No matter where you start, you could start in Walter
Cronkite, or you could start in Crazy Town. But if I'm YouTube and I want you to watch more, am I going to steer you towards the calm stuff or am I going to steer you more towards Crazy Town?
Crazy Town.
Always more towards Crazy Town.
So then you imagine just tilting the floor of humanity just by like three degrees, right?
And then you just step back and you let society run its course.
As Jaron Lanier says in the film, if you just tilt society by one degree, two degrees, that's the whole world. That's what everyone is thinking
and believing. And so if you look at the degree to which people are deep into rabbit hole conspiracy
thinking right now, and again, I want to acknowledge COINTELPRO, Operation Mockingbird,
there's a lot of real stuff, right? So I'm not categorically dismissing it, but we're asking,
what is the basis upon which we're believing the things we believe about the world?
And increasingly that's based on technology
and we can get into, you know,
what's going on in Portland.
Well, the only way I know that
is I'm looking at my social media feed
and according to that,
it looks like the entire city's on fire
and it's a war zone.
But if you, I called a friend there the other day
and he said, it's a beautiful day,
there's actually no violence anywhere near where I am. It's just like these two blocks or something
like that. And this is the thing: it's warping our view of reality. And I think that's what, really, for me, The Social Dilemma was trying to accomplish as a film, and what the director, Jeff Orlowski, was trying to accomplish: how did this society go crazy everywhere all at once, seemingly? You know, this didn't happen by accident. It happened by design of this business model.
When did the business model get implemented?
Like when did they start using these algorithms
to recommend things?
Because initially YouTube was just a series of videos
and it didn't have that recommended section.
When was that?
You know, that's a good question.
I mean, you know, originally YouTube was just post a video and you can get people to, you know,
go to that URL and send it around. They needed to figure out, once the competition for attention
got more intense, they needed to figure out how am I going to keep you there? And so recommending
those videos on the right-hand side, I think that was there pretty early, if I remember, actually. Because that was sort of the innovation,
is keeping people within this YouTube wormhole. And once people were in the YouTube wormhole,
constantly seeing videos, that was what they could offer the promise to a new video uploader,
hey, if you post it here, you're going to get way more views than if you post it on Vimeo.
Right? And that's the thing. If I open up TikTok right now on my phone,
do you have TikTok on your phone?
Um, well, I'm not supposed to, obviously, but more for research purposes. Research.
Do you know how to TikTok at all?
No.
My 12-year-old is obsessed.
Oh really?
Oh yeah.
She can't even sit around.
If she's standing still for five minutes,
she just starts like,
she starts TikToking. And that's the thing.
I mean, 2012. 2012? So the Mayans were right.
Right. 2012, the platform announced an update to the discovery system designed to identify the videos people actually want to watch, by prioritizing videos that hold attention throughout as well as increasing the amount of time a user spends on the platform overall. YouTube could assure advertisers that it was providing a valuable, high-quality experience for people.
Yeah.
So that's beginning of the end.
Yeah.
So 2012 on YouTube's timeline, I mean, you know, the Twitter and Facebook world,
I think, introduces the retweet and reshare buttons in the 2009 to 2010 kind of time period. So you end up with this world where the things that we're most paying attention to are based on algorithms choosing for us. And so the sort of deeper argument that's in the film, that I'm not sure everyone picks up on, is that these technology systems have taken control of human choice. They've taken control of humanity, because they're
controlling the information that all of us are getting. Think about every election. Like,
I think of Facebook as kind of a voting machine, but it's a sort of indirect voting machine because
it controls the information for four years that your entire society is getting. And then everyone
votes based on that information. Now you could say, well, hold on, radio and television were there and were partisan before that. But actually, radio and TV are often getting
their news stories from Twitter, and Twitter is recommending things based on these algorithms.
So when you control the information that an entire population is getting, you're controlling
their choices. I mean, literally in military theory, if I want to screw up your military,
I want to control the information that it's getting. I want to confuse the enemy. And that
information funnel is the very thing that's been corrupted. And it's like the Flint water supply
for our minds.
I was talking to a friend yesterday, and she was laughing that there are articles written about negative tweets that random people make about a celebrity doing this or that. And she was quoting this article, she's like, look how crazy this is: this is a whole article that's written about someone who decided to say something negative about something some celebrity had done. And then it becomes this huge article, and the tweets are prominently featured, right? And then the responses to those. I mean, like, really arbitrary, like, weird.
Because it's a values-blind system that just cares about what will get attention.
Exactly. And that's what the article was. It was just an attention grab.
It's interesting, because Prince Harry and Meghan have become very interested in these issues and are actively working on them, and I'm getting to know them just a little bit.
Are they really? Because it affects them personally?
Well, it's actually interesting. I mean, I don't want to speak for them, but I think Meghan has been the target of the most vitriolic, hate-oriented stuff on the planet, right? From just the amount of sort of criticism that they get.
Really?
And scrutiny. Yeah. I mean, she's just, like, newsfeeds filled with hate about just what she looks like, what she says, just constantly.
Boy, I'm out of the loop. I've never seen any of it. She's pretty. What do they think she looks like?
Honestly, I don't follow it myself because I don't fall into these attention traps. I try not to.
But people, she just faces the worst vitriol. I mean, this is the thing with teen bullying, right?
So I think they work on these issues because teenagers are now getting a micro version of this thing, where each of us is scrutinized, you know? I mean, think about what celebrity status
does and how it screws up humans in general, right? Like take an average celebrity, like it
warps your mind, it warps your psychology, and you get scrutiny, right? When you're suddenly followed, each person gets thousands of followers. Or project forward into the future a few years: each of us has, you know, tens of thousands to hundreds of thousands of people following what we say. That's a lot of
feedback. And, you know, as Jonathan Haidt says in the film, I know you've had him here, you know,
it's made kids much more cautious and less risk-taking and more bullied overall. And
there's just huge problems in mental health around this. Yeah, it's really bad for young girls,
right?
Especially for celebrities.
And I've had quite a few celebrities in here, and we've discussed it.
I just tell them that you can't read that stuff.
Just don't read it.
Yeah.
Like, there's no good in it. Like, I had a friend, she did a show.
She's a comedian.
She did a show.
And she was talking about this one negative comment that was inaccurate.
It said she only did a half an hour and her show sucked.
She's like, fuck her.
I go, why are you reading that?
She's like, because it's mostly positive.
I go, but how come you're not talking about most of it then?
You're talking about this one person.
This one negative person.
We were both laughing about it.
She's healthy.
She's not completely fucked up by it,
but this one person got into her head.
I'm like, I'm telling you, the juice is not worth the squeeze.
But don't read those things.
But this is exactly right.
And this is based on how our minds work.
I mean, our minds literally have something called negativity bias.
So if you have 100 comments and 99 are positive and one is negative, where does the average human's mind go?
Right.
They go to the negative.
And it also goes to the negative even when you shut down the screen. Your mind is sitting there looping on that negative comment.
And why?
Because evolutionarily, it's really important that we look at social approval, negative social approval, because our reputation is at stake in the tribe.
Yes.
So it matters.
Yes. But it's never been easier now for not just that one comment to sort of gain more airtime, but then for that to build a hate mob and then to see the interconnected clicks.
And I can go in and see 10 other people that responded to that that are negative.
And so especially when you have teenagers that are exposed to this and you can keep going down the tree and see all of the hate fest on you.
This is the psychological environment that is the default way that kids are growing up now. I actually faced this recently with the film itself, because the film has gotten just crazy positive acclaim for the most part, and there's just a few negative comments. And for myself even, right? Here comes a conjunction: but I was glued to a few negative comments. And then you could click and you would see other
people that you know who positively like or respond to those comments.
You're like, why did that person say that negative thing?
I thought we were friends, that whole kind of psychology.
And we're all vulnerable to it.
Unless you learn, as you said, to tell your celebrity friends, just don't pay attention to it. Even mild stuff I see people fixate on.
Even mild disagreement or mild criticism people fixate on.
And it's also a problem because you realize that someone's saying this and you're not there and you can't defend yourself.
Exactly. So you have a feeling of helplessness, like, hey, that's not true, I didn't do that. And then you don't get it out of your system. You never get to express it. And people can share that false negative stuff. I mean, not all negative stuff is false, but you can assert things and build on the hate fest, right? And start going crazy and saying, this person's a white supremacist,
or this person's even worse. And that'll spread to thousands and thousands of people. And next
thing you know, you check into your feed again at, you know, 8pm that night, and your whole
reputation has been destroyed. And you didn't even know what happened to you. And this happened to
teenagers too. I mean, they're anxious. Like, a teenager will post a photo. They're in high school, they make a dumb comment without thinking about it.
And then next thing they know, you know,
at the end of the day, the parents are all calling
because like 300 parents saw it
and are calling up the parent of that kid.
And it's, you know, we talk to teachers a lot
in our work at the Center for Humane Technology
and they will say that on Monday morning,
this is before COVID, but on Monday morning they spend the first hour of class having to clear all the drama that happened on social media over the weekend for the kids.
Jesus. And these kids are in what age group?
This was like eighth, ninth, tenth grade, that kind of thing.
And the other problem with these kids
is there's not like a long history of people
growing up through this kind of influence
and successfully navigating it.
These are the pioneers.
Yeah, and they won't know anything different,
which is why we talk about in the film,
like they're growing up in this environment.
And one of the simplest principles of ethics
is the ethics of symmetry: doing unto others as you would do to yourself. And as we say at the end of the film, one of the easiest ways you know that there's a problem here is that many of the executives at the social media tech companies don't let their own kids use social media, right?
they literally say at the end of the film, like, we have a rule about it.
We're religious about it.
We don't do it.
The CEO of Lunchables Foods
didn't let his own kids eat Lunchables.
That's when you know.
If you talk to a doctor and you say, you know, would you get this surgery for your own kid? And they say, oh no, I would never do that.
Like, would you trust that doctor?
Right.
And it's the same for a lawyer.
So this is where we have a relationship of asymmetry, and technology is influencing all of us.
And we need a system by which, you know, when I was growing up, you know, I grew up on the
Macintosh and technology and I was creatively doing programming projects and whatever else.
The people who built the technology I was using would have their own kids use the things that I was using, because they were creative and they were about tools and empowerment. And that's what's changed. We
don't have that anymore, because the business model took over. And so instead of having just
tools sitting there, like hammers waiting to be used to build, you know, creative projects,
or programming to invent things, or paintbrushes, or whatever, we now have a manipulation-based technology environment where everything you use has this incentive to not only addict you, but to have you play the fame lottery and get social feedback, because those are all the things that keep people's attention.
Isn't this also a problem with these information technologies being attached to corporations that have this philosophy of unlimited growth?
Yes.
So no matter how much they make... I applaud Apple, because I think they're the only company that takes steps to protect privacy, to block advertisements, to make sure that at least, like, when you use their Maps application, they're not saving your data and sending it to everybody. And it's one of the reasons why Apple Maps is really not as good as Google Maps.
But I use it.
And that's one of the reasons why I use it.
And when Apple came out recently, they were doing something to block your information being sent to other places. And I forget, what was the exact thing that it was?
In the new iOS they released a thing that blocks the tracking identifiers.
That's right.
And it's not actually out yet. It's going to be out in January or February, I think someone told me. And what that does, that's a good example of them putting a tax on the advertising industry, because just by saying you can't track people individually, that, you know, takes down the value of an advertisement by like 30 percent or something.
Here it is. It pops up, and when I do Safari, I get this whole privacy report thing, right? It says, like, in the last seven days it's prevented 125 trackers from profiling me.
Right. Yeah.
And you can opt out of that if you'd like. If you're like, no, fuck that, track me.
Yep. Yeah, you can do that. You can let them send your data.
But that seems to me a much more ethical approach, to be able to decide whether or not these companies get your information.
I mean, those things are great. The challenge is,
imagine you get the privacy equation perfectly right.
Look at this: Apple working on its own search engine as Google ties could be cut soon.
Oh yeah. I started using DuckDuckGo for that very reason,
just because they don't do anything with it.
They give you the information, but they don't take your data
and do anything with it.
The challenge is, let's say we get all the privacy stuff perfectly right, and data protection and data controls and all that stuff. In a system that's still based on attention, and grabbing attention, and harvesting and strip-mining our brains, you still get maximum polarization, addiction, mental health problems, isolation, teen depression and suicide, polarization, breakdown of truth.
Right.
So we really focus in our work on those topics, because that's the direct
influence of the business model on warping society. Like we need to name this mind warp,
we think of it like the climate change of culture. You know, they seem like different disconnected topics. Much like with climate change, you'd say, okay, we've got species loss in the Amazon, we're losing insects, we've got melting glaciers, we've got ocean acidification, we've got the coral reefs dying. These can feel like disconnected
things until you have a unified model of how emissions change all of those different phenomena,
right? In the social fabric, we have shortening of attention spans. We have more outrage-driven
news media. We have more polarization. We have more breakdown of truth.
We have more conspiracy-minded thinking.
These seem like separate events and separate phenomena, but they're actually all part of
this attention extraction paradigm, that the company's growth, as you said, depends on
extracting more of our attention, which means more polarization, more extreme material,
more conspiracy thinking, and shortening attention spans. Because we also say, like, you know, if we want to double the
size of the attention economy, I want your attention, Joe, to be split into two separate
streams. Like, I want you watching the TV, the tablet, and the phone at the same time,
because now I've tripled the size of the amount of extractable attention that I can get for
advertisers, which means that by fracking for attention and splitting you into more junk attention that's
like thinner, we can sell that as if it's real attention, like the financial crisis,
where you're selling thinner and thinner financial assets as if it's real, but it's really just a
junk asset. And that's kind of where we are now, where it's sort of the junk attention economy,
because we can shorten attention spans, and we're debasing the substrate that makes up our society. Because everything in a democracy depends on individual sense-making and meaningful choice, meaningful free will, meaningful independent views. But if that's all basically sold to the highest bidder, that debases the soil from which independent views grow, because all of us are jacked into this sort of matrix of social media manipulation. That's ruining and degrading our democracy.
And that's really, there's many other things that are ruining and degrading our democracy,
but that's this sort of invisible force that's upstream that affects every other thing downstream.
Because if we can't agree on what's true, for example, you can't solve any problem.
I think that's what you talked about in your 10-minute video on The Social Dilemma that I think I saw on YouTube.
Yeah.
Your organization highlights all these issues in an amazing way, and it's very important. But do you have any solutions?
It's hard, right? So I just want to say that this is as complex a problem as climate change, in the sense that you need to change the business model.
I think of it like we're on the fossil fuel economy and we have to switch to some kind of beyond that thing, right?
Because so long as the business models of these companies depend on extracting attention, can you expect them to do something different?
You can't, but how could you?
I mean, there's so much money involved, and now they've accumulated so much wealth that they have an amazing amount of influence.
Yeah.
You know, and...
And the asymmetric influence can buy lobbyists, can influence Congress, and prevent things
from happening.
So this is why it's kind of the last moment.
That's right.
But, you know, I think we're seeing signs of real change. We have the antitrust case that was just filed against Google. In Congress, we're seeing more hearings.
What was the basis of that case?
You know, to be honest, I was actually in the middle of the social dilemma launch when I think
that happened and my home burned down in the recent fires in Santa Rosa. So I actually missed
that happening.
Sorry to hear that. Yeah, sorry. That was a big thing to drop.
But yeah, no, it's awful.
There's so much that's been happening in the last six weeks.
I was evacuated three times where I lived in California.
Oh, really?
Yeah, so it got real close to our house.
Justice Department sues monopolist Google for violating antitrust laws.
Department files complaint against Google to restore competition in search and search
advertising markets.
Okay, so it's all about search.
Yeah, this is, right, this was a case that's about Google using its dominant position to privilege its own search engine in its own products and beyond, which is similar to sort of Microsoft bundling in the Internet Explorer browser.
But, you know, this is all good progress, but really it misses
the kind of fundamental harm of like these things are warping our society. They're warping how our
minds are working. And there's no, you know, congressional action against that because it's
a really hard problem to solve. I think that the reason the film for me is so important is that
if I look at the growth rate of how fast Facebook has been recommending people into conspiracy groups and
kind of polarizing us into separate echo chambers, which we should really break down, I think,
as well for people, like exactly the mechanics of how that happens. But if you look at the growth
rate of all those harms compared to, you know, how fast has Congress passed anything to deal with it,
like basically not at all. They seem a little bit unsophisticated in that regard.
It might be an understatement, yeah.
Yeah, they, I'm trying to be charitable.
I want to be charitable too.
And I want to make sure I call out,
and there's Senator Mark Warner, Blumenthal,
several other senators we've talked to
have been really on top of these issues
and led, I think, Senator Warner's white paper
on how to regulate the tech platforms is one of the best.
It's from two years ago in 2018.
And Rafi Martina, his staffer, is an amazing human being.
He works very hard on these issues.
So there are some good folks.
But when you look at the broad, like the hearing yesterday, it's mostly grandstanding to politicize the issue, right?
Because you turn it into, on the right, hey, you're censoring conservatives.
And on the left, it's, hey, you're not taking down enough misinformation and dealing with the hate speech and all these kinds of things
right and they're not actually dealing with how would we solve this problem they're just trying
to make a political point to win over their base now facebook recently banned the q anon pages
which uh i thought was kind of fascinating because i'm like well this is a weird sort of slippery
slope isn't it?
Like if you decide that you, I mean, it almost seemed to me like,
well, we'll throw them a bone.
We'll get rid of QAnon because it's so preposterous.
Let's just get rid of that.
But what else?
Like if you keep going down that rabbit hole, where do you draw the line?
Like, are you allowed to have JFK conspiracy theories? Are you allowed to have flat earth? I mean, I guess flat earth is not dangerous. Is that where they make the distinction?
So I think their policy is evolving in the direction of, when things are causing offline harm, when online content is known to precede offline harm, that's the standard by which platforms are acting.
What offline harm has been caused
by the QAnon stuff, do you know?
There's several incidents.
We interviewed a guy on our podcast about it.
There's some armed at gunpoint type thing.
I can't remember.
And there's things that are priming people
to be violent, you know.
These are, I just want to say, these are really tricky
topics, right? I think what I want to make sure we get to, though, is that there are many people
manipulating the groupthink that can happen in these echo chambers. Because once you're in one
of these things, like I studied cults earlier in my career, and the power of cults is like they're
a vertically integrated persuasion stack, because they control your social relationships, they
control who you're hearing from and who you're not hearing from. They give you meaning, purpose, and belonging.
They have a custom language. They have an internal way of referring to things.
And social media allows you to create this sort of decentralized cult factory where it's easier to
grab people into an echo chamber where they only hear from other people's views.
And Facebook, I think, even just recently announced that they're going to be promoting more of the Facebook group content into feeds,
which means that they're actually going to make it easier for that kind of manipulation to happen.
But did they make the distinction between group content and conspiracy groups?
Like, how do you, when does group content, when does it cross a line?
I don't know.
I mean, the, the policy teams that work on this are coming up with their own standards.
So I'm not familiar with it.
If you think about, you know, think about how hard it is to come up with a law at the federal level that all states will agree to.
Then you imagine Facebook trying to come up with a policy that will be universal to all the countries that
are running Facebook, right? Well, then you imagine how you take a company that never thought they
were going to be in the position to do that. Correct. And then within a decade, they become
the most prominent source of news and information on the planet Earth. Correct. And now they have
to regulate it. And you know, I actually believe Zuckerberg when he says, I don't want to make
these decisions. I shouldn't be in this role where my beliefs decide the whole world's views.
He genuinely believes that.
And to be sure of that.
But the problem is he created a situation where he is now in that position.
I mean, he got there very quickly.
And they did it aggressively when they went into countries like Myanmar, Ethiopia, all
throughout the African continent where they gave, do you know about Free Basics?
No.
So this is the program that I think has gotten something like 700 million accounts onto Facebook,
where they do a deal with like a telecommunications provider, like their version of AT&T in Myanmar
or something.
So when you get your smartphone, it comes-
Facebook's built in.
Facebook's built in.
Yes, I do know about that.
And there's an asymmetry of access, where it's free to access Facebook, but it costs money to do the other things for the data plan.
So you get a free Facebook account.
Facebook is the internet, basically, because it's the free thing you can do on your phone.
And then we know that there's fake information that's being spread there.
So the data plan doesn't apply to Facebook use?
Yeah, I think the cost, you know how we pay for data here?
I think you don't pay for Facebook,
but you do pay for all the other things,
which creates an asymmetry where of course you're going to use Facebook
for most things.
Right, so you have Facebook Messenger,
video calls.
Yeah, and WhatsApp.
I don't know exactly with video,
because different levels of...
Facebook has video calls as well, right?
In general, they do.
Yeah, I just don't know how that works
in the developing world. But there's a joke within Facebook... well, I mean, this has caused genocides, right? So in Myanmar, which is in the film, the Rohingya Muslim minority group, many Rohingya were persecuted and murdered because of fake information spread by the government on Facebook, using their asymmetric knowledge, with fake accounts. I mean, even just a couple weeks ago, Facebook took down a network of, I think, several hundred thousand fake accounts in Myanmar.
And they didn't even have at the time more than something like four or five people
in their extended Facebook network who even spoke the language of that country.
Oh, God.
So when you realize that this is like the, I think of like the Iraq war, Colin Powell,
Pottery Barn rule, where like, you know,
if you go in and you break it, then you are responsible for fixing it. This is Facebook
actively doing deals to go into Ethiopia, to go into Myanmar, to go into the Philippines or
whatever and providing these solutions. And then it breaks the society. And they're now in a position
where they have to fix it. There's actually a joke within Facebook that if you want to know which countries will be quote-unquote at risk in two years from
now, look at which ones have Facebook Free Basics.
Jesus. And it's terrifying that they do that and
they don't have very many people that even speak the language. So there's no way they're going to
be able to filter it. That's right. And so now if you take it back, I know we were talking outside
about the congressional hearing and Jack Dorsey and the questions from the senator about are you taking down the content from the Ayatollahs or from the Chinese Xinjiang province about the Uyghurs?
You know, when there's sort of speech that leads to offline violence in these other countries.
The issue is that these platforms are managing the information commons for countries they don't even speak the language of.
And if you think the conspiracy-theory sort of dark corners, the crazy town of the English internet, are bad, and we've already taken out like hundreds of whack-a-mole sticks, and they've hired hundreds of policy people and hundreds of engineers to deal with that
problem. You go to a country like Ethiopia, where there's 90-something dialects, I think, in the country, and six major languages, where one of them is the dominant
Facebook sort of language, and then the others get persecuted because they actually don't have
a voice on the platform. This is really important that the people in Myanmar who got persecuted and murdered didn't have to be on
Facebook for the fake information spread about them to impact them, for people to go after them,
right? So this is the whole, I can assert something about this minority group. That
minority group isn't on Facebook. But if it manipulates the dominant culture to go, we have
to go kill them, then they can go do it.
And the same thing has happened, you know, in India, where there's videos uploaded about, hey, those Muslims, I think they're called flesh killings, where they'll say that these Muslims killed this cow.
And in Hinduism, the cows are sacred.
Did I get that right? Anyway.
I believe you did.
Yeah.
They will post those, they'll go viral on WhatsApp, and say, we have to go lynch those
Muslims because they killed our sacred cows.
And they went from something like five of those happening per year to now hundreds of
those happening per year because of fake news being spread, again, on Facebook about them,
on WhatsApp about them. And again, they don't have to be on the platform for this to happen to them, right? So this is critical. You know, imagine you and I, and let's
imagine all of your listeners, you know, I don't even know how many you have, like tens of millions,
right? And we all listen to this conversation. We say, we don't want to even use Facebook and
Twitter or YouTube. We all still, if you live in the U.S., still live in a country that everyone else will vote based on everything that they're seeing on these platforms.
If you zoom out to the global context, all of us, we don't use Facebook in Brazil.
But Brazil's last election was heavily skewed by Facebook and WhatsApp, where something like 87% of people saw at least one of the major fake
news stories about Bolsonaro. And he got elected, and you have people in Brazil chanting,
Facebook, Facebook, when he wins. And then he sets a new policy to wipe out the Amazon.
All of us don't have to be on Facebook to be affected by a leader that wipes out the Amazon
and accelerates climate change timelines because of those interconnected effects. So, you know, we at the Center for Humane Technology are looking at this
from a global perspective, where it's not just the US election, Facebook manages something like
80 elections per year. And if you think that they're doing all the monitoring that they are
for, you know, the English-speaking American election, the most privileged society. Now look at the hundreds
of other countries that they're operating in. Do you think that they're devoting the same resources to the other countries?
This is so crazy. It's like, is that you, Jamie? What's that weird noise?
You hear like a squeaky?
I heard it too.
Yeah.
Maybe it was me.
I don't think it is. Just my feedback. There it is.
It might be me.
Is it?
Breathing. I don't know.
You have asthma?
I think I had an allergy coming up. Oh, I was, like, making some noises.
What's terrifying is that we're talking about, from 2012 to 2020, YouTube implementing this program. And then when even is the birth of Facebook? What is that, like 2002 or 3?
2004.
2004.
This is such a short timeline and having these massive worldwide implications from the use of these things.
When you look at the future, do you look at this like a runaway train that's headed towards
a cliff?
Yeah.
I mean, I think right now this thing is a Frankenstein. Even if Facebook was aware of all these problems, they don't have the staff, unless they hired, like, tens or hundreds of thousands of people. We definitely used to have editors and journalists, or at least editors,
or, you know, people edited even what went on television, saying what is credible, what is true.
Like, you know, you sat here with Alex Jones even yesterday, and you're trying to
check him on everything he's saying, right? You're researching and trying to look that stuff up.
You're trying to be doing some more responsible communication. The premise of these systems is that you don't do that. The reason venture
capitalists find social media so profitable and such a good investment is because we generate
the content for free. We are the useful idiots, right? Instead of paying a journalist $70,000 a
year to write something credible, we can each be convinced to share our political views and we'll
do it knowingly for free. Actually, we don't really know that we're the useful idiots. That's kind of
the point. And then instead of paying an editor $100,000 a year to figure out which of those
things is true that we want to promote and give exponential reach to, you have an algorithm that says,
hey, what do people click on the most? What do people like the most? And then you realize the quality
of the signals that are going into the information
environment that we're all sharing is a totally different process. We went from a high quality
gated process that costs a lot of money to this really crappy process that costs no money,
which makes the company so profitable. And then we fight back for territory, for values,
when we raise our hands and say, hey,
there's a thinspiration video problem for teenagers and anorexia.
Hey, there's a mass conspiracy sort of echo chamber problem over here.
Hey, there's flat earth sort of issues.
And again, these get into tricky topics because we want to, I know we both believe in free
speech, and we have this feeling that the solution to bad speech is better,
you know, more speech that counters the things that are said. But in a finite attention economy,
we don't have the capacity for everyone who gets bad speech to just have a counter response. In
fact, what happens right now is that that bad speech rabbit-holes into not only worse
and worse speech, but more extreme versions of that view that confirm it. Because once Facebook knows that that flat-earth rabbit hole is good at getting
your attention back, it wants to give you just more and more of that. It doesn't want to say,
here's 20 people who disagree with that thing. Right? So I think if you were to imagine a
different system, we would ask, who are the thinkers that are most open-minded and synthesis
oriented where they can actually steel man the other side? Actually, they can do, you know, for this speech, here is the opposite
counter-argument. They can show that they understand that. And imagine those people get
lifted up. But notice that none of those people that you and I know, I mean, we're both friends
with Eric Weinstein. And, you know, I think he's one of these guys who's really good at sort of
offering the steel manning, here's the other side of this, here's the other side of that. But the people who generally do that aren't the
ones who get the tens of millions of followers on these services. It's the black-and-white,
extreme outrage-oriented thinkers and speakers that get rewarded in this attention economy.
And so if you look at how, if I zoom way out and say, how is the entire system behaving? Just like
if I zoom out and say climate, you know, the climate system, like how is the entire overall system behaving?
It's not producing the kind of information environment
on which democracy can survive.
Jesus.
The thing that troubles me the most
is that I clearly see your thinking
and I agree with you.
Like, I don't see any holes in what you're saying.
Like, I don't know how this plays out,
but it doesn't look good.
And I don't see a solution.
It's like if there are a thousand bison running full steam towards a cliff and they don't realize the cliff is there, I don't see how you pull them back.
So I think of it like we're trapped in a body that's eating itself. So it's kind of a cannibalism economy, because our economic growth right now with these tech companies is based on eating our own organs.
So we're eating our own mental health organs. We're eating the health of our children. We're
eating the, sorry for being so gnarly about it, but it's a cannibalistic system. In a system that's
hurting itself or eating itself or punching itself, if one of the neurons wakes up in the body,
it's not enough to change that. It's going to keep punching itself. But if enough of the neurons wake
up and say, this is stupid, why would we build our system this way? And the reason I'm so excited
about the film is that if you have 40 to 50 million people who now recognize that we're living
in this sort of cannibalist system in which the economic incentive is to debase the life support
systems of your democracy,
we can all wake up and say, that's stupid.
Let's do something differently.
Let's actually change the system.
Let's use different platforms.
Let's fund different platforms.
Let's regulate and tame the existing Frankensteins.
And I don't mean regulating speech.
I mean, really thoughtfully, how do we change the incentives so it doesn't go to the same
race to the bottom?
And we have to all recognize that we're now 10 years into this hypnosis experiment of
warping of the mind.
And, like, you know, as a friend of ours says, it's like, how do we snap our fingers
and get people to see that there's an artificially inflated level of polarization
and hatred right now, and that, especially going into this election, I think we all need to
be much more cautious about what's running in our brains right now.
Yeah, I don't think most people are generally aware of what's causing this polarization.
I think they think it's the climate of society because the president and because of Black Lives
Matter and the George Floyd protests and all this jazz. But I don't think they understand
that that's exacerbated in a fantastic way by social media and the last 10 years of our addictions to social
media and these echo chambers that we all exist in. Yeah. So I want to make sure that we're both
clear. And I know you agree with this, that these things were already in society to some degree,
right? So we want to make sure we're not saying social media is blamed for all of it. Absolutely not. No, no, no.
It's gasoline.
It's gasoline, right.
Exactly.
It's lighter fluid for sparks of polarization.
It's lighter fluid for sparks of, you know,
more paranoid conspiracy.
Which is ironically what everybody,
it was the opposite of what everybody hoped
the internet was going to be.
Right.
Everybody hoped the internet was going to be
this bottomless resource of information
where everyone was going to be educated in a way they had never experienced before in the history of
the human race, where you'd have access to all the answers to all your questions. You know,
Eric Weinstein describes it as the Library of Alexandria in your pocket. But no.
Well, and I want to be clear that I'm not against technology or giving people access. In
fact, I think a world where everyone had a smartphone and a Google search box and Wikipedia
and, like, a search-oriented YouTube, so you can look up health issues and how to do it
yourself, fix anything.
Sure.
Would be awesome.
That would be great.
I would love that.
Just want to be really clear because this is not an anti-technology conversation.
It's about, again, this business model that depends on recommending stuff to people, which
just to be clear on the polarization front, social media is more profitable when it gives you your own Truman Show that affirms
your view of reality every time you flick your finger.
That's going to be more profitable than, every time you flick your finger,
I actually show you, here's a more complex, nuanced picture that disagrees with that,
here's a different way to see it.
That won't be nearly as successful.
And the best way for people to test this, we actually recommend even after seeing the
film to do this, is open up Facebook on two phones, especially, like, you know,
two partners or people who have the same friends. So you have the same friends on
Facebook. You would think if you scroll your feeds, you'd see the same thing, since
it's the same people you're following, so why wouldn't you see the same thing? But if
you swap phones and you actually scroll through their feed for 10 minutes, and they scroll through yours for 10 minutes,
you'll find that you'll see completely different information. You'll also notice that
it won't feel very compelling. Like, my friend Emily just did this
with her husband after seeing the film, and she literally has the same friends as her husband.
And she scrolled through his feed. She's like, this isn't interesting. I wouldn't come back to this.
Right?
And so we have to, again, realize how subtle and, yeah, just how subtle this has been.
I wonder what would happen if I scrolled through my feed because I literally don't use Facebook.
What do you use?
I don't use it at all.
I only use Instagram.
Use Instagram.
I stopped using Twitter because it's like a bunch of mental patients throwing shit at each other.
And I very rarely use it, I
should say. Occasionally I'll check some things
to see what the climate
is,
cultural climate. But I use
Instagram. And Facebook,
I used to use Instagram
to post to Facebook, but I kind of stopped
even doing that.
It just seems gross.
Yeah.
It's just, and it's these people in these verbose arguments about politics and the economy
and world events and just...
We have to ask ourselves, is that medium constructive to solving these problems?
No.
Just not at all.
And it's an attention casino, right?
The house always wins.
And we're, you know,
Eric, you might see Eric Weinstein
in a thread, you know,
battling it out
or sort of duking it out with someone
and maybe even reaching
some convergence on something,
but it just whizzes by your feet
and then it's gone.
Yeah.
And all the effort
that we're putting in
to make these systems work,
but then it's just all gone.
What do you do?
I mean, I try to very minimally
use social media overall.
Luckily, the work is so busy that that's easier.
I want to say first that, you know, on the addiction fronts of these things, I, you know, myself, I'm very sensitive and, you know, easily addicted by these things myself.
And that's why I think I notice.
You were saying in the social dilemma, it's email for you, huh?
Yeah.
For me, if I refresh my email and pull to refresh like a slot machine, sometimes I'll get invited to meet the president of such and such to advise on regulation.
And sometimes I get a stupid newsletter from a politician I don't care about or something, right?
So email is very addictive.
It's funny.
I talked to Daniel Kahneman, who's, like, the founder of behavioral economics.
He wrote the book Thinking, Fast and Slow,
if you know that one.
And he said as well that email was the most addictive for him.
And he, you know, the one thing you'll find
is that the people who know most
about these sort of persuasive manipulative tricks,
they'll say, we're not immune to them
just because we know about them.
Dan Ariely, who's another famous persuasion,
behavioral economics guy, talks about flattery and how flattery still feels good even if I tell you
I don't mean it. Like, I love that sweatshirt. That's an awesome sweatshirt. Where'd you get it?
You're just going to bullshit me, but that's the, it feels good to get flattery even if you know
that it's not real. And the point being that, like, again, we have so much evolutionary wiring
to care about what other people think of us
that just because you know that they're manipulating you
in the likes or whatever,
it still feels good to get those 100 extra likes
on that thing that you posted.
Yeah.
When did the likes come about?
Well, let's see.
Well, actually, you know, in the film,
you know, Justin Rosenstein,
who's the inventor of the Like button, talks about it.
I think the first version was something called Beacon, and it arrived in 2006,
I think.
But then the simple one-click
Like button was a little bit later,
like 2008,
2009.
Are you worried that it's going to be more and more invasive?
I mean,
you think about the problems we're dealing with now with Facebook and Twitter
and Instagram,
all these within the last decade or so, what, what do we have to look forward to? I mean, is there something on the
horizon that's going to be even more invasive? Well, we have to change the system because,
as you said, technology is only going to get more immersed into our lives and infused into our
lives, not less. Is technology going to get more persuasive or less persuasive? More, for sure. Is AI going to get better at predicting our next move or less good
at predicting our next move? It's almost like we have to eliminate that. And I mean, it would be
really hard to tell them you can't use algorithms anymore that depend on people's attention spans.
It would be really hard, but it seems like the only way for the internet to be pure.
Correct.
I think of this like the environmental movement.
I mean,
some people have compared the film,
The Social Dilemma, to Rachel Carson's Silent Spring,
right?
Where that was the birth.
That was the book that birthed the environmental movement.
And that was in a Republican administration,
the Nixon administration.
We actually passed,
we created the EPA,
the Environmental Protection Agency.
We went from a world where we said the environment's something we don't pay attention to,
to we passed a bunch of laws. I forget the laws we passed between 1963 and 1972. Over a decade,
we started caring about the environment. We created things that protected the national parks.
And I think that's kind of what's going on here. You know, imagine, for example, it is illegal to show advertising on youth-oriented
social media apps between 12 a.m. and 6 a.m., because you're basically monetizing loneliness and lack
of sleep, right? Like, imagine that you cannot advertise during those hours because we say,
like a national park, our children's attention between... This is a very minimal example, by the
way. This would be, like, you know, taking the most obvious piece of low-hanging fruit of land and saying, let's quarantine
this off and say this is sacred.
But isn't the problem, like, the Environmental Protection
Agency, it resonates with most people, the idea, oh, let's protect the world for our children, right?
There's not a lot of people profiting off of polluting the rivers.
Right, but when you look, there's, I mean, over-hunting, you know, certain lands, or overfishing certain fisheries
and collapsing them.
I mean, there are, if you have big enough corporations that are based on an infinite
growth profit model, you know, operating with less and less, you know, resources to get,
this is a problem we faced before.
For sure.
But it's not the same sort of scale as 300-something million people.
And a vast majority of them are using some form of social media.
And also, this is not something that really resonates in a very clear, like, one plus one equals two way.
Like, the Environmental Protection Agency, it makes sense.
Like, if you ask people, should you be able to throw garbage into
the ocean, everyone's going to say, no, that's a terrible idea, right? Should you be able to make
an algorithm that shows people what they're interested in on YouTube? Like, yeah, what's wrong
with that?
Well, it's more like sugar, right? Because sugar is always going to taste way better than
something else, because our evolutionary heritage says, like, that's rare, and so we should pay more attention to it. This is like sugar for the fame lottery, for our
attention, for social approval. And so it's always going to feel good, and we need to have consciousness
about it. And we haven't banned sugar, but we have created a new conversation about what healthy,
you know, eating is, right? I mean, there's a whole new fitness movement and sort of yoga and all
these other things, that people care more about their bodies and health than they probably ever
have. I think many of us wouldn't have thought we'd ever
get through, you know, the period of soda being at the sort of pinnacle of popularity
that it is. I think 2013 or 14 was the year that water crossed over as being a more successful
drink product than soda, I think.
I think that's true. You might want to look that up.
So I think we could have
something like that here. We have to, I think of it this way, if you want to even get kind of, weirdly,
I don't know, spiritual or something about it, which is, we're the only species that could even know
that we're doing this to ourselves, right? Like, we're the only species with the capacity
for self-awareness to know that we have actually, like, roped ourselves into this matrix, like literally the Matrix, of sort of undermining our own psychological weaknesses.
Like a lion that somehow manipulated its environment so that there's gazelles everywhere and is like overeating on gazelles doesn't have the self-awareness to know, wait a second, if we keep doing this, this is going to cause all these other problems. It can't do that because its brain doesn't have
that capacity. Our brain, we do have the capacity for self-awareness. We can name negativity bias,
which is that if I have 100 comments and 99 are positive, my brain goes to the negative. We can
name that. And once we're aware of it, we get some agency back. We can name that we have a draw
towards social approval. So when I see I've
been tagged in a photo, I know that they're just manipulating my social approval. We can name
social reciprocity, which is when I get all those text messages and I feel, oh, I have to get back
to all these people. Well, that's just an inbuilt bias that we have to get back reciprocity. We have
to get back to people who give stuff to us. The more we name our own biases, like confirmation
bias, we can name that
my brain is more likely to feel good getting information that I already agree with than
information that disagrees with me. Once I know that about myself, I can get more agency back.
And we're the only, like, species that we know of that has the capacity to realize that we're in
a self-terminating sort of system, and we have to change that by
understanding our own weaknesses, and that we've created the system that is undermining ourselves.
And I think the film is doing that for a lot of people.
It certainly is, but I think it needs
more. It's like inspiration, it needs a refresher on a regular basis, right? Do you feel this massive
obligation to be that guy that is out there, sort of, as the Paul Revere of the technology
influence invasion?
I just see these problems and I want them to go away. You know, I didn't,
you know, didn't desire or wake up wanting to run a social movement. But honestly, right now,
that's what we're trying to do. With the Center for Humane Technology. We realized that before the success of the film, we were actually more focused on working with technologists inside the industry.
I come from Silicon Valley. Many of my friends are executives at the companies, and we have these inside relationships.
We focused at that level. We also worked with policymakers, and we were trying to speak to policymakers.
We weren't trying to mobilize the whole world
against this problem. But with the film, suddenly we as an organization have had to do that. And
we're, frankly, I'm just being really honest, I really wish we'd had those
funnels, so that people who saw the film could have landed into, you know, a carefully designed
funnel where we actually started mobilizing people to deal with this issue, because there are ways we
can do it, we can pass certain laws, we have to have a new cultural sort of set of norms about
how do we want to show up and use the system. You know, families and schools can have whole
new protocols of how do we want to do group migrations, because one of the problems is that
if a teenager says by themselves, well, I saw the film, I'm going to delete my Instagram account by
myself or TikTok account by myself, that's not enough
because all their friends are still using Instagram and TikTok, and they're still going
to talk about who's dating who or gossip about this or homework or whatever on those services.
And so the services, Instagram and TikTok, prey on social exclusion, that you will feel excluded
if you don't participate. And the way to solve that is to get whole schools or families together,
like different parent groups or whatever together, and do a group migration from Instagram to Signal
or iMessage or some kind of group thread that way. Because notice that when you, as you said,
Apple's a pretty good actor in this space. If I make a FaceTime call to you, FaceTime isn't trying
to monetize my attention. It's just sitting there being like, yeah, how can I help you have a good, as close
to face-to-face, conversation as possible?
Jamie pulled up an article earlier that was saying that Apple was creating its own search
engine.
Yeah.
I hope that is the case.
And I hope that if it is the case, they apply the same sort of ethics that they have towards
sharing your information that they do with other things to their search engine.
But I wonder if there would be some sort of value in them creating a social media platform that doesn't rely on that sort of algorithm.
Yeah.
Well, I think in general, one of the exciting trends that has happened since the film is there's actually many more people trying to build alternatives, social media products that are not based on these business models.
Yeah.
I could name a few, but I don't want to be endorsing it. I mean, there's people building Marco Polo, Clubhouse, Wikipedia is trying to build a sort of nonprofit version.
I always forget the names of these things.
But the interesting thing is that for the first time, people are trying to build something else because now there's enough people
who feel disgusted by the present state of affairs. And that wouldn't be possible unless
we created a kind of a cultural movement based on something like the film that reaches a lot of
people. It's interesting that you made this comparison to the Environmental Protection
Agency because there's kind of a parallel in the way other countries handle the environment versus the way we do and how it makes them competitive.
I mean, that's always been the Republican argument for not getting rid of certain fossil fuels and
coal and all sorts of things that have a negative consequence, that we need to be competitive with
China. We need to be competitive with these other countries
that don't have these regulations in effect.
The concern would be, well, first of all,
the problem is these companies are global, right?
Like Facebook is global.
If they put these regulations on America
but didn't put these regulations worldwide,
then wouldn't they use the income
and the algorithm in other countries unchecked
and have this tremendous negative consequence and gather up all this money?
Which is why, just like sugar, it's like everyone around the world has to understand
and be more antagonistic.
Not like sugar is evil, but just you have to have a common awareness about the problem.
But how could you educate people?
If you're talking about a country like Myanmar or these other countries that have had these like
serious consequences because of Facebook, how could you possibly get our ideas
across to them if we don't even know their language and it's just this system
that's already set up in this very advantageous way for them where Facebook
comes on their phone. Like how could you hit the brakes on that?
Well, I mean, first I just want to say this is an incredibly hard and depressing problem.
Yes.
It's the scale of it, right?
Right.
You need something like a global, I mean, language-independent global self-awareness about this problem.
Now, again, I don't want to be tooting the horn about the film,
but the thing I'm excited about is it launched on Netflix in 190 countries
and in 30 languages.
So we at least have-
You should toot the horn.
Well, yeah.
Yeah, toot it.
Yeah.
Well, I think, you know,
the film was seen in 30 languages.
So, you know, the cool thing is
I wish I could show the world my inbox.
I think people see the film
and they feel like,
oh my God, this is a huge
problem, and I'm all alone. How are we ever going to fix this? But I get emails every day from Indonesia, Chile,
Argentina, Brazil, people saying, oh my god, this is exactly what's going on in my country. I mean,
I've never felt more optimistic. And I've felt really pessimistic for the last eight years
working on this, because there really hasn't been enough movement. But I think for the first time,
there's a global awareness now
that we could then start to mobilize.
I know the EU is mobilizing,
Canada is mobilizing, Australia is mobilizing,
California State is mobilizing with Prop 24.
There's a whole bunch of movement now in the space
and they have a new rhetorical arsenal
of why we have to make this bigger transition.
Now, you know, are we going to
get to all the countries, you know, where there's the six different major dialects in Ethiopia?
Are they going to know about this? I don't think the film was translated into all those
dialects. I think we need to do more. It's a really, really hard, messy problem. But on the
topic of, if we don't do it, someone else will, you know, one interesting thing in the
environmental movement was, there's a great WNYC radio piece about the history of lead and
when we regulated lead. I don't, do you know anything about this?
Yeah, I do.
Yeah. Of course, this matches up with your experience. My understanding is that obviously lead was this sort
of miracle thing. We put it in paint, we put it in gas, it was like, great. And then the way we
figured out that we should regulate lead out of our sort of infused product supply is by proving,
there was this guy who proved that it dropped kids' IQ by four points
for every, I think, microgram per deciliter, I think.
So in other words, for the amount of,
if you had a microgram of lead per deciliter of either, I'm guessing air,
it would drop the IQ of kids by four points.
And they measured this by actually doing a sample on their teeth or something,
because lead shows up in your bones, I think.
And they proved that if the IQ points dropped by four points,
it would lower future wage-earning potential of those kids,
which would then lower the GDP of the country,
because it would be shifting the IQ of the entire country
down by four points,
if not more, based on how much lead is in the environment. If you zoom out and say,
is social media... Now, let's replace the word IQ, which is a fraught term, because there's
a whole bunch of views about how it's designed in certain ways and not others, and about measuring
intelligence. Let's replace IQ with problem-solving capacity. What is your problem-solving
capacity? Which is actually how they talk about it in this radio episode. And imagine that we have a
societal IQ, a societal problem-solving capacity. The U.S. has a societal IQ. Russia has a societal
IQ. Germany has a societal IQ. How good is a country at solving its problems? Now ask: what does social
media do to our societal IQ? Well, it distorts our ideas. It gives us a bunch of false narratives.
It fills us with misinformation and nonsense. It makes it impossible to agree with each other.
And in a democracy, if you don't agree with each other and you can't even do compromise,
people recognize that politics is invented to avoid warfare, right?
So we have compromise and understanding
so that we don't get physically
violent with each other.
We have compromise and conversation.
If social media makes compromise, conversation,
and shared understanding and shared truth impossible,
it doesn't drop our societal IQ by four points.
It drops it to zero
because you can't
solve any problem, whether it's human trafficking or poverty or climate issues or racial injustice,
whatever it is that you care about. It depends on us having some shared view about what we agree on.
And by the way, and on the optimistic side, there are countries like Taiwan that have actually built
a digital democratic sort of social
media type thing. Audrey Tang, you should have Audrey Tang on your show. She's amazing. She's
the digital minister of Taiwan. And they've actually built a system that rewards unlikely
consensus. So when two people who would traditionally disagree post something online,
and they actually agree on something,
that's what gets boosted to the top
of the way that we look at our information feeds.
Really?
Yeah.
So it's about finding consensus where they'd be unlikely
and saying, hey, actually, you know, you, Joe, and Tristan,
typically you disagree on these six things,
but you agree on these three things,
and of the things that we're going to encourage you
to talk about,
we hand you a menu of the things you agree on.
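To make the "unlikely consensus" idea concrete: the Taiwanese system is reportedly built on the open-source Pol.is platform, and the core trick is to weight agreement on a statement by how much the agreeing people normally disagree. This is a rough illustrative sketch of that idea only — the data shape, names, and weighting are all invented here, not the actual Pol.is algorithm:

```python
from itertools import combinations

def bridging_scores(votes):
    """Rank statements by 'unlikely consensus': agreement between users who
    usually disagree counts for more than agreement between the like-minded.

    votes: {statement_id: {user_id: +1 (agree) or -1 (disagree)}}
    Returns {statement_id: score}; higher score = more bridging.
    """
    # Pairwise similarity of users over the statements they both voted on:
    # +1 means they always vote the same way, -1 means they always clash.
    users = {u for sv in votes.values() for u in sv}
    sim = {}
    for a, b in combinations(sorted(users), 2):
        shared = [s for s in votes if a in votes[s] and b in votes[s]]
        if shared:
            sim[(a, b)] = sum(votes[s][a] * votes[s][b] for s in shared) / len(shared)

    scores = {}
    for s, sv in votes.items():
        agreeing = [u for u, v in sv.items() if v == +1]
        score = 0.0
        for a, b in combinations(sorted(agreeing), 2):
            # Agreement between a pair is weighted by how much they normally
            # disagree: low or negative similarity => bigger weight.
            score += 1.0 - sim.get((a, b), 0.0)
        scores[s] = score
    return scores
```

With this weighting, a statement that two habitual opponents both endorse outranks one endorsed only by people who already agree on everything — which is exactly the "boost unlikely consensus" behavior described above.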
And how do they manipulate that?
Honestly, we did a great interview with her on our podcast that people can listen to. I think you should have her on.
Honestly, I would love to. But what is your podcast again? Tell people.
It's called Your Undivided Attention, and the interview is with Audrey Tang. And I think
this is one model of how you have,
you know, sort of digital media bolted onto the top of a democracy and have it work better,
as opposed to having it just degrade into nonsense and polarization and inability to agree. And that's what we need.
Taiwan is such a unique situation too, right?
Because China doesn't recognize them and there's a real threat that they're going to be invaded by
China. Correct. And so what's interesting about Taiwan is, we haven't talked about the
disinformation issues, but it's under, like you said, not just physical threat from China;
massive propaganda and disinformation campaigns are being run there, right? I'm sure.
And so what's amazing is that their digital media system is good at dealing with these
disinformation campaigns and conspiracy theories and other things, even in the face of a huge threat like China. But there's more binding energy in the country because they all know that
they're a tiny island and there's a looming threat from this big country. Whereas the United States,
we're not this tiny island with a looming threat elsewhere. In fact, many people don't know or
don't think that there's actually information warfare going on. I actually think it's really
important to point out to people that social media is one of our biggest national security risks, because while we're obsessed with protecting our physical borders and building walls and, you know, spending a trillion dollars redoing the nuclear fleet, we left the digital border wide open.
Like if Russia or China try to fly a plane into the United States, our Pentagon and billions of dollars of defense infrastructure from Raytheon and Boeing or whatever
will shoot that thing down
and it doesn't get in.
If they try to come into the country,
they'll get stopped
by the passport control system, ideally.
If they try to fly,
if Russia or China try to fly
an information bomb into the country,
instead of being met
by the Department of Defense,
they're met by a Facebook algorithm
with a white glove
that says exactly which zip code
you want to target.
Like it's the opposite of protection. So social media makes us more vulnerable. I think of it like, if you imagine like a bank that spent billions of dollars, you know, surrounding the
bank with physical bodyguards, right? Like just the buffest guys on every single corner, you
just totally secured the bank. But then you installed on the bank a computer system that
everyone interacts with. And no one changes the default password from, like, lowercase "password". Anyone
can hack in. That's what we do when we install Facebook in our society, or you install Facebook
in Ethiopia. Because if you think Russia or China, you know, or Iran or South Korea,
excuse me, North Korea, influencing our election is bad. Just keep in mind the like
dozens of countries throughout Africa, where we actually know, recently, there was a huge campaign
that the Stanford Cyber Policy Center did a report on of Russia targeting, I think something like
seven or eight major countries and disinformation campaigns running in those countries. Or the
Facebook whistleblower who came out about a month ago, Sophie Zhang, I think is her name,
saying that she personally had to step in to deal with disinformation campaigns in Honduras, Azerbaijan,
I think Greece or some other countries like that.
So the scale of what these technology companies are managing,
they're managing the information environments for all these countries,
but they don't have the resources to do it.
Not only that, they're not trained to do it.
They're not qualified to do it.
They're making up as they go along.
They're 20 to 30 to 40.
And they're way behind the curve.
When I had Renee DiResta on, she detailed all the issues
with the Internet Research Agency in Russia
and what they did during the 2016 campaign for both sides.
I mean, the idea is they just promoted Trump,
but they were basically sowing the seeds
of just the decline of the democracy.
They were trying to figure out how to create turmoil,
and they were doing it in this very bizarre,
calculated way where it was hard to see, like, what's the endgame here?
Well, the endgame is to have everybody fight.
Yeah.
I mean, that's really what the end game was.
And if I'm, you know, one of our major adversaries, you know, after World War II,
there was no ability to use kinetic, like, nukes or something on the bigger countries, right?
Like, that's all done.
So what's the best way to take down the biggest country on the
block? You use its own internal tensions against itself. This is what Sun Tzu would tell you to do.
And that's never been easier because of Facebook and because of these platforms being open to do
this manipulation. And if I'm looking now, we're four days away from the U.S. elections or something like that when this goes out.
Jesus Christ.
We have never been more destabilized as a country than we are now.
I mean, this is the most destabilized and polarized we have probably ever been, I would say.
Maybe people would argue the Civil War was worse, but in recent history, there is maximum incentive for foreign actors to drive up, again,
not one side or the other, but to drive us into conflict. So I would really, you know, I think
what we all need to do is recognize how much incentive there is to plant stories, to actually
sow physical violence on the streets. I think there was just a story, weren't we talking about
this this morning, that there was some kind of truck,
I think in Philadelphia or DC,
loaded with explosives or something like this.
There's such an incentive to try to,
you know, throw the agent provocateur,
like throw the first stone,
throw the first, you know, Molotov cocktail,
throw the first, you know,
make the first shot fired to drive up that conflict.
And I think we have to realize
how much that may be artificially motivated.
Very much so.
And the Renee DiResta podcast that I did,
where she went into depth about all the different ways that they did it,
and the most curious one being funny memes.
Yeah.
That there's so many of the memes that you read that you laugh at.
Yeah.
Well, it was just so weird.
They were humorous.
And she said she looked at probably 100,000 memes.
And the funny thing is you actually can agree with them, right?
Yeah, they're funny.
You would laugh at them.
It's like, oh, you know.
And they're being constructed by foreign agents
that are doing this to try to mock certain aspects
of our society and pit people against each other and
create a mockery. And, you know, back in 2016, there was very little collaboration
between our defense industry and CIA and DOD and people like that, and the tech platforms. And the
tech platforms said it's the government's job to deal with it if foreign actors are doing these things.
How do you stop something like the IRA?
Like, say, if they're creating memes in particular,
and they're funny memes.
Well, so one of the issues that Renee brings up,
and I'm just a huge fan of her and her work, is... As am I.
Yeah, is that if I'm, you know, China,
I don't need to invent some fake news story.
I just find someone in your society
who's already saying what I want you to be talking about. And I just like amplify them up. I take that dial and I just turn
it up to 10, right? So I find your Texas secessionists and like, oh, Texas secessionists,
that would be a good thing if I'm trying to rip the country apart. So I'm going to take those
Texas secessionists and the California secessionists, and I'm just going to dial them up to 10.
So those are the ones we hear from. Now, if you're trying to stop me in your Facebook and you're the integrity team or something, on what grounds are you trying to stop
me? Because it's your own people, your own free speech. I'm just the one amplifying the one I want
to be out there, right? And so that's what gets tricky about this is I think our moral concepts
that we hold so dear of free speech are inadequate in an attention economy that is hackable. And it's
really more about what's getting the attention rather than what are individuals saying or can't
say. And, you know, again, they've created this Frankenstein where they're making mostly automated
decisions about who's looking like what pattern behavior or coordinated inauthentic behavior here
or that, and they're shutting things down. I don't know if people know this:
Facebook shut down 2 billion fake accounts. I think this is a stat from a year ago.
They shut down 2 billion fake accounts. They have 3 billion active real users. Do you think that those 2 billion were, like, perfectly identified fake accounts, and they didn't miss any,
or didn't overreach and take some real accounts down with them? You know, our friend
Bret Weinstein, he just got taken down by Facebook.
I think you saw that.
That seemed calculated, though.
Facebook has shut down 5.4 billion fake accounts this year.
And that was in November 2019.
Oh, my God.
Oh, my God.
That is insane.
That's so many.
And so, again, it's the scale that these things are operating at.
And that's why, you know, when Bret got his thing taken down, I didn't like that.
But it's not like there's this vendetta against Bret, right?
Oh, I don't know about that.
That seemed to me to be a calculated thing because, you know, Eric actually tweeted about it saying that, you know, you could probably find the tweet because I retweeted it.
Like, basically, it was reviewed by a person.
So you're lying.
He's like, this is not something that was taken down by an algorithm.
He believes that it was because it was Unity 2020 platform, where they were trying to bring together conservatives and liberals and try to find some common ground and create, like, a third-party candidate that combines the best of both worlds.
I don't understand what policy his Unity 2020 thing was going up against.
Like, I have no idea
what they would take down.
It's going against the two-party system.
The idea is that it's taking away votes from Biden
and then it may help Trump win.
They banned him off Twitter as well.
You know that, too.
They blocked the account or something from him.
They banned the account.
They banned the entire account.
They banned the Unity 2020 account.
Yeah.
Unity.
Yeah.
I mean, literally, Unity.
They're like, nope, no Unity.
Fuck you. We want Biden.
Yeah. The political bias on social media is undeniable, and that's maybe the least of our
concerns in the long run, but it's a tremendous issue. And it also, for sure, sows the seeds
of discontent, and it creates more animosity and more conflict.
The interesting thing is
that if I'm one of our adversaries, I see that there is this view that people don't like the social media platforms, and I want them to open up more.
Like, let's say I'm Russia or China.
Right.
And I'm currently using Facebook and Twitter successfully to run information campaigns.
And then I want them, I can actually plant a story so that they end up shutting it down and shutting down conservatives or shutting down one side, which then forces the platforms to open up more so that either Russia or China can keep manipulating even more.
I understand. Yeah.
So right now, they want it to be a free for all where there's no moderation at all,
because that allows them to get in and they can weaponize the conversation against itself.
I don't see a way out of this, Tristan.
We have to all be aware of it.
But even if we are all aware of it,
it seems so pervasive.
Yeah.
Well, it's not just pervasive.
It's like I said,
we're 10 years into this hypnosis experiment.
This is the largest psychological experiment
we've ever run on humanity.
It's insane.
It is insane.
And it's also with tools that never existed before, evolutionarily.
So we're really not designed for the way these brightly lit metal
and glass devices interact with your brain.
They're so enthralling.
Right.
We've never had to resist anything like this before.
The things we've had to resist
are: don't go to the bar, you know, you have an alcohol problem; stop smoking cigarettes, it'll
give you cancer, right? We've never had a thing that does so much. Right. You can call your mom,
you can text a good friend, you can receive your news, you can get an amazing email
about this project you're working on, and it could suck up your time staring at butts.
And the things that are necessary for life, like text messaging or looking something up, are infused right next to all of the sort of corrupt stuff.
And if you're using it to order food, and if you're using it to get an Uber.
Right.
But imagine if we all wiped our phones of all the extractive business model stuff
and we only had the tools.
Well, have you thought about using a light phone?
Yeah, it's funny.
Those guys used to be brought up in my awareness more often.
For those who don't know, it's like a minimalist phone.
One of the guys on the documentary
is one of the creators of it, right?
No, I think you're thinking of Tim Kendall,
he's the guy who brought in
Facebook's business model of advertising,
and he runs a company now called Moment
that shows you the number of hours you spend
in different apps and helps you use it less.
I thought someone involved in the documentary
was also a part of the Light Phone team.
No, no, not officially.
No, I don't think so.
But the Light Phone is
basically a thin, black-and-white phone that does calls and texts. And I think it plays
music now, which, I was like, oh, that's a mistake, right? Like, that's a slippery slope. That's the
thing, and we have to all be comfortable with losing access to things that we might love, right?
Like, oh, maybe you do want to take notes this time, but you don't have your full keyboard to do that.
And are you willing to do that? I think one thing people can do is just take, like, a digital Sabbath: one day a week off completely.
Because imagine if you got several hundred million people to do that.
That drops the revenue of these companies by like 15%.
Because that's one out of seven days that you're not on the system, so long as you don't rebalance and use it more on the other days.
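The arithmetic behind that claim is easy to check. Here's a back-of-the-envelope sketch, under the simplifying assumptions stated in the conversation: ad revenue scales linearly with time on the platform, and participants don't rebound on the other six days. The function name and numbers are illustrative only:

```python
def sabbath_revenue_drop(participating_share, days_off=1, days_per_week=7):
    """Fraction of total ad revenue lost if `participating_share` of users
    go fully offline `days_off` days per week, assuming revenue is
    proportional to time on the platform and no rebound usage."""
    return participating_share * days_off / days_per_week

# If everyone took one day off, the drop is 1/7, about 14.3 percent,
# which is close to the roughly 15 percent figure mentioned.
full = sabbath_revenue_drop(1.0)
# With only a fraction participating, the drop scales down proportionally.
half = sabbath_revenue_drop(0.5)
```

Note the "several hundred million people" framing matters: on a platform with ~3 billion users, a few hundred million participants would cut revenue by only a percent or two under these assumptions, so the ~15% figure is the ceiling where everyone joins in.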
I'm inclined to think that Apple's solution is really the way out of this.
That you opt out of all sharing of your information,
and if they could come up
with some sort of a social media platform
that kept that as an ethic.
Yeah.
I mean, it might allow us to communicate with each other,
but stop all this algorithm nonsense.
And look, if anybody has the power to do it,
they have so much goddamn money. Totally. Well, and also they're like the, you know,
people talk about, you know, the government regulating these platforms, but Apple is kind
of the government that can regulate the attention economy. Because when they do this thing we talked
about earlier of saying, do you want to be tracked? Right. And they give you this option
when like 99% of people are going to say, no, I don't want to be tracked. Right. When they do that, they
just put a 30% tax on all the advertising-based businesses. Because now
you don't get as personalized an ad, which means they make less money, which
means that business model is less attractive to venture capitalists to
fund the next thing. So they're actually enacting a kind of
carbon tax, but on the polluting stuff, right? They're enacting
a kind of tax on the social media pollution; they're taxing it by 30%. But they could do more than that.
Like imagine, you know, they have this 30-70 split where app developers get 70% of the revenue
when you buy stuff and Apple keeps 30%. They could modify that percentage based on how much
sort of social value that those things are delivering to society.
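As a thought experiment, the sliding split being imagined here could look something like the following. This is purely hypothetical — not any real App Store policy — and the value score, function name, and numbers are all invented for illustration (how you'd actually measure "social value" is the hard, unsolved part):

```python
def revenue_split(gross, value_score, base_platform_cut=0.30):
    """Hypothetical sliding App Store split: start from the standard 70/30
    division, and shrink the platform's cut as an app's (somehow-measured)
    social-value score rises. value_score is in [0, 1]; all numbers here
    are illustrative, not Apple's actual policy.
    """
    platform_cut = base_platform_cut * (1.0 - value_score)
    platform = gross * platform_cut
    developer = gross - platform
    return developer, platform

# A pure attention-extraction app (score 0) keeps the classic 70/30 split:
dev, plat = revenue_split(100.0, value_score=0.0)   # dev ~ 70, plat ~ 30
# A maximally "humane" app (score 1) would keep everything in this sketch:
dev2, plat2 = revenue_split(100.0, value_score=1.0)  # dev2 ~ 100, plat2 ~ 0
```

The design point is just that the fee schedule itself becomes the incentive lever: the more an app helps people, the more of each dollar its developer keeps.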
So this gets a little bit weird.
People may not like this, but if you think about who's the real customer that we want to be,
like how do we want things oriented?
How should we, if I'm an app developer, I want to make money the more I'm helping society
and helping individuals, not how much I'm extracting and stealing their time and attention.
And imagine that governments in the future actually paid some kind of budget into,
let's say, the App Store. There's antitrust issues with this, but you pay money into the App Store,
and then as apps started helping people with more social outcomes, like, let's say, learning programs
or schools or things like Khan Academy, more money flows in the direction
of where people got that value. That revenue split between Apple and the app
developers ends up going more to things that end up helping people, as opposed to things that were
just good at capturing attention and monetizing zombie behavior. One of my favorite lines in the
film is Justin Rosenstein from the Like button saying that, you know, so long as a whale is worth more dead than alive, and a tree is worth more as
lumber and two-by-fours than as a living tree, now we're the whale, we're the tree. We're worth more
when we have predictable zombie like behaviors, when we're more addicted, distracted, outraged,
polarized and disinformed than if we're a living thriving citizen or a growing child that's like playing with their
friends. And I think that that kind of distinction that just like we protect national parks or we
protect, you know, certain fisheries and we don't kill the whales in those areas or something,
we need to really protect, like, we have to call out what's sacred to us now.
Yeah, it's an excellent message. The problem I see is that I just don't know how well that message is going to be absorbed by the people that are already in the trance.
I think it's so difficult for people to put things down.
I was telling you how difficult it is for me to tell my friends, don't read the comments.
It's hard to have that kind of discipline,
because people do get bored. And when they get bored,
like if you're waiting in line for somewhere,
you pull out your phone,
you're at the doctor's office,
you pull out your phone.
Totally. I mean, and I do that too, right?
This is incredibly hard.
Back in the day when I was at Google, I tried to change it from the inside for two years before leaving.
What was it like there?
Please share your experiences because when you said you tried to change it from the inside,
what kind of resistance were you met with
and what was their reaction to these thoughts that you had
about the unbelievable negative consequences?
Well, this is in 2013, so we didn't know about all the negative consequences.
But you saw the writing on the wall, at least some of it.
Some of it, yeah. I mean, the notion that things were competing for attention,
which would mean that they would need to compete to get more and more persuasive
and hack more and more of our vulnerabilities, and that that would grow,
that was the core insight.
I didn't know that it would lead to polarization or conspiracy-theory
recommendations, but I did know, you know, more addiction, kids having, you know,
weaker relationships. When did it occur to you? Like, what were your initial feelings?
I was on a hiking trip in the Santa Cruz mountains with our co-founder now, Aza Raskin.
Funny enough, Aza's dad was Jef Raskin, who started the
Macintosh project at Apple, I don't know if you know the history there,
and actually came up with the word humane to describe the humane interface. And that's where
our name and our work comes from, is from his father's work. He and I were in the mountains
in Santa Cruz and just experiencing nature and just came back and
realized like this, all of this stuff that we've built is just distracting us from the stuff that's
really important. And that's when coming back from that trip, I made the first Google deck
that then spread virally throughout the company saying never before in history have, you know,
50 designers, you know, white 20-to-35-year-old engineers who look like me, held
the collective psyche of humanity. And then that presentation was released. And about, you know,
10,000 people at Google saw it. It was actually the number one meme within the company. They have
this internal thing inside of Google called Moma, where people can post, like, GIFs and memes
about various topics. And it was the number one meme that, hey, we need to talk about this at this week's TGIF,
which is the like weekly, thank God it's Friday type company meeting. It didn't get talked about,
but I got emails from across the company saying, we definitely need to do something about this.
It was just very hard to get momentum on it. And really the key interfaces to change within Google are Chrome
and Android, because those are the neutral portals into which you're then using apps and
notifications and websites and all of that. Like those are the kind of governments of the attention
economy that Google runs. And when you worked there, did you have to use Android? Was it part
of the requirement to work there?
No, I mean, a lot of people had Android phones.
I still used an iPhone.
Was it an issue?
No.
No, I mean, people, because they realized that they needed products to work on all the phones.
I mean, if you work directly on Android, then you would have to use an Android phone.
But we tried to get some of those things like the screen time features that are now launched.
Everyone now has on their phone, it shows you the number of hours or whatever.
Is that on Android as well?
It is, yeah.
And actually that came, I think, as a result of this advocacy.
And that's shipping on a billion phones, which shows you, you can change this stuff, right?
Like that goes against their financial interest.
People spending less time on their phones, getting less notifications.
It sort of does, but it doesn't work.
Well, correct.
So it doesn't actually work, is the thing.
Yeah.
And let's separate the intention, and the fact that they did it, from the behavioral effect.
It's like labels on cigarettes that tell you it's going to give you cancer.
Like by the time you're buying them, you're already hooked.
Correct.
I mean, it's even worse.
Imagine like every cigarette box had like a little pencil inside so you can mark it.
There were little streaks that said the number of days in a row you haven't smoked, and you could like mark each day. It's like, it's too late,
right? Right. Like, yeah, it's just the wrong paradigm. The fundamental thing we have
to change is the incentives and how money flows, because we want money flowing in the direction
of the more these things help us. Let me give you a concrete example. Let's say you want
to learn a musical instrument, and you go to YouTube to
pick up ukulele or whatever, and you're seeing how to play the ukulele. From that point,
in a system that was designed in a humane and sort of time well spent kind of way,
it would really ask you, instead of saying, here's 20 more videos that are gonna just like
suck you down a rabbit hole, it would sort of be more oriented towards what do you really need help
with? Like, do you need to buy ukulele? Here's a link to Amazon to get the ukulele. Are you looking for a
ukulele teacher? Let me do a quick scan on your Facebook or Twitter search to find out which of
those people are ukulele teachers. Do you need instant like tutoring? Because there's actually
this service you never heard of called Skillshare or something like that, where you can get instant
ukulele tutoring. And if we're really designing these things to be about what would most
help you next, you know, we're only as good as the menu of choices on life's menu. And right now,
the menu is, here's something else to addict you and keep you hooked instead of here's a next step
that would actually be on the trajectory of helping people live their lives better.
But you would have to incentivize the companies, because, like, there's so much incentive on getting
you addicted, because there's so much financial reward. What would be the financial reward they could have to
get you to something that would be helpful for you, like lessons or this?
I mean, one way that could work is, let's say, people pay a monthly subscription of, like, I don't know, 20 bucks a month
or something.
So it's never gonna work.
I get you, but like, let's say you put money into a pot.
But then we have the problem. The problem is, it costs some money versus free. Like, there
was a company, that still exists for now, that was trying to do the Netflix of
podcasting. And they approached us, and they're like, we're just gonna get all these people
together, and they're gonna make people pay to use your podcast. I'm like, why would they do
that when podcasts are free? Yeah. Like, that's one of the reasons why podcasts work
is because they're free, right? When things are free, they're attractive. It's easy. When
things cost money, you have to have something that's extraordinary, like Netflix. Yeah. Like, when
you say the Netflix of podcasting, well, Netflix makes their own shows, right? They spend millions
of dollars on special effects and
all these different things, and they're really, like, enormous projects. You're just
talking about people talking shit, and you want money? Right.
Well, that's the thing, is we have to actually deliver something that's totally, qualitatively better.
And it would also have to be, like, someone like you, or someone who's really aware of the issues that we're dealing with with addictions to social media, would have to say this is the best possible alternative.
Like in this environment, yes, you are paying a certain amount of money per month, but maybe
that could get factored into your cell phone bill and maybe with this sort of an ecosystem, you're no longer being drawn in by your addictions and it's not playing for your attention span.
It's rewarding you in a very productive way.
And imagine, Joe, if like 15% more of your time was just way better spent.
Like you were actually doing the things you cared about.
And it actually helped improve your life.
Yeah, like imagine when you use email if it was truly designed
I mean, forget email; many people don't relate to that because email isn't that popular. But
whatever it is that's a huge time sink for you. For me, email is a huge one. You know, web
browsing or whatever is a big one. Imagine that those things were so much better designed that
I actually wrote back to the right emails and mostly didn't think about the rest. That when I
was spending time on, you know,
whatever I was spending time on,
more and more of my life
was a life well lived and time well spent.
That's like the retrospective view.
I keep going to Apple,
because I think they're the only social media company,
or excuse me, the only technology company
that does have these ethics to sort of protect privacy.
Have you thought about coming to them?
Yep.
Have you?
Well, I mean, I think that they've made great first steps and they were the first along with
Google to do those, the screen time management stuff. But that was just this barely scratching
the surface, like baby, baby, baby steps. Like what we really need them to do is radically
reimagine how those incentives and how the phone fundamentally works.
So it's not just all these colorful icons.
And one of the problems, they do have a disincentive, which is a lot of their revenue comes from gaming.
And as they move more into Apple TV, competing with HBO and Hulu and Netflix and that whole thing, they need subscriptions.
So Apple's revenue on devices and hardware is sort of maxing out,
and where they're going to get their next bout of revenue to keep their stock price up is on these subscriptions.
I'm less concerned with those addictions. I'm less concerned with gaming addictions than I am with information addictions,
because at least it's not fundamentally altering your view of the world.
Right, it's not blowing up democracy, making it impossible to agree.
And this is coming from a person that's had, like, legitimate video game addictions in the past. Yeah.
But, uh, like my wife is addicted to Subway Surfers. I don't know, what is it? That's a crazy game. It's like you're riding on the top of subways, you're jumping around. It's really ridiculous, but it's fun. Like, you watch it, like, whoa. I don't fuck with video games, but I watch it, and those games at least are enjoyable.
There's something silly about it.
Like, ah, fuck.
And then you start doing it again.
When I see people getting angry about things on social media,
I don't see the upside.
I don't mind them making a profit off games.
There is an issue, though, with games that addict children.
And then these children,
there's like, you could spend money on like Roblox, and you can have all these different
things you spend money on, you wind up having these enormous bills.
Add-on bills, yeah.
You leave your kid with an iPad, and you come back, you have a $500 bill.
Like, what did you do?
Yeah.
This is an issue, for sure, but at least it's not an issue in that it's changing their view of the world.
Right.
And I feel like there's a way for, I keep going back to Apple, but a company like Apple to rethink the way... you know, they already have a walled garden, right, with iMessage and FaceTime and all these different things.
They can totally build those things out.
I mean, iMessage and iCloud could be the basis for some new, neutral social media that's not based on instant social approval and rewards, right?
Yes.
They can make it easier to share information with small groups of friends and have that all synced.
And even, you know, in the pre-COVID days, I was thinking about Apple a lot.
I think you're right, by the way, to really poke on them.
I think they're the one company that's in a position to lead on this.
And they also have a history of thinking along those lines. You know, they had this feature that's kind of hidden now, the Find My Friends, right? They call it Find My now.
It's all buried together so you can find your devices and find your friends. But in a pre-COVID world, imagine they really built out the, you know, where are my friends right now, and made it easier to know when you're nearby someone, so you can more easily get together in person. Because right now, to the extent Facebook wants to bring people closer together, they don't want to, and again this is pre-COVID, but they don't want to incentivize lots and lots of Facebook events. They really care about groups that keep people posting and online and looking at ads. Of the category of bringing people closer together, they want to do the online, screen-time-based version of that as opposed to the offline.
Apple, by contrast, if you had little iMessage groups of friends, you could say, hey, does everyone in this little group want to opt into being able to see where each other are, where we all are on, say, weekdays between 5 and 8 p.m., or something like that.
So you could, like, time-bound it and make it easier for serendipitous connection and availability to happen. That's hard to do. It's hard to design that. But there's things like that
that Apple's in a position to do, if it really took on that mantle. And I think as people get
more and more skeptical of these other products, they're in a better and better position to do
that. One of the antitrust issues is, do we want a world where our entire well-being as a society depends on what one massive corporation, worth over a trillion dollars, does or doesn't do?
Right.
Like we need more openness to try different things.
And we're really at the behest of whether one or two companies, Apple or Google, does something more radical.
And there has to be some massive incentive for them to do something that's really going to change the way we interface with these devices and the way we interface with social media. And I don't know what incentive exists that's more potent than
financial incentives.
Well, and this is where, you know, the government comes in. In the same way that we want to transition long-term from a fossil-fuels-oriented economy to something that changes those pollution levels, we have a hugely emitting, you know, society-ruining kind of business model in this attention-extractive paradigm. And long-term, sort of like a progressive tax on that, the government could help us transition to some other thing, right? That's not, who do we censor? It's, how do we disincentivize these businesses and make them pay for the sort of life-support systems of society that they've ruined?
A good example of this, I think it's Australia, that's regulated that Google and Facebook have to pay the publishers who they're basically hollowing out. Because one of the effects we've not talked about is the way that Google and Facebook have hollowed out the fourth estate in journalism. I mean, journalism has turned into... local news websites can't make any money except by basically producing clickbait. So even to the extent that local newspapers exist, they only exist by basically clickbaitification, by lower- and lower-paid, you know, workers who are just generating content farms, right? So you could get those companies to pay to revitalize the fourth estate, and to make sure we have a very sustainably funded fourth estate that doesn't have to produce this clickbait stuff. That's another direction.
Yeah, that's interesting that they have to pay.
I mean, these are the wealthiest companies in the history of humanity. So that's the thing.
So we shouldn't be cautious about how much they should have to pay.
Except we also don't want it
to happen on the other end, right?
You don't want to have a world
where, you know,
we have Roundup
making a crazy amount of money
from giving everybody cancer
and lymphoma
from, you know,
the chemicals.
Glyphosate.
Right, glyphosates.
And then they pay everybody
on the other end
after a lawsuit of a billion dollars,
but now everyone's got cancer.
Let's actually do it in a way
so we don't want a world where Facebook and Google
profit off of the erosion of our social fabric,
and then they pay us back for it.
How do you quantify how much money they
have to pay to journalism?
It seems like it's almost a form of socialism.
Yeah.
I mean, this is where the IQ and lead example is interesting, because they were able to disincentivize and tax the lead producers, because they were able to produce some results on how much lead lowered the wage-earning potential of the entire population. I mean, like, how much does this cost our society? We used to say free is the most expensive business model we've ever created, because we get the free downgrading of our attention spans, our mental health, our kids, like, our ability to agree with each other, our capacity to do anything as a democracy. Like, yeah, we got all that for free. Wonderful. Obviously we get lots of benefits, and I want to acknowledge that, but that's just not sustainable.
The real question... I mean, right now we have huge existential problems. We have a global competition, power competition, going on.
I think China just passed the GDP of the US, I believe.
There is, you know, if we care about the US
having a future in which it can lead the world
in some meaningful and enlightened way,
we have to deal with this problem.
And we have to have a world where digital democracy
out-competes digital authoritarianism, which is the China model. And right now that model builds more coherence and is more efficient, and doesn't devolve the way that our current system, you know, does. I think Taiwan, Estonia, and countries like that, where they are doing digital democracies, are good examples that we can learn from. But we are behind right now.
Well, China also has a really fascinating situation with Huawei, where Google has banned Huawei, so you can't have Google applications
on Huawei. So now Huawei is creating their own operating system. And they have their own ecosystem
now that they're building up. And it's, you know, it's weird that there's only a few different operating systems now. I mean, there's a very small amount of people using Linux phones. Yeah. Then you have a large amount of people using Android and iPhones. And if China becomes the first to adopt their own operating system, then they have even more unchecked rules and regulations in regards to, like, the influence that they have over their people, with an operating system that they've developed and they control, and who knows what kind of back doors and spying.
Tons, yeah.
It's weird. When you see this, do you... like, it feels so futile for me on the outside looking in,
but you're working on this.
How long do you anticipate this is going to be a part of your life?
I mean, what does it feel like to you?
I mean, it's not easy, right?
The film ends with this question,
do you think we're going to get there?
Yeah.
I just say we have to.
Like, I mean, if you care about this going well,
I wake up every day and I ask,
what will it take for this whole thing to go well?
And how do we just orient each of our choices
as much as possible towards this going well?
And we have a whole bunch of problems.
I do look a lot at the environmental issues,
the permafrost, methane bombs.
The timelines that we have to deal with certain problems
are crunching, and we also have certain dangerous
exponential technologies that are emerging,
decentralization of CRISPR.
There's a lot of existential threats.
I hang out a lot with the sort of existential threats community. It's going to take...
Must be a lot of fun.
It's, uh, there's a lot of psychological problems in that community, actually. A lot of depression.
I'd imagine some suicide as well.
It's, it's, uh, you know, it's hard. But I think we
each have a responsibility when you see this stuff to say,
what will it take for this to go well?
And I will say that really seeing the film
impact people the way that it has,
I used to feel like, oh my God,
how are we ever going to do this?
No one cares.
Like, no one even knows.
At the very least, we now have about 40 to 50 million people
who are at least introduced to the problem. The question is, how do we harness them into a collective movement, and that's what we're trying to do next.
I mean, I'll say also, these issues get more and more weird over time. My co-founder, Aza Raskin, will say that it's making reality more and more virtual over time, because we haven't talked about how, as technology advances at hacking our weaknesses, we start to prefer it over the real thing.
For example, there's a recent company, VC-funded, raised... I think it's worth, like, over $125 million. And what they make are virtual influencers.
So these are like virtual people,
virtual video that is more entertaining, more interesting, and that fans like more than real people.
Oh, boy.
And it's kind of related to the kind of deepfake world, right?
Where like people prefer this to the real thing.
And Sherry Turkle, you know, who's been working at MIT, wrote the book Reclaiming Conversation and Alone Together.
She's been talking about this forever, that over time, humans will prefer connection to robots and bots and the computer generated thing more than the
real thing. Think about AI-generated music. It'll start to sweeten our taste buds and give us exactly that thing we're looking for, better than we know ourselves. Just like
YouTube can give us the perfect next video that actually every bone in our body will say, actually,
I kind of do want to watch that, even though it's a machine pointed at my brain
calculating the next thing.
There's an example from Microsoft
writing this chatbot called Xiaoice,
I can't pronounce it,
that after nine weeks,
people preferred that chatbot to their real friends.
And 10 to 25% of their users actually said,
I love you to the chatbot.
Oh, boy.
And there are several who actually said
that it convinced them not to commit suicide, to have this relationship with this chatbot.
So it's Her. It's the movie Her.
Exactly. Which is... so all these things are the same, right? We're veering into a direction where technology is so good at meeting these underlying paleolithic emotions that we have. The way out of it is, we have to see that this is what's going on. We have to see and reckon with ourselves, saying, this is how I work. I have this negativity bias. If I get those comments and 99 are positive and one's negative, my mind is going to go to the negative. If I don't see that, I'm hijacked.
I see you in the future wearing an overcoat.
You are literally Laurence Fishburne in The Matrix, trying to tell people to wake up.
Well, there's a line in The Social Dilemma where I say, how do you wake up from the matrix if you don't know you're in the matrix?
Well, that is the issue, right?
And even in The Matrix, we at least had a shared matrix. The problem now is that each of us has our own matrix. That's the real kicker.
I struggle with the idea that this is all inevitable, because this is a natural course of progression with technology,
and that it's sort of figuring out the best way
to have us, with as little resistance,
embed ourselves into its system
and that our ideas or what we are with emotions
and with our biological issues,
that this is just how life is
and this is how life always should be.
But this is just all we've ever known.
It's all we've ever known.
Einstein didn't write into the laws of physics
that social media has to exist for humanity, right?
We've gotten rid, again,
the environmental movement is a really interesting example
because we passed all sorts of laws. We got rid of lead, we've changed, you know, some of our pesticides. Um, you know, we're slow on some of these things, with corporate interest and the asymmetric power of large corporations. And I want to say markets and capitalism are great, but when you have asymmetric power for predatory systems that cause harm, they're not going to terminate themselves. They have to be bound in by the public, by culture, by the state.
And we just have to point to the examples where we've done that.
And in this case, I think the problem is that how much of our stock market is built on the
back of like five companies generating a huge amount of wealth.
So this is similar.
I don't mean to make this example, but there's a great
book by Adam Hochschild called Bury the Chains, which is about the British abolition of slavery,
in which he talks about how for the British Empire, like if you think about it, when we
collectively wake up and say, this is an abhorrent practice that has to end. But then at that time in the 1700s, 1800s
in Britain, slavery was what powered the entire economy. It was free labor for a huge percentage
of the economy. So if you say, we can't do this anymore, we have to stop this. How do you decouple
when your entire economy is based on slavery? And the book is actually inspiring because it tracks
a collective movement that was
through networked all these different groups, the Quakers in the US, the people testifying before
parliament, the former slaves who did firsthand accounts, the graphics and art of all the people
had never seen what it looked like on a slave ship. And so by making the invisible visceral
and showing just how abhorrent this stuff was, through a period of about 60 to 70 years, the British Empire had to drop their GDP by 2% every year for 60 years, and was willing to do that, to get off of slavery.
Now, I'm not making a moral equivalence.
I want to be really clear for everybody taking things out of context.
But just that it's possible for us to do something that isn't just in the interest
of economic growth.
And I think that's the real challenge.
That's actually something that should be on the agenda, which is how do we, one of the
major tensions is economic growth, you know, being in conflict with dealing with some,
with many of our problems, whether it's some of the environmental issues or, you know,
with some of the technology issues we're talking about right now.
Artificial intelligence is something that people are terrified of as an existential threat.
They think of it as one day you're going to turn something on and it's going to be sentient
and it's going to be able to create other forms of artificial intelligence
that are exponentially more powerful than the one that we created
and that will have unleashed this beast that we cannot control.
What my concern is with all of this...
We've already gotten there. Yeah, that's my concern. My concern is that this is a slow
acceptance of drowning. It's like a slow, oh, we're okay, I'm only up to my knees, oh,
it's fine, it's just my waist high, I can still walk around... It's very much like the frog in boiling water, right?
Exactly, exactly. It seems like...
This is like humans have to fight back to reclaim our autonomy and free will from the machines.
I mean, one clear...
Okay, Neo.
It's very much the matrix.
It's Neo.
And one of my favorite lines is actually
when the Oracle says to Neo,
and don't worry about the vase.
And he says, what vase?
And he knocks it over.
And she says, that vase.
And so it's like, she's the AI who sees so many moves ahead on the chessboard, she can say something which will cause him to do the thing that verifies the thing that she predicted would happen.
Yeah.
That's what AI is doing now, except it's pointed at our nervous system, figuring out the perfect thing to dangle in front of our dopamine system and get the thing to happen. Which, instead of knocking over the vase, is to be outraged at the other political side and be fully certain that you're right, even though it's just a machine that's calculating shit that's going to make you, you know, do the thing.
When you're concerned about this, how much time do you spend thinking about simulation theory?
The simulation?
Yeah. The idea that it, if not currently, one day there'll be a simulation that's indiscernible
from regular reality. And it seems we're on that path.
I don't know if you mess around with VR at all.
Well, this is the point about, you know, the virtual chatbots out-competing the real thing.
Yeah, exactly.
And the technology, you know.
I mean, that's what's happening is that reality is getting more and more virtual.
Right.
Because we interact with a virtual news system that's all this sort of clickbait economy outrage machine that's already a virtual political environment that then translates into real world action and then becomes real.
And that's the weird feedback loop.
Go back to 1990, whatever it was, when the Internet became mainstream, or at least started becoming mainstream.
And then the small amount of time that it took, the 20 plus years to get to where we are now.
And then think about the virtual world. Once this becomes something that has the same sort of rate of growth that the internet has experienced, or that we have experienced through the internet, I mean, we're looking at, like, 20 years from now being unrecognizable.
Yeah.
We're looking at... I mean, it almost seems like that is what life does, the same way bees create beehives. You know, a caterpillar doesn't know what the fuck's going on when it gets into that cocoon, but it's becoming a butterfly. We seem to be a thing that creates newer and better objects.
Correct, more effective. But we have to realize AI is not conscious
and won't be conscious the way we are.
And so many people think that...
But is consciousness essential?
I think so.
To us?
I don't know.
Essential in the sense that we're the only ones who have it?
No, I don't know that.
Well, no, there might be more things that have consciousness,
but is it essential?
I mean, to the extent that choice exists, it would exist through some kind of consciousness.
And is choice essential?
It's essential to us as we know it, like, as life as we know it.
But my worry is that we're inessential. That we're, like... we're thinking now like single-celled organisms being like, hey, I don't want to gang up with a bunch of other people and become an object that can walk. I like being a single-celled organism. This is a lot of fun.
I mean, I hear you saying, you know, are we a bootloader for the AI that then runs the world? That's Elon's perspective, yeah.
I mean, I think this is a really dangerous way to think.
Dangerous for us, yeah. But what if
the next version of what life is
is better? But the next version being run by machines
that have no values, that don't care, that don't
have choice, and are just maximizing for things
that were programmed in by our little miniature brains
anyway. But they don't cry,
they don't commit suicide. But then consciousness
in life dies. That could be
the future. I think this is
the last chance to try to snap out of that.
Is it important in the eyes of the universe
that we do that?
I don't know.
It feels important.
How does it feel to you?
It feels important,
but I'm a monkey.
You know, the monkey's like,
God, I'm staying in this tree, man.
You guys are out of your fucking mind.
I mean, this is the weird paradox of being human
is that, again,
we have these lower level emotions.
We care about social approval.
We can't not care. At the same time, like I said, there's this weird proposition here. We're the only species that, if this were to happen to us, we would have the self-awareness to even know that it was happening, right? Like, in this two-hour interview, we can conceptualize that this thing has happened to us, right? That we have built this matrix, this external object, which has, like, AI and supercomputers and voodoo-doll versions of each of us. And it has perfectly figured out how to
predictably move each of us in this matrix. Let me propose this to you. We are what we are now,
human beings, homo sapiens in 2020. We are this thing that if you believe in evolution,
I'm pretty sure you do. We've evolved over the course of millions of years to become who we are right now.
Should we stop right here?
Are we done?
No, right?
We should keep evolving.
What does that look like?
What does it look like if we go ahead?
Just forget about social media.
What would you like us to be in a thousand years or a hundred thousand
years or 500,000 years? You certainly wouldn't want us to be what we are right now, right?
No one would. No, I mean, I think this is what visions of Star Trek and things like that were
trying to ask, right? Like, hey, let's imagine humans do make it and we become the most enlightened
we can be. And we actually somehow make peace with these other alien tribes
and we figure out space travel and all of that.
I mean, actually a good heuristic
that I think people can ask is,
on an enlightened planet where we did figure this out,
what would that have looked like?
Isn't it always weird that those movies,
it's people are just people,
but they're in some weird future,
but they haven't really changed that much
Right. I mean, which is to say that the fundamental way that we work is just unchanging, but there are such things as more wise societies, more sustainable societies, more peaceful or harmonious societies.
But, I mean, you know, ultimately, biologically, we have to evolve as well. But our version of, like, the best version of that is probably the gray aliens.
Right?
Maybe so.
That's the ultimate future.
I mean, we're going to get into gene editing and becoming more perfect, perfect in the sense of, you know, that. But we are going to start optimizing for what are the outcomes that we value.
I think the question is,
how do we actually come up with brand new values that are wiser than we've ever thought of before,
that actually are able to transcend the win-lose games
that lead to omni-lose-lose,
that everyone loses
if we keep playing the win-lose game
at greater and greater scales.
I, like you, have a vested interest
in the biological existence of human beings.
I think people are pretty cool.
I love being around them.
I enjoyed talking to you today.
Me too.
My fear is that we're a Model T.
Right.
You know, and there's no sense in making those fucking things anymore.
The brakes are terrible.
They smell like shit when you drive them.
They don't go very fast.
We need a better version.
You know,
the funny thing is,
God,
there's some quote by someone, I think... I wish I could remember it. It's something about how much would be solved if we were at peace with ourselves. Like, if we were able to just be okay with nothing, like, just being okay with living and breathing. I don't mean to be, you know, playing the woo new-age card. I just genuinely mean,
how much of our lives is just running away from, you know,
anxiety and discomfort and aversion.
It is, but, you know, in that sense,
some of the most satisfied and happy people
are people that live a subsistence living,
that have these subsistence existences
in the middle of nowhere,
just chopping trees and catching fish.
Right, and more connection, probably,
that's authentic than something else.
And I think that's what this is really about.
It resonates biologically too
because of the history of human beings living like that.
It's just so much longer and greater.
Totally, and I think that those are more sustainable societies.
We can never obtain peace in the outer world
until we make peace with ourselves.
Dalai Lama, yeah, but I don't buy that guy.
You know, that guy, he's an interesting case.
I was thinking there was a slightly different quote,
but actually there's one quote that I would love to,
if it's possible.
But one of the reasons why I don't buy him,
he's just chosen.
They just chose that guy.
Yeah.
Also, he doesn't have sex.
How much can you be enjoying life
if that's not a part of it?
Come on, bro.
You wear the same outfit every day?
Get the fuck out of here with your orange robes.
There's a really important quote that I think would really be good to share.
It's from the book.
Have you read Amusing Ourselves to Death by Neil Postman?
No.
From 1982?
No.
Especially when we get into big tech and we talk about censorship a lot
and we talk about Orwell,
he has this really wonderful opening to this book.
It was written in 1982.
It literally predicts everything that's going on now.
I frankly think that I'm adding nothing
and it's really just Neil Postman called it all in 1982.
He had this great opening.
It says, let's see... We were all looking out for, you know, 1984,
when the year came and the prophecy didn't, thoughtful Americans sang softly in praise of
themselves. The roots of liberal democracy had held. This is like we made it through the 1984
gap. Wherever else the terror had happened, we at least had not been visited by Orwellian nightmares.
But we had forgotten that
alongside Orwell's dark vision, there was another slightly older, slightly less well-known, equally
chilling vision of Aldous Huxley's brave new world. Contrary to common belief, even among the educated,
Huxley and Orwell did not prophecy the same thing. Orwell warns that we will become overcome
by an externally imposed oppression.
But in Huxley's vision, no big brother is required to deprive people of their autonomy,
maturity, or history. As he saw it, people will come to love their oppression, to adore the
technologies that undo their capacities to think. What Orwell feared were those who would ban books.
What Huxley feared was that there would be no reason to ban a book,
for there would be no one who wanted to read one.
Orwell feared those who would deprive us of information.
Huxley feared those who would give us so much
that we would be reduced to passivity and egoism.
Orwell feared the truth would be concealed from us.
Huxley feared the truth would be drowned in a sea of irrelevance. Orwell feared
we would become a captive culture, but Huxley feared we would become a trivial culture, preoccupied
with some equivalent of the feelies and the orgy-porgy and the centrifugal bumblepuppy. Don't
know what that means. As Huxley remarked in Brave New World Revisited, the civil libertarians and
rationalists who are ever on the alert to
oppose tyranny failed to take into account man's almost infinite appetite for distractions.
Lastly, in 1984, Orwell added, people are controlled by inflicting pain. In Brave New
World, they are controlled by inflicting pleasure. In short, Orwell feared that what we fear will ruin us.
Huxley feared that what we desire will ruin us.
Holy shit.
Isn't that good?
That's the best way to end this.
God damn.
But again, if we can become aware that this is what's happened,
we're the only species with the capacity
to see that our own psychology,
our own emotions,
our own paleolithic evolutionary system has been hijacked.
I like that you're optimistic.
We have to be. If we want to remain people, we have to be.
Optimism is probably the only way to live in a meat-suit body and keep going. Otherwise...
It certainly helps.
Yeah, it certainly helps. Thank you very much for being here, man. I really enjoyed this, even though I'm really depressed now.
I really don't want you to be depressed.
I'm kidding.
I really hope people, you know... we really want to build a movement, and, uh, I wish I could give people more resources. We do have a podcast called Your Undivided Attention, and we're trying to build a movement at humanetech.com.
Well, listen, any new revelations or new developments that you have, I'd be more than happy to have you on again. We'll talk about them. And send them to me and I'll put them on social media, and whatever you need.
Awesome.
I'm here to help.
Awesome, man. Great to be here.
Resist.
Resist. We're in this together, humanity.
Resist, humanity. We're in this together.
Thank you, Tristan. I really, really appreciate it. Goodbye, everybody.