The Megyn Kelly Show - Tristan Harris on Secrets of Social Media Algorithms, Tech Manipulation and Addiction, and How Our Feeds Divide Us | Ep. 244
Episode Date: January 20, 2022
Megyn Kelly is joined by Tristan Harris, president and co-founder of the Center for Humane Technology and co-host of the "Your Undivided Attention" podcast, to talk about the secrets of social media algorithms, how technology is designed to make us addicted, how technology and our feeds divide our society and further stereotypes, cultural reliance on phones, the dangers of tech for our kids, solutions to the problem of technology, attention manipulation, how tech is a danger to our democracy, the "amplifaganda" of China, how our adversaries and tech companies are trying to harm us through technology and social media, the dangerous future of the metaverse, and more. Plus, Charles C.W. Cooke breaks down President Biden's disastrous press conference and first year in office.
Follow The Megyn Kelly Show on all social platforms:
YouTube: https://www.youtube.com/MegynKelly
Twitter: http://Twitter.com/MegynKellyShow
Instagram: http://Instagram.com/MegynKellyShow
Facebook: http://Facebook.com/MegynKellyShow
Find out more information at: https://www.devilmaycaremedia.com/megynkellyshow
Transcript
Welcome to The Megyn Kelly Show, your home for open, honest, and provocative conversations.
Hey everyone, I'm Megyn Kelly. Welcome to The Megyn Kelly Show.
In just a few minutes, we're going to be taking a deep dive into the dangers of big tech with Tristan Harris.
Do you remember this guy? He was in the great, great film, The Social Dilemma.
And he is going to be speaking to us about what it's like on the inside at Google and elsewhere
and how we are intentionally being manipulated. Even worse now than when they made that movie a
few years ago. But we begin today with President Biden, the 46th president of the United States,
giving just his second formal
news conference in a year, taking questions for just under two hours yesterday.
Today, already, his administration is now trying to do some major cleanup after, in that
press conference, he not only almost started a ground war in Europe by seeming to green-light a minor Russian incursion in Ukraine,
but went on from there suggesting that the 2022 midterms would probably be illegitimate
unless his entire voting rights bill were passed.
Charles C.W. Cooke is a senior writer for National Review.
He has a new piece out today entitled Biden's Year of Failure.
Charles, great to have you here. So he claimed last night that he did not overpromise about what
he could do as president, that he actually outperformed what was expected of him in his
first year as president, saying, quote, no one has ever accomplished more in their first term.
Your thoughts on whether that's true?
Well, it's not true. None of that's true. He certainly overpromised because he vowed that he could change things that are largely beyond his control. I think on the assumption, especially
with coronavirus, that everything would improve once he became president, which it hasn't, he has not overperformed.
He has fallen foul of delusions of grandeur
that have destroyed his agenda.
And as for his supposedly productive first year,
and we can find recent examples that demonstrate that that isn't true.
But if you go back through history, especially, you will see that it isn't true. He wanted to be
FDR for a reason, and he hasn't been. Mark Thiessen of AEI, he's a Fox News
contributor who used to come on my show every night when I did The Kelly File. He had a piece out before the presser saying it's a problem for a president to try to put a rosy spin on the Iraq War when the American people knew it wasn't true.
And the messaging fell flat, he pointed out, until they decided on the surge in 2007 and things started to turn around.
That's exactly what happened last night, right?
He was talking about the country in a way that did not seem to reflect the reality on the ground,
whether it has to do with our economics, COVID, even Afghanistan.
No apologies. No apologies.
Meanwhile, two thirds of the American public disapproves of how he handled Afghanistan,
a huge portion of them strongly.
So his words didn't match people's experiences or the facts.
No, and he squared that circle by saying he doesn't believe the polls, but he doesn't have
to believe the polls, given that we have results in New Jersey and Virginia, real results, the
product of real people voting, that should show him something important. I thought more broadly, though, the reason that
it failed yesterday, beyond the usual Joe Biden shortcomings, is that he promised a reset but
didn't change anything. The classic example that is given in political circles of a pivot is Bill
Clinton, who was forced to pivot after the 1994 midterms by a Republican wave, what was
called the Republican Revolution. But Bill Clinton actually changed course. He dropped the agenda
that he'd been trying to pass in 1993 and 1994. He worked with the Republicans on areas of agreement.
But what did Biden do yesterday? He said that he hadn't messed up in Afghanistan, which he did.
He said that the inflation problem, sorry, I should say,
Bill Clinton probably didn't slip there,
that the inflation problem would be helped by spending more money,
which nobody believes. He then, oddly enough,
channeled President Trump in some important ways by casting doubt on America's election system.
And you looked at his talk of his agenda, of his Build Back Better bill. Well,
the furthest he would go is to say it would probably have to be broken up. Where's the reset? Where's the change? Where is the pivot?
Yeah, there isn't one. His message seemed to me more like, what the American people really need
is to see more of me. I need to get out there on the road and just do a better job of explaining
how amazing I've been. And it's like, well, I mean, again, going back to
his column yesterday, read as follows in part, sorry, but you can't boast about your COVID
strategy when 55% disapprove of it. You can't brag about your economic performance when 60% say it's
been dismal. You can't crow about your foreign policy when 55% believe you're doing a terrible
job as commander in chief. Can't talk about how you've united the country when a 49% plurality say you've done more to divide us. And you can't
say you've had a great year in office when 63% say we're on the wrong track. And that same number,
about two thirds of the American public, Charlie, according to, I think it was the latest Suffolk
poll, say he shouldn't run for a second term. So, I mean, can you really say it's just a comms problem?
What he really needs to do is just be more persuasive about what he's done?
No, it's not a comms problem. And we've talked about this on your show before.
The problem is that Joe Biden's behavior as president has been at odds with how Joe Biden
ran for president and why the American public hired him to be president.
Joe Biden ran as almost the anti-Twitter candidate. He ignored Twitter. The moment that he entered
office, he's been driven by Twitter. Joe Biden ran promising honor and moderation and competence
and experience, but he allowed himself in the early days to be talked into the idea that
he would be transformational. Nobody asked for that. Joe Biden expected until January 5th of
last year to preside over divided government with a Republican Senate, a narrow Democratic House,
and of course, a Democrat himself in the White House. And the moment he got to 50 seats, not 65, not 75,
but just 50 seats, he brought out every single agenda item that the Democratic Party has wanted
to institute for the last 10 years. This is a profound mistake, because the American public,
whether it should have or not, wanted a caretaker president. They wanted a president who wasn't Donald Trump and who would return the country to normality after both Trump
and COVID and the economic fallout. Biden totally misinterpreted his mandate and he's still
suffering from that. And again, there was no sign yesterday that he is going to change that.
And until he changes that, he's going to get the
same results. You know, I was talking to the BBC today, Charlie, back in your old home country,
though you're an American now. And I was struck by the questioning because it was very focused on how
Trump and the tweets and, you know, Joe Biden has, you know, things have gotten calmer. And isn't that a better thing for America? And they really look at him as not radical, as moderate. And, you know, no president could really bring the country together. It's so divided. And they're very focused on Trump's tweets and so on. And I realized that they were ridiculous. They were terrible and they were absurd. And OK, I get all that. Nobody would dispute that.
But they completely miss Joe Biden's radicalism and his divisiveness.
And, you know, even lately, his rhetoric's getting even worse. But he still wants us to believe he's the unity president.
Yeah. And I will put Donald Trump in a class of his own. He is unique in every way.
But oddly enough, the last week, Joe Biden has been more like Donald Trump than he would like to admit.
On policy yesterday, he cast doubt on the legitimacy of elections twice.
He was weak on the question of a Russian invasion of Ukraine, also twice.
And he snapped at a reporter, Philip Wegman, who asked him about comments that he had made about his voting rights agenda, so-called, which in turn was the product of a speech he gave last week that was frankly
outrageous, that was so outrageous that it was condemned by the likes of Mitt Romney,
who tend not to raise their voice, tend not to indulge in hyperbole, and also acknowledged
by his own party.
Dick Durbin said that perhaps Biden had gone too far.
He did go too far.
What he did was to divide the country up into two groups of people,
good people who agree with Joe Biden and bad people,
whom he likened to insurrectionists and segregationists
and really some of the worst people from the darkest periods in American
history. And that's not what Joe Biden said he was going to do as president. And Mitch McConnell
on the Senate floor compared Biden's comments to his inaugural address unfavorably.
Mm hmm. Yeah, we actually have that, because worse, Joe Biden tried to deny that he had made that comparison between people who oppose his voting rights bill and the George Wallaces of the world.
And it's on camera. We all saw it. It just happened. It's not like it happened two years ago.
We just heard him say that earlier this week. So here is a soundbite showing his denial and then what he said earlier this week. You called folks who would oppose those voting bills
as being Bull Connor or George Wallace, but you said that they would be sort of in the same camp.
No, I didn't say that. Look what I said. Go back and read what I said and tell me
if you think I called anyone who voted on the side of the position taken by Bull Connor, that they were Bull Connor.
And that is an interesting reading of English.
I assume you got into journalism because you like to write.
So I ask every elected official in America, how do you want to be remembered?
At consequential moments in history, they present a choice. Do you want to be on the side of Dr. King or George Wallace? Do you want to be on the side of John Lewis or Bull Connor? Do you want to be on the side of Abraham Lincoln or Jefferson Davis?
The indignation at having been called out for something he is on camera doing.
And the incoherence.
His answer was totally incoherent.
He contradicted himself.
He rambled.
He became angry.
And all Philip Wegman had done, very politely, it must be said, is ask him a question that accurately characterized his previous remark.
Look, we know why one would invoke Jefferson Davis or Bull Connor in a political speech.
We know why people invoke Hitler in political speeches or slavery in political speeches. Once you've done
it, you can't backpedal and say, well, I didn't mean it literally. It was clearly heard by a good
number of Americans, and not all of them Republicans, certainly not all of them involved
in politics, as a Manichean exercise in bullying, an attempt to cast bills that really aren't responding to much
as the future of the country. And I think Biden should own that. If you want to engage in that
sort of language, own it afterwards. But he's trying to have it both ways. And again, where's the reset?
He also tried to blame the Republicans for not getting more accomplished. On the one hand,
he accomplished more than anybody ever in the first year of his presidency. But on the other hand, the reason he hasn't accomplished more is those Republicans who he could never have
anticipated would be this way, would be this determined to block his agenda.
Meanwhile, I mean, my first thought on that, Charlie, was that's rich coming from a guy
who heads up a party that called themselves the resistance during Trump's presidency.
They weren't working with Trump on anything.
But secondly, the Republicans have worked with him on a couple of key items.
You know, that's why he got his $1.9 trillion COVID relief plan through.
It's why he got his infrastructure bill through, because they worked with him.
And a lot of Republicans voters didn't like the fact that the Republican lawmakers did that, but they did it.
And his most recent defeats were caused not by Republicans, but by Democrats.
Yeah. So, as you know, I am a staunch defender of the separation of powers.
And I don't like the way that when we have a president of a different party than the
Congress, that the press describes the Congress as obstructionist.
I didn't like it when Trump was president, the Democrats were obstructionist.
I don't like it now we have a Democratic president, the Republicans are obstructionists. Congress is in charge of legislation. There's nothing
written in stone that Joe Biden should get any of his agenda. And so this framing,
which we hear, especially from Democrats, I find irritating as a general rule. But as you point out,
it was not just irritating, it was churlish, because on November 15, which is two months ago, Joe Biden signed and heralded a bipartisan infrastructure bill that got 69 votes in the Senate and that was endorsed and shepherded through by Mitch McConnell. So to suggest that Mitch McConnell is not likely to do anything that
would make Joe Biden look good is not only to falsify the record, but to hide under a blanket
the most recent victory that Joe Biden himself trumpeted from the White House. Meanwhile,
there is a bipartisan group of senators in the Senate who are working on
a reform to the Electoral Count Act, the very instrument that was used by President Trump to
try to steal the election in 2020. So that wasn't just an annoying framing that puffed up the role
of the president in our system. It was factually wrong and it was ungrateful to boot.
You point out in your piece today at National Review,
Joe Biden did inherit some challenges, no question.
It wasn't as bad as when Trump was dealing with COVID
because at that point it was brand new.
We were trying to figure it out.
We didn't know what it was.
And Joe Biden also inherited vaccines.
But it's not like we didn't have a covid problem
when he took over and inflation was already starting to rear its ugly head. So there were
some headwinds against him, though it must be said he was he was in a pretty good position at that
point versus Trump when it came to the vaccines. But your point is what, that he inherited those
challenges, and yet the American people, what, they don't...? If I were running
for president, not that I'm allowed to, I wouldn't make promises of the sort that Biden did, because
I don't believe that the president is the king. I don't believe that he's a pope. I don't think
that he has some sort of spiritual control over the country and its economy and infectious diseases and so forth.
But Joe Biden ran as if he did believe that. He said on television that he was going to shut down
the virus. Now, whether or not he could do that, and I don't blame him for the persistence of
COVID any more than I blame Donald Trump for its arrival. He hasn't done that. He promised that
he was going to restore the economy on a broad basis and that the middle class would be better
off under him than it was under Trump. Again, I think the president has a limited ability to do
that. But Biden made that promise and it hasn't happened. And when you do that, you create a hostage to fortune for which you have only yourself
to blame.
So, yes, there were many challenges when Joe Biden came in.
And yes, he had it more difficult than did, say, John F. Kennedy in 1960, although, of
course, he had his own challenges.
But people judge you based on what
you promise you will do. And Joe Biden, contra his argument yesterday, has overpromised and he
has underdelivered. What do you make of, yes, he was angry in response to that one question. He was
meandering at times. Last night on Hannity, they put together a mashup, which is actually quite
helpful just to see some of the moments strung together. So we've repurposed it here. Take a look.
We passed a lot of things that people don't even understand what's all that's in it,
understandable. One of the things that I remember saying, and I'll end this,
I think it's extremely realistic to say to people,
because let me back up. So whether or not we can actually get election, and by the way,
I haven't given up, we haven't finished the vote yet on what's going on on the on voting rights and the John Lewis bill and others. The Allison Harris, please. Very few schools are closing. Over 95 percent are still open. the motives of some of the political players and some of the political parties.
One more question, Mr. President.
By the way, it's a quarter of guys. So I'm going to do this.
Just listen.
If you may ask me easy questions, I'll give you quick answers.
Charlie, I felt watching him at times the way I feel watching a hurt gymnast on the beam or doing the horse, you know, the pommel.
Like it's terrifying.
You are not at all certain he's going to be able to land it.
And, you know, he's the president. So you're kind of a good thing. But one
doesn't need to get into any sort of medical claims in order to evaluate the man as the
speaker of English. And at times yesterday, it was not clear to me as a native speaker of English what on earth he meant. He is decreasingly
able to express himself and communicate coherently and in a timely fashion. And I can't imagine that
that's going to get better over the next few years. We have the oldest president we've ever had. That does matter, especially in the modern era.
If you look at people who go into that job, and then you look at what they look like when they
leave that job, they age far faster than anyone would want to. Barack Obama did, George W. Bush did. Goodness knows how it's going to age Joe Biden.
It's a stressful role.
So by the time that he would run again, he's going to be like that,
plus another two and a half years of stress, and he's going to be 82, I believe.
This is a real predicament for that party, not least because the vice presidential
candidate that they have chosen is even less popular and oddly enough, often even less able
to express herself in the English language as well, at least not without sounding as if she's
late for a book report. So they have created a straitjacket for themselves that is going to be really difficult to resolve
because they've really only got three choices, haven't they?
One is that you stick with Biden.
The other is you try and replace Biden with Kamala Harris.
And the third one is you open it up.
You have a primary.
But that primary would be conducted while Joe Biden and Kamala Harris are serving their
term out. And that would be brutal, I think,
for the Democratic Party, just as it was in 1979-80, when Ted Kennedy challenged Jimmy Carter,
and as it was in 1991-92, when Pat Buchanan challenged George H.W. Bush, both of which,
it should be noted, ended up with those presidents losing.
And he did say last night that if he runs again, which he has said he will do,
Kamala Harris will be his running mate, though there was a long pause that's given other people
pause in deciding whether they believe that, since she seems even less likely to win than he does,
assuming they really are prepared to run an 82-year-old for a second term.
And could even he do it, given the fall in his poll numbers and the way things are going?
It's a long ways away.
We'll find out.
Charles, it's always a pleasure.
Thank you so much.
And I encourage you to talk to your friend, Rich Lowry, to whom I sent a text today about
a very funny exchange I heard on your podcast, The Editors.
You've got a couple. I love that one. And I love Mad Dogs and Englishmen, too.
But there is a very funny moment between Charles Cooke and Jim Geraghty. I think it was last Friday's
show that my husband and I have been laughing about for a week. And I'll just leave it there
as a tease. Don't forget to stay tuned now, because up next, Tristan Harris. Tristan is
from the huge Netflix hit, The Social Dilemma. He was on the inside of Google for years and has
ever since been demanding more ethics from big tech, and he has an insider's view on what they're doing to us
then, now, and in the future. Don't miss that. The very popular Netflix documentary,
The Social Dilemma, pulled back the curtain on the tech industry and the ways we all can become
addicted to our phones, our social media, and just instant gratification. A very prominent player in that documentary is Tristan
Harris. He has been called the closest thing Silicon Valley has to a conscience. A former
design ethicist at Google who has since gone on a mission to raise awareness about, and against, the everyday
devices that we have become addicted to. He joins me today to discuss it all.
Tristan, thank you so much for being here. Real pleasure to be here with you, Megan.
Fascinating stuff. So I was just looking at your background just to set it up. You're from the San
Francisco Bay Area, as I understand, raised by a single mom and very young when you started
practicing magic, which would become relevant to what you're doing today.
Tell us how.
Yeah, I love talking about being a kid and studying magic.
Actually, my mom used to take me to a little magic shop in San Francisco growing up. And what you learn is that, independent of the age or education or PhD level of the person you're doing magic with,
magic is about understanding the vulnerabilities that are universal to all
human minds, right? And even sometimes if you know how the trick works, the psychology is so
powerful that it still works anyway. And that really plays into how technology is designed,
because when I was later at Stanford, and actually, I was classmates with the founders of Instagram, and many of the people who joined the early ranks of
Facebook and Twitter and a lot of these companies, so I really know the culture and the people
intimately. And we, many of us studied at a lab called the Stanford persuasive technology lab,
which is part of a whole space and discipline of persuasive technology. How do you design technology
to persuade people's attitudes, beliefs, and behaviors? When I say that, I don't mean
political persuasion. I mean things like, can I persuade someone to fill out a form? Can I persuade
someone to tag their friend in a photo on Facebook? Can I persuade someone to add a filter
to their photo on Instagram. And persuasive technology
is a whole discipline that is at the root of changing, I think, how we see our relationship
to technology, which is it's not just this mirror that's, you know, people often think,
oh, there's all these problems with social media and polarization and addiction, but we're just
holding up a mirror to society. Those are your addictive people. Those are your extreme folks. And that's how people behave. But I think what that picture misses is that
technology is actively persuading us and eliciting certain things from us. And those are design
choices made by technology companies. So when I was later at Google, I became a design ethicist.
They actually acquired a small company. I used to be a tech entrepreneur.
They acquired that company.
And I became interested in how do you ethically shape people's behavior when you know more about their mind than they might know about their own and you're designing persuasive technology.
What does it mean to be ethically persuasive?
And I became very interested in that.
I tried to change Google from within for a few years.
And I just saw the incentives that were fundamental to this industry about capturing human attention. How much have you paid for your
Facebook account or your Twitter account in the last year? Nothing. But how are they worth a
trillion dollars? It's because they mine what? Well, they mine our attention. People think it's
just their data, but they actually make money the more time you spend because you have to look at
the ads. And the more time you spend, the higher their stock price, but there's only
so much attention. So it becomes this race to the bottom of the brainstem. Who can go lower in the
brainstem to elicit responses persuasively from you and get outcomes from you? So I think that's
really the situation we find ourselves in. And I think that lens of persuasive technology and magic
are critical to understanding what's really going on with how technology is influencing us as opposed to
we're actively using it. Oh, it's fascinating because watching The Social Dilemma, my biggest
takeaway was you are being manipulated, right? I mean, that's really the message of it. It's not
totally your fault, it's not entirely your fault, if you have an addiction to your phone or your social media.
There is culpability and intentionality on the side of big tech.
They are trying to addict you.
And so you're there, a normal human with all the vulnerabilities of a normal human thinking, oh, what a fun device.
I can talk to my friends and I can take pictures and I can look at my calendar. Oh, and there's this thing that lets me connect
with people or follow a newsfeed. And before you know it, huge portions of your life are devoted
to this little device by design. Yeah. And I think just to make that really concrete,
because a lot of people might hear that as kind of an extreme statement. Oh,
they're manipulating us. Well, what do you mean? That sounds like a conspiracy theory or
exaggerated. Let's make it very concrete. I mean, people are really at home and they're
looking at their kids and their kids are sucked into their phones. And they think
that if they're addicted, that's their responsibility, right? But let's just make
that example concrete. So you have a couple of daughters, is that right?
I have three kids: 12, 10, and eight. Boy, girl, boy.
Okay. Got it. Well, so like, let's say, you know, um, one of your kids watches the social dilemma and says, wow, I really don't want to, um, uh, get sucked into that anymore. I want to,
you know, use this less. And so they stop using, let's say Instagram. Well, um, as we depicted in
the social dilemma, the AI kind of wakes up and it notices that
one of the users goes dormant.
There's actually a name for this.
There's a feature called user resurrection or comeback emails.
So like a digital drug lord, it notices that you stopped using.
And instead, if it was a neutral product and we were responsible for our own addictive
behavior, then they wouldn't actively say, hey, user four, five, six, seven,
eight, eight, two, five, seven, three. They stopped using the product. We're going to find
out what were the things that used to keep them here and keep them coming back. And it just
calculates, with their big artificial intelligence supercomputer: these are the ex-boyfriend photos
that had that person coming back, so we're going to show the ex-boyfriend
photos. And it works to draw us back in. And notice, if you stop using Facebook, if I stop using one of these systems, they get
more aggressive. They actually start doing more text messages, more notifications, more emails.
It's like a digital drug lord. And that's the part where we can be very clear at assigning
responsibility at the manipulative aspect. Right. It's like you try to quit alcohol because you've become
addicted to it. And yet somehow the people at Seagram's find a way to keep a bottle in your
pocket to uncap it, to have it spill a little on the table in front of you. I mean, it's like,
of course, it makes it even harder for anybody who's got an addiction to get away from it.
And worse than that is the entanglement. So actually one of the things that I know we'll
talk about later, Frances Haugen, who is the Facebook whistleblower and the Facebook file,
she leaked thousands of documents of Facebook's own internal research. And one of the things that
in Facebook's case, but really when we talk about Facebook or Instagram, you can apply it to all of
them. Twitter, TikTok, it's very similar across the entire social media industry, is that they actually know
that kids get entangled. So for example, Megyn, you know, you and I probably use texting as
our primary way of talking to our friends, right? I'm assuming you open up your iPhone and you fire
off the text. What parents don't realize is that for kids, a lot of kids either in TikTok or
Instagram, that's their primary messaging medium. That's where they message their friends. It's not just like the feed. It's also where you kind of message
your friends. So if you say, hey, I don't want to get sucked into that addictive feed, they have
bundled and entangled those two things together. And they don't want to separate them because,
so to counter your example about alcohol, alcohol wasn't baked into a fundamental need of the way
that you communicate, right? But imagine that the only place you could communicate is the place where they can put that
alcohol and pour you a glass. And they always pour you a glass every single time you want to
open your mouth and say something to someone else. And the companies know that parents are bad at
giving their own kids advice about this because they know that parents will say things like,
oh, honey, just stop using it. As if it's a matter of... it's like telling you, Megyn, or me, don't text your friends, when they entangle us.
That's really where the abuse comes from.
Well, it is a big problem whether you're addicted or not, because you do.
I mean, I would love to step away from my iPhone more, but I suffer from the same problem.
I mean, every that's how everyone communicates.
That's where my news is.
That's how I text my team. My team texts me. So you'd have to reinvent society, you know, to go back to the way I grew up, right? The iPhone, the cell phone didn't even exist really until the early 1990s. I remember seeing somebody walk down the street with it in Chicago in 1995. She was having a conversation on the sidewalk and being like, what a moron. Who needs to have a conversation while they're walking from A to B?
That was in my lifetime, right?
That was 1995.
But we're so far, all of us, away from that now.
How can one exist without this device?
Well, you know, so I run an organization called the Center for Humane Technology that's been
trying to ask and answer these questions and at least point to a direction, which is really clear. This is not about vilifying all of technology or creating a moral
panic and saying everything is going off the rails and we should stop using our iPhones or stop using
technology overall. I love technology. I grew up on it. I think it can be an empowering tool. In
fact, my co-founder, Aza Raskin, his father, Jef Raskin, actually invented the Macintosh project at Apple.
Back in those days, the idea of a computer is it's a bicycle for your mind. In the same way
that a bicycle uses more of us in getting even more leverage out of the kind of distance that
we can travel, technology can be a bicycle for our creative powers, for our communication powers, for our, you know, science powers, but that's not what the business model of these social
media, I think these social media companies are, we're gonna look back in history and
see them as a parasite that, that their goal is to suck as much attention out of society
as possible and, and suck it into these engagement and enragement machines that polarize us,
that sort of, uh, want us to not be able to have a conversation over Thanksgiving dinner,
because they want to personalize these news feeds to each other so that we each get different
information from each other. So even when we try to have a conversation, we can't do that.
That is the key difference here: the business model. Notice, if you do a FaceTime call
to your, you know, your son or your daughter, Apple doesn't make
money the more time you use FaceTime. So when you stop using FaceTime, it doesn't aggressively
message you. It doesn't put hearts and likes and comments floating all over the screen to keep you
jazzed up and entangled. It doesn't do the beautification filters to plump up your lips
or your eyes or your cheeks, which the TikToks and the Instagrams do. In fact, TikTok was recently found to apply, without even asking users, a 5% kind of beautification
filter, even if you didn't turn it on actively, because the apps that give you the... it's like
the mirror, mirror on the wall, who's the prettiest of them all.
The one who reflects back the most positive self-image is the one you're going to get
addicted to.
And so TikTok actually invisibly was doing that and plumping, you know, kids, you know, lips and eyebrows and all of that. And it has these
really serious consequences that we saw in Frances Haugen's Facebook files, including the fact that
you have kids like, you know, teenage girls who will say, I'm worried I'll lose my boyfriend if I
don't have the beautification filter on because they've become accustomed to seeing me
with that filter. And it creates an anchor of who we are. We're the virtual us. They will only like
us if we look different than who we actually are. And that's the perversion that comes from this
business model, which again is separate from email or FaceTime or text messaging. Those things are
fine because their business model is not maximizing attention. Wow. This is so chilling when you think about the creation now of the so-called metaverse.
They're basically in the process of creating a new, more in-depth, more time-consuming
universe online, which I don't totally understand, but they're trying to suck even more time
from us as a world online.
They want an alternate universe online that's even more
involved and time-consuming than it is today. We'll get into that much, much more when we
squeeze in a quick break and more with Tristan right after it. Wow. Don't forget, folks,
programming note, you can find The Megyn Kelly Show live on Sirius XM Triumph Channel 111
every weekday at noon east and the full video show and clips by subscribing to our YouTube channel, youtube.com slash Megyn Kelly.
Don't get addicted, but enjoy.
It can be done in moderation.
If you prefer an audio podcast, subscribe and download on Apple, Spotify, Pandora, Stitcher or wherever you get your podcasts for free.
And there you will find our full archives with more than 240 shows.
So Tristan, you say in the film that it's almost like these tech companies create an avatar voodoo doll of us. I'm going to set it up with a clip of you going there and then get you to expand on it.
This is soundbite seven.
On the other side of the screen, it's almost as if they have this avatar voodoo doll, like model of us.
All of the things we've ever done, all the clicks we've ever made, all the videos we've watched, all the likes,
that all gets brought back into building a more and more accurate model.
The model, once you have it, you can predict the kinds of things that person does.
Where you're going to go,
I can predict what kind of videos will keep you watching.
I can predict what kinds of emotions tend to trigger you.
At a lot of these technology companies,
there's three main goals.
There's the engagement goal,
to drive up your usage to keep you scrolling.
There's the growth goal,
to keep you coming back and inviting as many friends and getting them to invite more friends. And then there's the advertising goal to make sure that
as all that's happening, we're making as much money as possible from advertising.
Each of these goals are powered by algorithms whose job is to figure out what to show you
to keep those numbers going up. It's chilling. And I love little Pete Campbell from Mad Men
in the background as the guy who's on computers. But so it's not, in fact... like, the algorithm
is effectively the three people in that room, but it's not an actual human standing there, right?
It's like they figured out algorithms that can figure everything out in an instant.
Well, you see Google and Facebook figured out how to clone Pete Campbell, the advertising guy,
and just sit him inside of the Google.
No, I'm kidding. I'm just kidding.
No, I think people looked at this metaphor.
So in the film, The Social Dilemma, which I really recommend everyone watches, it was the second most popular documentary, I think, in Netflix history, won two Emmy Awards.
And it really just lays this out in a way that I think everybody on all political sides can kind of understand as well. And what we talk about in the film, as you said, Megan, is that, you know, behind the screen,
you know, there's you, there's this piece of glass. And when you scroll up with your finger,
right, there's going to be another rectangle that comes up next. Do you think that that rectangle
that comes up next is just the next thing that one of your friends posted? No. What they do is they fork it off to that supercomputer,
which is that Pete Campbell character.
And that character, which is like you said,
you know, character embodiment,
it's not actually like that.
It's just a computer and it's calculating a number.
And it looks at every possible thing
it could show you next.
Like within the space of things it could show you,
it could show you something
that'll outrage you politically.
It'll show you something about your ex-boyfriend or your ex-girlfriend, because that's what you
clicked on last time. It can show you a live video because Facebook wants to like dial up that live
video. It tries to calculate which thing would be most likely to keep you scrolling, because
it obviously doesn't want to show you the thing that will stop you from scrolling. And it's a
supercomputer pointed at your brain to figure out how to basically light up your nervous system. And the voodoo doll idea,
one of the reasons we use that metaphor is that if I talk about, Hey, Megan, you know,
they have your data, they have your data. And that, that, where does that hurt you?
If you think about it just as a person, like there you are, you hear that phrase,
they have my data, and it doesn't feel like, what's the problem with that? But if I say, look,
that data is being used to assemble a model of you, a more and more accurate model that can be used to predict
things about you, and it gets more accurate the more information they have. But it's like a voodoo
doll. So all the clicks you've ever made, that puts little hair on the voodoo doll, so it's a little
bit more accurate when I prick it and try to figure out what would activate the voodoo doll.
All the likes, all the watch time on all the videos you've ever watched, that also makes the voodoo doll more accurate. It adds little shirts and pants to the voodoo doll.
But the point is that as that data gets more and more accurate over time,
and it looks at a hundred other people who saw those same political, you know,
enragement videos that you've seen, and it says, well, for people just like you,
this is the thing that tends to keep them scrolling, watching, clicking, commenting,
because all of that activity is engagement. It's attention. It's the thing that's sort of
the parasite that makes these companies worth trillions of dollars. And that's essentially
the system that we're in. But the problem is that it leads to basically all of these negative
externalities that get dumped onto the balance sheet of society. We have shortening of attention spans.
We have more political polarization because affirmation is more profitable than information. So giving us more confirmation bias of our existing tribal beliefs and why the other side
is so bad. Obviously this, this trend existed in other kinds of media, but now you have a
supercomputer that's like literally, you know, figuring out this is the next fault line in
society. And these keywords emerge, and whether it's mRNA or masks or vaccines, or, you know, no matter what it is, it finds the
one that works on buckets of users, just like you. And it knows that you're going to click before you
know, you're going to click. And I think some people hear that. And they think that sounds
like a conspiracy theory, like technology knows us better than we know ourselves. But Yuval Harari,
the author of Sapiens is a friend of mine.
He's gay, and he jokes that when his partner, Itzik, uses TikTok, it only took one or
two clicks for TikTok to figure out exactly which rabbit hole to send him down.
And that's the thing about all of us is it knows exactly what works. But the problem is what works
on us isn't the same thing as what's good for society.
Or for us, even, or for us.
And that's why, I mean, honestly, Twitter came out with a thing.
I don't know if they do it every year or whatever, but they just popped up in the feed.
This is how many conservative sites you follow.
This is how many liberal sites you follow.
And it just sort of volunteered, you know, your information. And on mine, I was, I was very pleased that I had a 51-49% ratio on my incoming, you know, news and people I follow. And that's important. So it just makes me a little less easy to manipulate in
the information game, because you're definitely getting, you're getting propaganda from both
sides, but at least, I mean, it's propaganda, but at least you're getting it from both sides,
you're a little less easy to manipulate. Right. So that's one step that definitely,
I mean, that's, I actually had not seen that specific feature from Twitter. It's obviously
better for each of us to maintain more broad information diets. But the second problem,
Megan, is that the business model is we think of it like a parallel system of incentives to
capitalism. Instead of getting paid in money, you get paid in more likes, more views, more attention, more comments. And when you say something that basically outgroups
the other side and say, here's yet another example about why the other side is awful,
we'll pay you more likes, more followers, because that was better for generating engagement for the
machine. Now, no one at Twitter or Facebook has a big, long mustache and they're twirling it,
saying, gosh, how can we create the next civil war and drive this up as much as possible?
But that's the inadvertent side effect of a machine that's values blind.
All it knows is what increases people's likes, followers, get them to invite more people.
And the problem is that those things tend to be conflict. So even if you have a broad diet and you're looking at information from both sides, quote unquote information, what it really is, is basically
people, you know, shit posting on the other side and building on the boogeyman. So whatever your
boogeyman is for you, like, oh, they're doing, you know, this next in my hometown, now you
can sort of carry that to the worst next conclusion. You can find evidence for every stereotype.
And in fact, one of the groups that we interviewed, we have a podcast called Your Undivided Attention. We interviewed Dan Vallone,
who runs More in Common. And what it really shows is that we completely see the other side
in stereotypes. If you ask Democrats to estimate what percent of Republicans make more than $250,000
a year, they think more than a third of Republicans make more than $250,000 a year.
I think the answer is more like 2%. If you ask Republicans, what percent
of Democrats are LGBTQ? And they'll estimate more than a third of Democrats are LGBTQ. The actual
answer is 6%. If you ask Democrats to estimate what percent of Republicans do they still believe
racism is a problem in the United States, they think less than 25% of Republicans would believe that racism is still a problem.
The actual answer is something like 70%. And so we're seeing ourselves with stereotypes. And the
second thing they found is the more you use social media, the worse you are at predicting what the
other side believes, not the better, because the extreme voices on social media participate more often than the silent sort of calm, moderate majority, right? Like the calm,
moderate people, they don't actually say that much. So that's really the problem that we're
dealing with when we look at our polarization ecosystem. Wow. This is reminding me that when
we closed out the year, we went to Christmas break. The last piece I did was on Democrats.
And, you know, I have a lot of Republican listeners.
I have some Democrats, too, mostly people in the center.
But it was a reminder that, you know, the people who are trying to get everybody canceled and so on, they don't represent all of the left.
And that it's not the left that is the enemy of reason.
It's like activists who are pushing agendas.
Yes, we can
fight on that. But remember, your neighbor who's a Democrat is not your enemy if you're
a Republican and is not necessarily against the things that you're against as well. All right,
let me pause it there. I'll squeeze in another ad and we'll come back. My God, there's so much more
I want to talk to you about. Tristan Harris is here and we are lucky to have him former Google
design ethicist. We got to talk about that podcast. Crazy, interesting stuff. Don't go away.
Let's spend a minute on your podcast, Your Undivided Attention. This is the description
that my team gave to me and I'm dying to know more. Okay: interviews experts in invisible aspects of human nature, from casino designers to hypnotists, ex-CIA propaganda experts, tech whistleblowers, researchers on cults, and so on. This is what's... So I guarantee you there are people out there listening to this right now that are saying, not me. I'm too smart. I understand what a manipulation looks and feels like.
Yeah.
Well, yeah, with our podcast called Your Undivided Attention, what we're really trying to do is just inoculate people from this manipulation.
And one of the best ways to do that is for people just to understand the truth behind the people behind that piece of glass screen.
So we had Natasha Dow Schüll,
who studied casino design. She wrote a 700-page book on how casinos are designed. So for example,
the classic example is your phone. It's like a slot machine. Every time you scroll your phone
or you pull down to refresh to see, did I get some new email? Just like a rat seeking a pellet,
you're playing that slot machine to see if I got something new. So we interview casino experts,
people who study attention spans, what's happening to the inner workings of our attention,
effects on children, hypnotists, all these sorts of invisible aspects. We had Renee DiResta,
who actually was one of the two teams given access to the data sets on what Russia and China
are doing in sort of social media, which by the way, is really one of the biggest concerns that I actually have about
this that's more subtle, is that I think social media is, and these platforms specifically,
so TikTok, Facebook, Twitter, et cetera, it breaks our democracy. It's actually incompatible with our
democracy in a more fundamental way. And in the competition with China and these digital
authoritarian societies, I don't want to go that direction. But I worry that we put this brain implant in our democracy called social media,
and it rewires our collective psyche so that each neuron maximally influences every other neuron,
right? Because it wants each of us to reach as many people as possible. That's what keeps each
of us coming back because we're addicted to influencing so many other people. But if you
think about what would happen in a brain, if I took each neuron and maximally fired
every other neuron, you'd get kind of like a social seizure attack, right? And when I look
at our country right now, and I look at how it's just this cacophony of anger and confirmation
bias, and we're so right, and we just have to escalate that conflict. It's like we're foaming
at the mouth, having a seizure as a country, while China is actually employing the full suite of all
these technologies to make a stronger authoritarian society. We can notice that democracies are not
employing the full suite of all these new technologies to make a stronger democracy.
Instead, we've allowed the business model of maximizing attention for enragement
and making us angry at each other to sort of actually collapse our capacity as a democracy
to agree on anything, to recognize that we have much more in common with our fellow countrymen
and women, and that there's actually real challenges we have to face. Meanwhile, China
is gerrymandering Africa, getting access to supply chains, doing foreign policy. I should
also talk about what they're doing with regard to their tech platforms.
I was actually meeting-
Their tech moves lately, you tell me, Tristan, when I read them in the news, I'm like, well,
that's very China, right?
To sort of the big hand of government now controls.
But I was also like, hey, China, for the first time in my life, I was like, you know what?
Maybe we should consider the Chinese way.
Yeah.
Well, so I was meeting with a senator who's deep in the foreign policy
world, and he was meeting with his counterpart in the EU who said, who does China consider to be
the largest threat to its national security? Who's its biggest geopolitical rival? And of course,
you would say the United States, right? You would think that's the answer. They said, no,
they consider their own technology companies to be the biggest rival to the CCP, the Chinese Communist Party's power.
Now, why is that? Because the technology that runs their society is really the new source of power,
right? It's controlling what kids are feeling, thinking, and believing. It controls their
identity, their educational development. It controls loans that get made, Jack Ma, Alibaba.
So they're going after their billionaires.
They're doing all these things, but they're really realizing that technology is the power
structure.
It's the brain implant that is guiding their society.
Now, I'm not trying to idealize it now, but here's a couple of things that you were mentioning
that they're doing to deal with the problems of the social dilemma.
So let me give you a couple of examples.
One of the things they do is on TikTok, their version of it called Douyin, when you're scrolling
TikTok, if you're under the age of 14, you can only use it until 10 p.m. at night, and then it's
closing hours. It opens again at six in the morning. They actually limit you to 40 minutes a day.
And when you scroll, instead of showing you videos of the best influencers, they show you science experiments, museum exhibits, patriotism videos, because they realize
that TikTok is conditioning kids' behavior. And now I'm not saying that we should be doing Pledge
of Allegiance videos to the United States on our version of that. But what we have to also see is
that China is controlling their number one adversary's children's TV programming and
education. I mean, imagine if, in the Cold War, the Soviet Union controlled Saturday morning cartoons
for its number one geopolitical adversary. You know, I actually talked to people in our defense
and national security apparatus quite a bit these days. My concern is that our generals and our
heads of the Department of Defense know everything about hypersonic missiles and drones
and the latest tech, you know, physical advances in warfare, but how much do they know about TikTok
and how their own children are being influenced on TikTok? And I'll give you a concrete example.
A TikTok insider told me this. He says, the thing that people don't realize is that TikTok is an
alternate incentive system to capitalism. Instead of paying you in money, I can pay you in likes, followers, and attention. I can give you a sense of boost of all those
things. So now let's say, and China is known to do this. They have a national security strategy
called borrowing mouths to speak. So I want to borrow those Western voices who say positive
things. Whenever anyone in the West says something positive about China, that the Uyghurs are not a
human rights problem and it's all fine. China can just say, we're going to dial up those people.
So they get paid in more likes, more followers and more views. Then other people on TikTok look
at that and say, well, why are those TikTok influencers so successful? And they start
replicating their behavior. So you're creating an alternative system of influence on top of your number one geopolitical
adversary.
And you're being able to adjust those dials anytime you want.
And you don't even have to get them to trust the Chinese Communist Party's voices.
You can take Western voices who happen to be pro-China for whatever reason and just
make them the ones that are heard the most, right?
And my colleague, Rene DiResta, calls this amplifaganda. It's not
propaganda. It's amplification propaganda. I'm taking your voices, but the ones that I want to
hear. And similarly, we know what Russia did, you know, and not just in our elections, but ongoingly,
is they take the most divisive voices, especially the ones that focus on race, on guns, on
immigration, these topics, and the ones who want to do civil war and secession movements and things
like this,
and they amplify those voices because they want to amplify propaganda, amplifaganda,
the ones that are most divisive. There is a World War III information war that Marshall
McLuhan predicted in 1968 when he said, World War III is a global information war that will make no
distinction between civilian and military combatants because now we are in that war,
but we don't really see it or feel it that way. And I've heard you talk about in the past,
the difference between we have these huge oceans on both sides that make us a global superpower.
We have this physical, kinetic, asymmetric position compared to our adversaries, but those
huge oceans and borders go away in the digital world. We have Patriot missiles to shoot down a missile or, you know, a plane that comes in from Russia or China physically. But if they try
to fly an information bomb into our country, they're met with a white glove algorithm from
Facebook or Twitter or TikTok that says, yes, exactly which, you know, minority group would
you like to target? And a recent MIT Tech Review article said that actually at the top Facebook pages, 15 pages that are for Christian
Americans, all 15 of those Christian American pages are actually run by Macedonian troll farms.
Of the top 15 African American pages on Facebook, these are basically bots, right? Two thirds of those African American pages, reaching something like 80 million Americans a month, are run by Macedonian troll farms. So we have to realize that again,
we're not even really living in a real reality. The metaverse is a virtual reality, but even
within that virtual reality, it's a virtual representation of our fellow citizens. They're
not even our fellow citizens. And I just want to pause and underscore to the audience at home: this is something different. This is something Russia did do. Okay? This is not Russiagate stuff, that's totally different. Russia did do this.
They used bots to amplify disinformation. Does it come as news to anybody really that they're
trying to sow discord in the United States? You don't have to believe that this influenced
the election to just know that what they're trying to do is drive up division ongoingly,
right? And that is part of a deep warfare strategy, right?
Because we're falling over incoherently, constantly disagreeing with each other and forced to see the more extreme perspectives of our society, while these countries are not faced with that problem.
And just to pick up on the other point you were making about how they limit children's access to TikTok and so on in China, because they came out with a couple of sweeping reforms within the past few months along those lines, trying to stop the children from spending all their time on these apps and limiting some of the time to just the weekends. Yes. Gaming, they limit to 40 minutes a day and only on the weekends, Saturday and Sunday. And TikTok, like you're saying, they also do only 40 minutes a day. And like I said, they have opening hours and closing hours. So at 10 p.m., it just shuts off. And the reason for that, by the way, Megyn-
My thought in reading about that was, okay, so great, we've unleashed these unhealthy bombs on our children, in their country, in our country, across the globe. But China has actually stepped in to try to stop that bomb from doing too much damage to its own children. Whatever its motivations, they do not want a bunch of, you know, missing-the-frontal-lobe children growing up addicted to technology, needing to play their games.
They want their kids to be smart and to be the next generation's leaders and so on.
Meanwhile, we left our kids twisting on the vine.
There's no attempt over here at all, as far as I can see by big tech to protect our children in any way.
In fact, the more addicted, the better. Exactly, exactly. And this, I think, is perhaps also one of the major issues that in our country we can actually agree on, right? I mean, who wants our children systematically warped and deranged with comeback emails, where, like a digital drug lord, when you stop using, I figure out how to more aggressively get you to come back? And, you know, Frances
Haugen, the Facebook whistleblower, you know, people point to her credibility. It's not her
credibility that matters. She was just leaking Facebook's own research where she had their own
documents. And they found that among teens who reported suicidal thoughts, 13% of British users and 6% of American users traced their desire to kill themselves to Instagram.
And they said that we make body image issues worse for one in three teenage girls.
I know personally, some Instagram insiders who actually left the company after seeing
that research because they couldn't justify staying there, knowing that that's the case.
But this is all obvious because the whole business model is designed around this kind of predation on our
kids. But again, I think what we need to do, Megyn, is instead of focusing on just these
light reforms, like how do we make social media slightly more privacy-protecting or 10% less toxic
by removing the anorexia thing, I worry that this is a competition of two systems. We have democracy
and we have authoritarianism. And authoritarianism, that model, they're using the full suite of
technologies to make a kind of super authoritarian, stronger sense-making environment. They have many
problems. I don't admire it. I don't want it to be the future. But meanwhile, we can notice that
our democracy is not employing all these technologies
to say, how would we make democracy even better? How do we do even more consensus-based decision
making? How do we invite people in? There's actually a model of this in Taiwan. Instead of posting on social media when you hate something about, say, the tax system or potholes or masks, and instead of that just turning into a long comment thread that gets shared more virally, where the more clever, stubborn thing you can say, the more attention you get, in their system, when you say, I want to fix the tax system, it has a problem, you get invited into a Zoom call, a stakeholder group that actually talks about how you would improve it. And you actually get other citizens, and you're actually designing the improvement of that system. And then that's taken to the digital minister to actually implement.
We could have a whole basis of technology that's about strengthening our democracy.
And that's my concern about what we need to do.
We don't need 5% less toxic social media.
We need to sort of reinvigorate the values of the Declaration of Independence for a 21st
century age.
So we're not antagonistic to technology. We're using it to make a stronger democracy.
My gosh, it just makes me think I don't really like to crack down on, you know,
alleged hate speech or what have you. And I don't like big tech censorship. And I never thought Mark
Zuckerberg should have been pulled into that. He was originally like, it's not my job to police the Internet and the conversations people are having. And I was like, right on. That's the American way.
But then he did submit and so on.
But we're focused on the wrong stuff.
That is not the problem of big tech.
I mean, it's irritating, but it's not the problem.
Their sins are so much more nefarious and ingrained and deep and part of the business
model than all that stuff, which is a noise distraction.
Exactly.
And in fact, Facebook, after Frances Haugen came out,
we actually now know from Wall Street Journal reporting, they were consciously trying to frame the narrative. So she released all this research about how much it's dividing and polarizing us and hurting kids. And then Facebook actually used their PR department to plant stories saying that this was all about censorship,
that what Frances wants is censorship.
Whenever they talk about censorship, they do that because they know it just creates more division
because the conversation about free speech or censorship will never resolve. It's the same
800-page, law-textbook conversation. Everyone brings up the same examples and it never yields any results. It's not about freedom of speech. We all want that. In fact, we should have that. We should have less censorship. What we need is to be careful about reach. We've decoupled power and reach from
responsibility. Typically in the past, the greater the broadcasting capacity you would have,
the more responsibility you would have because you're reaching a larger number of people.
Now we have a single TikTok influencer. There's actually an example in China: in one day, because you're reaching a billion people, you can actually create a billion dollars of sales. There's an article in MIT Tech Review, I believe it is, that a single individual in China in one day generated a billion dollars in sales, because when you say something and you reach a billion people, this could be a
15-year-old or a 16-year-old. Or a Kardashian.
Or a Kardashian, right? But instead of the Kardashians, where it was only a few people
in the past, in the 20th century, we had a few big celebrities that could do it.
We're moving to a world where each tech company wants each of us to be a Kardashian. They want
each of your kids to be the influencer. They want that to be the model for what being human and being
a kid is about. And when they do that, notice that has an effect on the other kids. The other kids
say, well, they're way more popular and successful at getting attention, and I want that attention too.
And they're transforming the cultural basis
for what our kids even want.
And again, you zoom out and you say,
China's playing chess
and we're allowing these business models
to collapse our ability to think and act well
in the 21st century.
Because we've got a lot of problems
that we have to figure out.
I know that, you know, Frances, she came on your podcast, right? The whistleblower? Yes. And I really recommend people check that out, because I think there's a longer story there. She's well worth listening to. She, I guess, said that working at Facebook was one of the most important jobs in the world, and she hopes that her papers don't discourage people from working there, because the tech changes that we're talking about have to happen from within. It can't be, I don't know, can they be legislated? Or does there just have to be a genuine will by the tech companies to have more Tristan Harrises and Frances Haugens working there? You know, you're bringing up such an excellent question, which is how are we
going to make these companies accountable? We can look to the market so you could compete with them.
You could start a new social network, but the problem is they have a monopoly on the network
effect. The reason we don't see other social networks succeeding is because they have already
owned the means by which people reach all of their other friends in a network. And so networks are
really hard to compete against. So we can't use the market mechanism. You also have to go lower in the race to the bottom of the brainstem. You have to make it more addictive, more engaging, more polarizing. So if you use market mechanisms, A, it's hard, and B, you end up creating
usually something worse. Like TikTok is competing with Facebook, but it's creating something
that has all these even more negative effects. The other way is with, let's say, culture. We could do a
user boycott. We could do, let's not use it, but notice that that doesn't really work because
they've owned the mechanism by which we can communicate with other people. So boycotts
don't really work. Also, Facebook has a project called Project Amplify, where they can actually just turn up the dial, just like China can turn up the dials on positive voices about China. When people like something about Facebook, like it helps them find their lost horse, a horse owner loses their horse and finds it because they found someone else on Facebook, whenever that happens, they can dial up positive stories about Facebook. So Facebook can control what
people are feeling and thinking about Facebook. So the culture mechanism, boycotts,
public sentiment, they control that. Then we go to states and regulation, which is what you're
now talking about. We could use that mechanism, but then we know that Facebook can actually divide
the population about any kind of proposed regulation. And as you know, privacy or Section 230 reform is not enough to deal with it. And then lastly, advertisers: we could look to the advertisers that have the powerful pressure, since they're the ones financing these whole systems, and we could try to say, we're going to do an advertiser boycott. But the reason that doesn't work, Megyn, is that unlike other industries, where there's like an 80-20 rule and a small number of advertisers make up the massive billions of your revenue, Facebook makes money from millions and millions of small businesses all around the world, so you couldn't get enough advertisers to pull
out.
And also, where would they go?
They also have to use these platforms to reach their customers.
So this is the kind of quagmire situation that we're in.
And the reason I bring up national security is I think that national security is the way
we need to see this. You know what an EMP attack is, an electromagnetic pulse attack, where it's like, I blow up a bomb above your city, and it kind of fries the electricity grid. So you get to keep your buildings, you get to keep your
people, you don't hurt anyone, but all of your infrastructure is fried. I think that these
social media companies are like a cultural EMP attack. They don't, you know, China and Russia
haven't killed anyone. They're not spilling any blood
in the streets. They're not bombing any infrastructure, but everything got fried.
Our trust broke down. We don't have the trust we need. We can't do consensus. So it's a cultural EMP for
democracy. I think we need a deeper, more fundamental response. And frankly, we don't
have anyone in the political stage who's proposing this right now. And this is why I'm really excited that we're talking. I think that every one of your
listeners and all of us need to see this as the challenge of our time. Yeah. More and more,
they're trying to introduce legislation here in the United States to try to crack down. But
to your point, does that actually do it? Is that the right way in? Or do we need the next generation
of Tristan Harrises to be thinking
about getting hired at these companies and changing them from within and demanding more
responsible tech and the stopping of the exploitation of human vulnerabilities, as you
said, when you were at Google. So two years after The Social Dilemma was released, are the social media companies slowing down?
Did they dial back?
Were they shamed out of some of their bad behavior?
There's a reason Facebook changed its name to Meta.
When Mark Zuckerberg did that, I think everybody was like, well, what the hell is that?
Why is he changing?
Everybody knows Facebook.
What the hell is Meta?
Why is he doing that?
Well, Meta ties into this thing called the metaverse, which he's invested in. He likes it. He wants it to become a thing. And there was some reporting about a deal, that it's because they want to be well prepared to enter the metaverse through gaming. So what the hell is the metaverse and why do we need this? Yeah, well, you know, as you eloquently put it, Megyn, the main reason, I think, that they announced it, obviously it's been in the works for a while, but they wanted to distract attention from Frances Haugen's whistleblower leaks about how toxic the company really is for democracy, for kids.
And they did this, you know, I think just a few weeks after those releases happened.
They also need to excite investors.
They need to say Facebook isn't just this product, you know, with the blue bar at the top and the infinite feed, the thing that's just zombifying everybody; that, you know, it's this exciting vision where we're going to be immersed in these virtual realities.
A lot of people ask about the metaverse. I think they've already been in a race to create the metaverse for a long time. Because you can think of it this way: their job, when you're harvesting attention, is to make more of it; their stock price goes up the more time people spend. And the more time people spend, the more it becomes your reality. Like, your reality actually is the technology feed, right? And in fact, Megyn,
actually just to go back to the point on teen mental health, when you look at the stats on
what are called high depressive symptoms, like self-cutting, self-harm, teen suicide, you know, those numbers tick up in the graph in 2009, 2010. It's kind of a flat line, and then it goes up like an elbow in 2009, 2010. What changed? We actually had social media before 2009, 2010. What changed in those years is it went to mobile. It became something that your kids are now 24-7 immersed in.
In other words, when it gets full on to the brain helmet of like, I'm living in this reality,
that's when it becomes more toxic because it actually is the fundamental way that you
say, what's going on in the world?
Are things peaceful?
Do I like my neighbor?
Or is it just this like, you know, infinite polarization close to civil war thing?
That feeling, we've forgotten how to escape it because we're so immersed in these technologies.
So they've been racing to create and own this virtual reality creating environment for a long
time. The metaverse is just a more extreme version of that, this virtual world. And we're going to see these other companies, you know, compete with that. But, as Frances Haugen said, we haven't been able to stick the landing on how our existing products could be good for people, good for democracy, good for society. And instead of fixing the actual problems we have, they're just jumping to this next thing.
Yeah.
That's really a problem.
We're going to take it next level.
Because what I heard on the Daily, they did a podcast on it just today.
And they were talking about how, in the metaverse, you can make your avatar be as beautiful or not
as you want. You will actually be in the position where you could potentially be paying Nike for a
pair of sneakers to wear in the metaverse. So you buy actual sneakers you wear when you're in your
real body walking around the real earth, but then you have to pay these companies for your snazzy outfit in the metaverse
where you're jetting off to Paris for a lunch. Meanwhile, you're really just sitting in your
damn basement, not actually interacting with a real live human being face to face.
That's right. And I think that there's this principle when you're developing systems that
depend on another system. If you build a virtual reality that's not caring for the real reality underneath, you're just sort of building a dissociation layer; it's not really a humane system. You're creating a virtual reality that depends upon people's embodied experiences, depends upon people being healthy, depends on people having real relationships, depends on having a real democracy underneath the virtual democracy you create. If you're creating a system that's not caring for and
tending to the things that enable that system in the first place, that's not a humane system.
And I think that's my worry, right? We're
virtualizing everything. And this is going to get a little bit spookier and spookier, Megyn,
because there's these new technologies that are coming where I can synthetically make videos of
people, of faces, of audio, and have people say, feel, or think anything. And I can also do it
with text. There's a technology called GPT, where I can actually just say, write me, create me, you know, a news article written in the voice of Megyn Kelly or Tristan Harris. And it
will create a, you know, an essay about why technology is bad written in my voice that
sounds pretty close to things that I would say. And in the future, if we're worried about
misinformation now, in the future, I can say, and I'm sorry, I'm not trying to just spook your
audience. I just, I think it's important for us to be inoculating ourselves for where these trends
are going. I can say in the future, write me an article about why the vaccine is not safe
using real charts, real graphs, real evidence, and write me a 200-page paper that'll take
statisticians weeks or months to decode. It can just flood the internet with posts about whether the vaccine is safe or not safe; no matter what it is you believe, I can just flood the internet with things that'll take forever to sort of adjudicate. And that's the point: we don't realize that the humane world fundamentally depends on recognizing the limits and vulnerabilities of
human nature. We have finite time, we have confirmation bias, we are more likely to
believe our tribe versus another tribe. We have these fundamental truths about human nature, and to be humane means to be designing for that. And that's really the choice point we're at as a species: we're building
technology that's changing things faster than we're able to keep up with or understand those
changes. We've already gone through that in the whole conversation we've had. This is about
understanding ourselves so we can understand how to limit that technology to fit with how we really work and what would make us healthy and strong.
We've seen more governments try to crack down on it. I understand Italy's given it a shot.
Some more of our European counterparts are trying their best to come down on these tech firms when
they get a little bit too big or a little bit too aggressive or a little bit too programmy. You know, you're not allowed to do the targeted ad stuff, or legislation that would make Facebook or Twitter or whoever have to get our consent to track us and our preferences and so on. We could opt in or opt out. Is that, I mean, I imagine you would say at least that's a part of it, but what else needs to be done?
Yeah, you know, it's so tricky.
I think we need a, you know, a Geneva Convention for the arms race that is who can go lower into the human brainstem and human vulnerabilities to get attention out of us.
You know, like, for example, just to make it clear, a Geneva Convention on no more beautification filters, right? Because right now it's an arms race. If TikTok does the 5%, without you even asking, just beautifies your kids' faces, and that causes them to use it more, and Instagram doesn't match them at the 6% beautification filter mark to out-compete them, it becomes an escalatory dynamic. If one of the companies does these comeback emails, so it shows you your ex-boyfriend or your ex-girlfriend, and they're successful at getting people to come back, and the other ones don't do that, they're going to get edged out.
So really what we have are classic problems in economics, right?
We have arms races and tragedy of the commons.
If I don't do it, the other guy will.
And this is where we need regulation to protect that.
This is not about speech.
It's not about saying, do we like that guy's speech or not?
Should we de-platform them or not?
It's about making sure that we're designing the attention commons to serve our society,
to serve mental health, to serve democracy, to enable us to be a better society, again,
in competition with real geopolitical actors who are building a different kind of society
in a different system.
So yes, it's great when you have certain things like maybe banning micro-targeted advertising,
there's some proposals like the Honest Ads Act, things like that. Frances Haugen has proposed,
you know, this is also an international problem, publishing for every market, for every country,
the top links that are getting the most engagement and traffic above a million views,
so that every country with their own investigative journalists can look at how is this rewarding the most engaging stuff?
Because you have especially these vulnerable countries, where the Philippines is going completely off to full-on authoritarian kind of crazy town, because it rewards the most crazy stuff.
And there's very few content moderators for these languages.
Facebook wants to invest.
Think of it this way: in the US, we're getting the best of these experiences, because they have so much pressure in the US that they're putting the most
resources into protecting election integrity or whatever they can. But if I'm Russia or China,
I'll just throw my disinformation into Central America and Haiti and say,
hey guys, there's like a free border opening. You can just run across the border. So I can
weaponize the rest of the countries that have less investment from Facebook and less protection and start steering people in
Spanish, right? So you get a sense of the global nature of this problem and that, yes, we have
certain efforts that we want to celebrate when governments do take small actions, if they're
the right ones. But unfortunately, I don't think these are the comprehensive things that we need.
We really need a 21st century democracy protection program. And it's not things like censorship or free speech. It's really about how do we incentivize the right kinds of technology to get built. I read that you are, quote, slightly obsessive about what equals time well spent in your life, right? And I love that. And I saw that you did the tango. I'm like, that's amazing. You, as somebody who's lived this firsthand, understand the importance of prioritizing life away from the device, that I've gleaned. But walk us through practical advice for individual humans and what they can and cannot do. Yeah, you know, there's a bunch of things.
I mean, the easiest thing,
you can hear this conversation,
people say things like,
oh, just delete your social media apps off your phone.
You can still go to the website if you really had to,
but you could delete the social media apps off your phone.
Now, notice that when I say that,
a person's receiving that information,
they might take it in,
but are you really about to do it?
No.
Think to yourself right now in this moment, am I really going to lose something if I actually
just delete the app, the app itself off my phone?
You can hold it down. It starts to wiggle, and you can hit that delete button.
You can actually do it.
There's nothing they can do to stop you from doing that.
You can turn off notifications in general, turning off all notifications.
There's really very few things that are life-worthy notifications, that are time well
spent, that genuinely notify you. Most people don't change their notification settings. I forgot
the stat, but it's like almost no one goes into their settings and tries to tweak all these things,
right? It's a ton of work. This is also where Apple can do so much more, right? Apple is kind
of like a central bank or kind of a governor oversight body for the attention economy, and they can set better defaults. So it's less noisy,
less toxic. But there are things like this, you know, there's an organization called Wait Until
Eighth. That's about waiting to give kids a smartphone until, I think, 13 years old.
Obviously, if your kids are using social media already and all their friends are on it, I just want to acknowledge and be compassionate about how difficult that must be as a parent, because you can't tell
your kid to not participate where their friends are. Can you organize a group of parents or a
school to say, can we get the kids to stop chatting with each other on Instagram, which is the text medium for them? Can we get them to move to something like a big iMessage thread or a WhatsApp thread, or ideally not WhatsApp, maybe Signal, or something that doesn't have that incentive, right?
Doing FaceTime calls between kids as opposed to sending beautified photos back and forth.
I mean, there's so many issues, Megyn, we didn't cover cyberbullying, the way that nude
photos get shared between kids and the pressure that that puts.
There's so many other aspects.
But what we really need, there's a great film, by the way, called Childhood 2.0 that I recommend as well. That's really more about how kids are
facing these dynamics. And we can all just advocate for a better world, right? Recognizing
that these systems are not built for us. We are the product, not the customer. So long as their
business model is selling atomized blocks of human attention, it's just like trees being worth more as two by fours and lumber. So long as that's true, we're going to cut down trees and turn them into two by fours.
Well, we are worth more as dead slabs of human behavior when we're addicted, outraged, polarized,
narcissistic, anxious, and not sleeping because that's profitable for these companies. Even the
CEO of Netflix said our biggest competitor is sleep. You know, keeping you up till two in the
morning, three in the morning is more profitable than your kids going to bed at night.
So just realizing the asymmetry of power between what the technology companies are doing and
what we're capable of, and just honoring that and then saying, how do we bring and restore
power and agency to ourselves?
I have to say, it would be dishonest for me not to mention the cable news model right
now. It's not dissimilar. I mean, one of the reasons I left cable news is because I was just
tired of being part of the outrage machine. It was every day, all day. It's not spoken out loud,
but it's obvious that they're looking to press people's buttons and make them upset. And they're
experts at it. They know exactly how to do it.
You know, one of the reasons I left Fox and went to NBC,
a place where I didn't ultimately belong,
was I was just desperate to get away from it.
I was very attracted to the idea of not doing politics every day
and doing something that felt better and was more Oprah-esque
and just like, you know, uplifted people as opposed to outraged people.
But I'm at heart a newswoman and I had to get back to news.
But this is one of the reasons I like my current job is, yes, there's room for outrage.
And there are some things that really piss people off.
And I understand that.
But that can't be your only diet in life.
If you want to be a well-rounded, occasionally joyful human, you have to have other meals.
Exactly. Exactly. And I think just realizing like, you know, look, I, I like all of us,
you know, we have to look at the news and I do dip into Twitter to try to figure out what's
going on in the world. But I think just recognizing it makes us feel awful because you're just seeing
the things that are meant to enrage you. And I think we just have to realize
how strange that kind of stimulus is. We've never before had a supercomputer assemble everything that would
maximally make you angry and then deliver this like sort of lab generated political red meat
to just plop on your plate every single day. It's just realizing this is not healthy for us.
And we can choose to really limit that. And mostly, I mean, there's ways people can set up lists on Twitter. I mean, in general, try not to use it. Try to focus on: what are long-form news sources that you trust? Who are people that are doing a good job of steel-manning the other position? Can I understand, and even say, why the other side values what they value? There's a great quote by John Perry Barlow: never assume that someone else's motives aren't as noble to them as yours are to you.
And I think we really need to honor each other. Like we are fellow countrymen and women in a
democracy. I really worry about civil war. I really think that this is the key to it. I think
we've got to all become conscious and step up. I also think that there are certain brokers
of anger who wear it on their sleeves. You can tell they're angry and they want you to be angry
too. I used to joke that that was the motto of New Yorkers: Welcome to New York. We're angry and we want you to be angry too. But you can make a smart choice in terms of delivery. You can tell
when someone's upset constantly and you tune into them because you want to be upset. Make a
different choice. You don't actually have to go that hardcore to stay up with news and information, because this is the field in which, you know, you get attacked often with the anger bombs. I guess my imagination tells me, because I don't
go on TikTok, that that's more of a manipulation in a different way. They're not trying to make
our kids angry. They're just trying to make them addicts. That's right. It's really just about making them addicts and making them influencers. Their idea there is to make you addicted, not just to using TikTok, but addicted to getting attention from other people, to turn you into like an attention vampire that's never satisfied and always wants attention from other people. That's a way more profitable kind of kid than a kid who's self-satisfied, who's sovereign in their identity, who's taking responsibility for themselves and who's developing healthy relationships
outside the screen. I mean, that's the basic thing, right? It's like life is better when we're
having dinner with each other and going hiking with each other and going on camping trips or
whatever it is that we love doing. But none of those things that I just mentioned are profitable
to TikTok. They don't make money when you go on hiking trips. They don't make money when you
start a language project.
You don't have kids yet, correct?
I don't have kids yet, no.
Okay, so if you have kids,
would you let them do any social media?
I mean, would you let them do TikTok
or Facebook in particular?
I personally would really not.
And I know that we're very far down that road.
I think what I would tell people is just notice,
and we said this in the social dilemma,
most of the people I know in the tech industry do not let their own kids use social media. That should tell you everything. The CEO of Lunchables Foods, Megyn, did not let
their own kids eat Lunchables Foods. And that was one of the most successful food product lines in
the country. It is frequently the case that when people aren't eating their own dog food,
there's a real problem, right? And by the way, that's a simple, ethical, and moral standard. If we lived in a world where the only
technology we made was the ones that we happily endorsed that our own children use for long
periods of time per day, think about how much better the world would be. I grew up in a world
where technology was empowering, right? I grew up learning how to program, how to make things,
graphic design, making music on technology. Those are things that if we're making them,
we would say we would want our kids to use. But you have a whole portion of an industry, the dominant one, worth a trillion dollars, that people don't want their own kids using. I think that's such an easy standard to apply for the world that we would
want that would make our society stronger. You have a sign I read, is it on your laptop
that reads, do not open without intention? So what is the
intention that you keep in mind that we should keep in mind when flipping open our laptops?
You know, I just want to say, first of all, I am like every other human being. I'm a, you know,
I'm a meat suit with paleolithic emotions and I'm easily hacked.
And I care about what other people say about me.
And if someone says I'm exaggerating, right?
Like we're just human.
I think that first step is self-compassion.
And yes, I do have a little sticker on my laptop that says, do not open without intention.
It's a subtle way to try to remind myself, why am I here?
What do I want to do here?
What's time well spent for me in my life?
And we're living at very interesting times. And I think we should each ask ourselves,
what is time well spent for us? It's a way of asking what's a life well lived. Time well spent
added up over a lifetime is what is a life well lived? And I think we're each capable of asking
ourselves that question. And we have to just honor when we go off the rails, it's fine. Just come
back. It's just like a mindfulness exercise. You know, you notice that your attention wanders and you just come right
back. I think about, I say this to my audience too, but remember that when it comes to tech,
the laptop, the iPhone, news consumption, garbage in, garbage out, you know, you're in control of
what you expose yourself to. And the same way you would protect your child, ideally from what kind
of movies he or she is going to watch at age six, you need to protect yourself against sourcing that wants to mislead you, anger you, upset you, manipulate you.
You know, you have to be the parent to yourself.
Tristan is sticking with us to take your calls.
I'm very excited about that.
The lines are lighting up, so I'll squeeze in a quick break.
We'll come back and start talking to you.
So our first caller is Janice from California. Janice, what's your question?
Well, I have one of my children that works for Facebook and we've gone from having a relationship where I would speak to her on a regular basis throughout the week to I barely hear from her anymore.
And it's an argument because the philosophy that she has taken on is so much different.
And she made a comment to me last week that really sent chills up my spine.
And that was that the metaverse is real, that she can own a house in the metaverse that
she can't own in the actual world. And I find that to be chilling because at the end of the day,
the metaverse isn't real. It's virtual. It's the matrix basically. And the real world is where we
all live and should live. And it has taken our children from living outdoors and enjoying life to being sucked
in to this screen and living inside and having no life and very few physical contacts with
anybody.
Are we so far gone that we can't fix that, I guess, is the question.
Oh, my goodness.
Thank you for that, Janice.
My gosh, I'm sorry you're going through that.
Go ahead, Tristan.
Yeah.
Thank you, Janice.
Yeah, I also know a lot of people who work at the company, have worked at the company.
I, you know, here's how I think about it.
I think human beings are always tempted by faster, better, more efficient, right?
So when Uber makes it so that you can order a taxi and it becomes faster, better, more
efficient, more reliable, why wouldn't you switch from a taxi to an Uber?
When Instacart makes it faster, better, cheaper, more efficient to do that, you're going to
switch.
Think about your phone right now, just forget the metaverse even, just our phone. If I'm sitting with someone and the
conversation is starting to get boring in a group conversation, the reality that I'm in isn't as
sweet as the reality that I can taste by just quickly pressing, you know, pulling my slot
machine and seeing what I'm going to get. I can instantly access a sweeter feeling, a sweeter
taste. What you're talking about with the metaverse and the house is the same thing, I think of it as the same example, right? I can get a better self-image with my beautification filter in my metaverse than I can by looking in the mirror and seeing that I haven't been, you know, very healthy lately while I've been living in the metaverse. I can get a sweeter looking
house that I can virtually live in there than I might be able
to afford in the real world. So I think what this is presenting to us as a species is a choice to
really recognize in ourselves just because it might taste sweeter to go look at my phone and
run away from my anxiety and run away from the kind of boring conversation, it doesn't mean I have to do that.
I can be aware of that feeling and I can say, yes, but what do I actually really want to
invest in here?
And I can take a breath.
I can be present with someone.
I can redirect the conversation.
I think that that's really the reckoning that we're in the middle of.
Because as you said, we're right at this precipice.
I think I understand the feeling that your daughter who works at Facebook was mentioning.
But we have to make a choice.
I think the good news is I think a lot of people feel very skeptical and afraid of the
metaverse as a vision for the future.
I think that was most people's response to Mark Zuckerberg's announcement in that video.
So I think that's the good news.
We're really recognizing that.
Can I ask you something, Tristan?
You mentioned a few times.
I just want to circle back to the slot machine aspect of the iPhone. When I was training my dog, my friend, who's a dog
trainer said, don't give the dog a treat every time he sits when you tell him to sit, only give
it sporadically because it's more exciting for the dog if he doesn't know which time he's going to
get the treat. And she was saying that that's why the slot machines work so well.
It's better.
It's more addictive.
It works better at controlling behavior if it's up in the air. They know that.
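For the curious, the dog-treat point Kelly is making here is the classic variable-ratio reward schedule, and a toy simulation shows why intermittent rewards keep a subject pulling long after the rewards dry up. The quitting rule and every number below are invented purely for illustration; this is not a model of any real app, casino, or dog-training regimen.

```python
# Toy illustration of fixed vs. variable reward schedules.
# Assumption: the subject gives up once the current dry spell is longer
# than any dry spell it experienced during "training". That rule and all
# probabilities are made up for illustration only.

import random

def futile_pulls_after_rewards_stop(reward_prob: float,
                                    training_pulls: int = 200,
                                    trials: int = 2_000) -> float:
    """Average number of unrewarded pulls tolerated once rewards stop entirely."""
    total = 0
    for _ in range(trials):
        longest_dry = 0
        dry = 0
        for _ in range(training_pulls):
            if random.random() < reward_prob:
                dry = 0                     # a reward resets the dry spell
            else:
                dry += 1
                longest_dry = max(longest_dry, dry)
        # Rewards now stop: the subject keeps pulling until the dry spell
        # feels unprecedented, i.e. longest_dry + 1 futile pulls.
        total += longest_dry + 1
    return total / trials

random.seed(0)
print("treat every time:      ", futile_pulls_after_rewards_stop(1.0))   # quits almost instantly
print("treat one time in four:", futile_pulls_after_rewards_stop(0.25))  # keeps pulling far longer
```

Under these assumptions the predictably rewarded subject notices immediately when the treats stop, while the intermittently rewarded one keeps pulling for many more tries, which is the "up in the air" effect Kelly's dog trainer described.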
Well, it's just, it's such an interesting example you mentioned,
because this is not a story that I've really told publicly before, I don't think.
I know one of the very first designers of the Facebook newsfeed.
A lot of people don't remember this: Facebook didn't used to be just an infinitely scrolling newsfeed. That was a design they moved to later. At the very beginning, it was more like a contact book and address book. You'd be able to click on friends and browse through profiles and search for a friend. There wasn't an aggregated
feed that said, here's everything that changed since you've last been here. What she told me,
the thing that transformed the use of Facebook was making that feed
infinitely scrolling. So it used to be, you have to hit like next or, you know, things like that.
And the second was on the mouse. Remember, mice and trackpads didn't used to have two-finger scrolling where you could just scroll; it used to be, you have to move your mouse to the scroll bar, click on the scroll bar, and then drag.
People don't really remember the details of this,
but it used to work that way.
On a current computer, typically you just take two fingers,
you put it on the thing and you just go like this, right?
With your finger.
That makes it so your hand never has to leave
its resting position.
And then it hit me because I heard the designer
of slot machines say, your hand never has to leave
its resting position.
Then they designed the slot machine so that you just go click, click, click to get the next thing.
The key thing that made the Facebook feed so addictive and absorbing is that they moved it to where you don't have to move your hand. You just do this. And it's the same thing on the phone right now, obviously: scroll, scroll, scroll. And that's the thing that makes it like a slot machine; you're just stuck in that one absorbing kind of experience.
So anyway, it was just fascinating that this designer had told me that that was a really key change in the way that Facebook was designed.
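The design change Harris describes, from a paginated feed you must deliberately advance to a feed that always has one more item ready, can be sketched in a few lines. This is a purely illustrative sketch in Python, not Facebook's implementation; the function names and page size are assumptions.

```python
# Illustrative contrast between a paginated feed (old pattern, requires a
# deliberate "next" action) and an endless feed (new pattern, never runs out).
# Names and page size are invented for illustration only.

from itertools import count, islice

POSTS_PER_PAGE = 10

def paginated_feed(page: int) -> list[str]:
    """Old pattern: the user must explicitly request each page."""
    start = page * POSTS_PER_PAGE
    return [f"post {i}" for i in range(start, start + POSTS_PER_PAGE)]

def infinite_feed():
    """New pattern: a generator with no natural stopping point;
    every flick of two fingers yields more."""
    for i in count():
        yield f"post {i}"

print(paginated_feed(page=0)[:3])          # finite, bounded, ends
print(list(islice(infinite_feed(), 3)))    # could continue forever
```

The behavioral point is that the paginated version has a built-in stopping cue and the endless version does not, which is the "hand never leaves its resting position" dynamic he compares to a slot machine.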
What are the other things they do in the casinos? I mean, we know that they don't put clocks in them
and that you can't get out, you can never find the exit. But they are masters of manipulation. Yeah, I mean, I think it's about designing for
absorption. They want to design for flow. They want the fewest number of interruptions. Actually,
one of the things they do, as I understand it from Natasha, I think her book is called Addiction by
Design, is when you start losing, like if you start losing money or you're about to leave,
you're about to walk away from the machine, I think someone will come up to you and give you
a coupon that's like, get the free buffet, you know, so you stay in the
environment. And it's very similar to Facebook, right? Like you're about to leave, you're about
to scroll away. And so it starts to email you more aggressively. Here's the ex-boyfriend,
here's the ex-girlfriend. And you're just designing for re-engagement, for flow, for
hooking people. And there, by the way, there are books, there's whole conferences called
Hooked that teach people these kinds of techniques. This is not like a fair situation.
And nowhere was it written that this is how technology should be, right?
Again, in the 80s and the 90s, when I was growing up, technology wasn't designed to
just addict and manipulate you.
It was a creative tool.
It was a bicycle for the mind.
It can be that again.
And we're trying to emphasize that vision for technology with our work.
But obviously, it's going to take a while to get there.
One other question.
Do you limit the time you can spend online if you have to practice the tango and so on?
Like, do you say?
I haven't done it in a while because of the pandemic.
Right.
But do you say like no more than 30 minutes or half an hour, an hour a day?
I try.
I mean, listen, one of the things people should know is that your willpower, um, you know, your ability to really be aware and take responsibility and resist, you know, to be the second-marshmallow person versus the first-marshmallow person, that ability wanes, especially as it gets late at night. So, um, you know,
one of the things is don't use social media right before bed. Two reasons. One, it'll make you depressed and ruin your dreams.
Second, your ability to sort of have your brain re-engage and wake up actually diminishes
late at night, as opposed to early in the morning, you have more willpower, right?
There's many studies on this.
And so just be aware of the battery of our own volition, and that free will does exist, but it's something we have to protect.
And our sovereignty, our very ability to make free choices is the thing that's under assault
by these technologies, pointing a supercomputer at our brain and keeping us scrolling like rats searching for pellets.
Right. You know, it's funny because I had a woman on the other week who talked about the future,
and she's somebody who studies trends and where we're going when it comes to the future. And one of the things she was saying was,
you're going to, you're going to not walk around with a phone. Your shirt is going to have the
abilities that your phone has today. It's going to be able to Google things or tell you directions
or even potentially game. You'll have glasses on that can do all of that. You're not going to
be having to hold a device. And it sounded very cool.
And now I'm thinking it's sounding very toxic, very scary, and something, again, like we
don't want.
Yeah.
Well, I think that the difference is going to be the immersing, especially the visual
system.
So much of our brains are devoted to vision, right?
And so when information comes in visually, it really takes up the kind of full space
of our minds.
One of the reasons I love podcasts is I can be cleaning the dishes.
I can be going on a hike and I can listen to something, right?
So actually I'm excited about the potential for auditory technology, right?
That's more blended into our lives, but is more passive.
It's like not taking over the full visual system.
I want things to take over the visual system when I'm doing something creative.
If I'm making music, if I'm writing code, if I'm writing an essay, but I don't want any of that
addiction engagement complex, these just massive behavior modification empires that treat us like
the product and just want to suck it all out of us. We would never want that to be built into
these visual environments. And that's really the mistake that we made. If we don't have these visual environments occupied with these horrible business models, I'm sitting on a computer right now with you, this would be a fantastic machine. It's just those business models that really ruin those visual mediums.
I'm glad to hear you say that because I love podcasts. We were just talking about this with
my producers about how you get these alarming notifications from your phone. Your usage was
up 14%. You were on your phone for nine hours a day. Well, in our case, A, we're in news and we program a news show. So it's going to
be a lot. But B, we all listen to a ton of podcasts. We get our news there and it's an
important source and it's a delightful way of getting your news. Totally. And it provides that
space for complexity and nuance. So we can really talk about, you know, how the issues aren't as black and white, as simple. We're not fitting it into just shit-posting at each other, because we get to really just talk and debate, like, well, why would that perspective be valid? Let's talk about it. What would the other side say to that perspective? We can really work
it out. I love podcasts as well. I think podcasts are a humane technology. Yeah. And you can go to
so many different places. You know, I mean, your podcast sounds amazing. I'm downloading this and becoming a fan today. But I also think, you know, I like, well, I like crime that doesn't stress me out, unless I'm, you know, directly related to the people involved. But I do think it's interesting to listen to crime investigation techniques and so on. Somehow that's soothing to me, because I'm an odd bird. But I like that you can take a show like this and you can take a break and do, you know, do an interview like this, right? Or, we did, oh God, we've done a lot of feature interviews that just sort of
take the focus off of the intensity of nasty news, politics, back and forth. It's one of the beauties
of technology, right? Because we are advancing in a way that makes some consumption more enjoyable
and less toxic. Totally, totally. If you do listen to our podcast, there's one I really recommend. And it's
the one with Audrey Tang, the digital minister of Taiwan. I know that sounds like a bizarre thing,
like why should we look to Taiwan for how they're doing democracy? But it's really inspiring what
she's done there. And it's also inspiring because China can broadcast to its own people, hey, look how dysfunctional democracy is, and broadcast, you know, all the dynamics here and in Europe. But the reason that Taiwan is so threatening to China, beyond the fact
that it's, you know, strategically important, they want it, is that it's an example of a really
well working democracy for people who look and talk just like them, but are under a completely
different governance model. So it's a really big threat to China. And I think there's actually
reasons why we should be invested in Taiwan being a very strong digital democracy, something we should dial up as much as we can,
because it shows that there's a different model than the thing that they're projecting into the
world. And that interview was just a really good and inspiring interview.
Well, let's hope Audrey has the ability to continue broadcasting for the rest of her life
without any interference or any takeovers. So interesting. Tristan, thank you for being so brave,
for doing what you do for calling our attention to all of this.
You've done a huge public service. I'm very grateful.
Really my pleasure to talk with you, Megyn.
Thank you so much for making time for your audience. Yeah.
Yeah. Let's do it again. Wow. He was amazing.
And tomorrow's guest is amazing too.
Do not forget to download tomorrow's show because the one and only Goldie Hawn is going to be here.
I love this woman.
I adore her like every normal person.
And she is so much more interesting than you even suspect.
She's tough.
She came up in Hollywood at a time when it was not so easy for very young, beautiful gals like Goldie. And she's got
some stories that will shock you, but it will make you appreciate her grit. I mean, she never lost
her joy, her sense of humor, her ability to laugh and make us laugh, despite a lot of crap that that
industry, that disgusting industry that lectures us all the time on how to be better people,
threw at her.
So I think you're going to love her insider's perspective, her push towards mindfulness,
because she spent a lifetime working on that too. Her beautiful relationship with Kurt Russell.
I've said before, the last time I interviewed her, she had a great line about him. Of course,
most of American women love Kurt Russell. And she said, I said, how'd you make it work for so long?
And she said, well, they say the grass is always greener, but it never was for me.
She's great.
You'll hear her tomorrow.
Don't miss that.
And until then, we'll see you soon.
Thanks for listening to The Megyn Kelly Show.
No BS, no agenda, and no fear.