Your Undivided Attention - Mr. Harris Goes to Washington
Episode Date: January 30, 2020
What difference does a few hours of Congressional testimony make? Tristan takes us behind the scenes of his January 8th testimony to the Energy and Commerce Committee on disinformation in the digital age. With just minutes to answer each lawmaker's questions, he speaks with Committee members about how the urgency and complexity of humane technology issues pose an immense challenge. Tristan returned hopeful, and though it sometimes feels like Groundhog Day, each trip to DC reveals evolving conversations, advancing legislation, deeper understanding and stronger coalitions.
Transcript
Hey, everyone, we're going to try something a little bit different for this episode of your undivided attention.
Tristan just went to Congress to testify.
Mr. Harris, you were recognized now for five minutes.
The hearing was on Americans at Risk: Manipulation and Deception in the Digital Age.
Thank you, Chairwoman Schakowsky, and members. Really appreciate you inviting me here.
So why did we go to Congress?
I'm going to go off script. I come here because I'm incredibly concerned.
We've often framed these issues as, we've got a few bad apples. We've got these bad deepfakes. We've got to get them off the platform. We've got this bad
content. We've got these bad bots. And we've got these dark patterns. What I want to argue is
we have dark infrastructure. This is now the infrastructure by which 2.7 billion people,
bigger than the size of Christianity, make sense of the world. And if private companies went along and built nuclear power plants all across the United States, and they started melting down, and they said, well, it's your responsibility to have hazmat suits and, you know, have a radiation kit, that's essentially what we're experiencing now. The responsibility is
being put on consumers when, in fact, if it's the infrastructure, it should be put on the people
building that infrastructure. We thought instead of us interviewing someone, I'm actually just going
to interview Tristan about what that experience was like.
I'm Aza Raskin.
I'm Tristan Harris, and this is Your Undivided Attention.
So there you are, and you're up there with three other witnesses: Monika Bickert from public policy at Facebook, a law professor who studies dark patterns, and Joan Donovan at the Harvard Kennedy School. Think about how important this conversation
is and what's going wrong and how much is at stake. The format is five minutes opening
statement. And then each Congress member has five minutes to ask their questions.
Usually it takes them two minutes or so to tee up their question. And then there's a response
from one person or two people. It's just very short. And it's actually a perfect example of the
problems of the attention economy where, you know, there you have a presidential debate.
And it's like, how are we going to solve the Middle East?
You know, what's the U.S. policy you advocate for?
You have 30 seconds go.
It's just not the right format.
I know I'm running out of time.
So I'm out of time.
Thank you, Mr. Harris.
I ran out of time.
I wish I had more time.
One thing I struggle with is just how can we have the space for people to really understand these issues?
Yeah.
This is why we started this podcast because we wanted the space to immerse people into these ideas.
From a humane perspective, how would we redesign the hearing?
That's a great question.
I mean, I think hearings are more about, they're not about legislating the answer.
They're about understanding.
They're about creating shared understanding.
Now, here's the background of the reality, right?
You know, the members of Congress in this hearing had actually a second hearing going on at the same
time on health care.
So actually people were going in and out of the room.
Wait, really?
Yeah, this is actually something people don't know.
And there's sort of an optical illusion because from the television or from the live stream
or something, you only see the witnesses and you see an individual questioner, right?
What you miss, what the camera doesn't show you. It's like a magic trick. The background is that half of the
representatives are moving back and forth in and out of the space because they have to go to
another hearing and in case it's their time to ask the question, they have to be there to ask the
question. And then they move back into this hearing and then they ask their question and then they
leave. Where's the context, then? It means the really important thing that one of the witnesses said isn't actually heard by most of the people. Or if you have
people who've pre-prepared questions that no matter what the person behind them just said that
might have even invalidated the frame of the question they're about to ask, they're still going to
ask that question. And the way the media records that question actually hides the fact that there
might have been a total disconnect between the assumptions that questioner B asks right after
questioner A. We need processes that actually create shared understanding. And a lot of that actually
had to happen behind closed doors, not closed doors, like private doors, but in the sense
of actually one-on-one meetings. And that is much more effective. Yeah. Well, you came back hopeful.
Yeah, I did come back hopeful because it's not like everyone in the U.S. government has been
listening to Your Undivided Attention and following, you know, all of these harms from addiction,
shortening attention spans, social isolation, reduced critical thinking, election engineering,
etc. They haven't been following all that.
Yeah. So what has me optimistic is that when people understand that these harms are not
like cultural trends that are happening in society by accident, they're happening by design
of the business model. And if enough people know that there's a problem and the whole world
knows that the government knows that there's a problem, then it's a matter of sort of shame and
embarrassment for there to be no action. Obviously, we live in a very partisan climate right now,
But I think that, you know, strangely, I think there's a lot of people, a lot of allies in Capitol Hill who really want to do something.
So talk to me a little bit about the Overton window.
Like, what is it like, one, to just sit in front of senators and explain such a fundamental issue?
And then, two, what's changed between sort of the last time and this time?
What's now in the Overton window?
What can we say now that we couldn't say before?
We had several really incredible meetings with different members of Congress, senators, different agencies who have a large appetite. One thing I was excited to see was the conversation moving from privacy,
which is important, but not sufficient. And we wrote this op-ed in the New York Times recently
where we challenge people. Step into your privacy utopia. You've perfectly protected people's
data. You still see mass shortening of attention spans, loss in critical thinking, loss of romantic
intimacy, social isolation, mental health, election engineering, disinformation, outrageification,
narcissism because all of those effects are still part of this other business model. So here's the
thing that's exciting, though. It used to be the case that people thought privacy would address all
of these problems. But I think there's a real awareness now that we have to do something about
these broader issues. And I think one of the biggest aspects of why it's been hard to do that
is it can feel overwhelming. Yeah. Like it's a long list of harms. Yeah. And it feels like they're
coming from all sides. But actually, there's one generator function, which is this race to automate human attention.
Many people when you meet them behind closed doors are really eager to solve these problems
and really eager to look at things like Section 230 of the Communications Decency Act
and unfair and deceptive practices in sort of the Federal Trade Commission rulemaking frameworks.
Did you see that there was bipartisan support for this issue?
Or did you see that it broke along party lines?
What did you see?
Well, whenever the issue turns into free speech, it's just an endless rabbit hole that's not going to go anywhere, right?
For example, let's take this hearing.
So the hearing was about Americans at Risk, online deception, and deepfakes.
So let's take deepfakes.
Well, who's to say that that deepfake over there, that's faking an image of your favorite politician saying something funny, is just funny, satire, irony?
That's just a free speech rabbit hole.
The real question is, how fast does it travel?
Is it labeled as a deep fake?
Is there a clear disclosure, right?
The key thing is context.
There's an interesting distinction between content and context.
Free speech is a content conversation.
Free reach is a context conversation.
How fast do things spread?
Describe that a little bit more.
Why is that context?
Let me give an example that people will probably get from their own experience of Facebook.
Content might be something like, I posted this article that was true, but it might be false context, because it was actually an article from five years ago, and we've since debunked that article. So it's not that it's fake news, it's just that it's old, and when it was posted in the news feed, it's presented as if it's brand new. So that's a context problem, not a content problem.
And context is the medium of our understanding. Our friend Brewster Kahle at the Internet Archive says that there's this phrase that's common in the free speech community, that the solution to bad speech is more speech. But we know that doesn't work, because if your dependency is that you need more speech to battle bad speech, well, then I'm just going to flood the channel with bad speech, disinformation, noise, until you're apathetic and won't do anything. That's right. And so Brewster has this great phrase: the solution to bad speech is better context.
Yeah.
Which is to say to contextualize and understand that, you know, if that's a conspiracy theory,
let's actually show that a lot of people think that that's in doubt and that's a conspiracy.
We can make some distinctions there, like labeling. Do people know that something that is a deepfake is a deepfake? That's a good example.
Yeah. Yeah. And then I like this idea of amplification liability. That is, if you are going to amplify, then the scale of the amplification should be tied to the scale of your liability.
I think last time we were talking about the idea of an attention jail that Facebook could
implement right now.
There was a phrase you used that really, really struck me.
We've often talked about how technology in the business models in particular are eroding
the fabric of society.
And the phrase you used is that it's not so much that it's eroding the fabric of society, it's that it's become the fabric of society.
Yeah, this is really critical.
I mean, the point we were making is that technology has become the infrastructure for every aspect of our social world.
It's sort of like if you got a prosthetic limb.
Like, it is your limb.
You're holding this object.
It is now the genuine extension of how you wake up in the morning, how you go to bed at night, how you open your mouth to say something to someone.
So, okay, that's not necessarily bad.
You know, you can have technology take over infrastructure. The problem is if there's a commercializing interest on intimate infrastructure that's become the way that you communicate. So now the very meaning of communication has a commercial interest.
There's no way, basically, for you to send a message from person A to person B without it being
paid for by person C who has an interest about what they want that context to be. That rhymed.
Therefore it's true.
Therefore it's true. There's a great psychology study about that, right?
If it rhymes, people view it as more true. If it's alliterative, people view it as more true.
Yeah, let's see. What was, what were we just talking about? We're talking about social fabric, and I think we're about to move into social organs.
Yeah, so they have become the social organs of our society. You know, YouTube has become the broadcast organ for video, right? You don't really get to go somewhere else. It's the infrastructure for all, you know, video, et cetera. The main point I made in the hearing is that, okay, so take this world down here, this world of atoms. You know, we've got a police force in case something goes wrong on the streets. We've got an FTC in case there's fraud happening, you know, unlawfully, from a business. We've got a Pentagon in case Russia tries to fly a plane into
the country. Okay, so that's the physical space. But once I go up into the virtual space,
now Russia can fly an information plane in. Now, where's the Pentagon? Instead of the Pentagon
and thousands of military employees, et cetera, we've got 50 people on a trust and safety team
who've been rolling over with employee turnover because it's been a tough time the last couple
years. And that's the best thing we've got. So the point is that when we move from the physical
infrastructure, to the virtual, to the cyber, you know, internet world, we lose all the protections.
You know, during the hearing, I said, we used to have Saturday morning cartoons. We protected
children from certain kinds of advertising, time, place, manner restrictions. When YouTube gobbles
up that part of the attention economy, we lose all those protections. So why not bring back
the protections of Saturday morning? That's, by the way, the same thing as context. It's time,
manner, and place restrictions on what times you can advertise to children or you can't do adult
television during the prime time hours, things like that. We lost all those protections. They're all
gone. And that's happening with everything, whether it's how we treat teenagers, childhood development,
child education, election advertising. We used to have fair, equal-priced election ads, so it should cost
the same for candidate A to run an ad on TV at 7 p.m. as candidate B. When Facebook gobbles up
election advertising, we just removed all of those same protections. So we're basically moving from
a lawful society to an unlawful virtual internet society. And that's what we have to change. Marc Andreessen has the very famous quote, which is
software is eating the world, which means that software is eating our protections. Yeah. So there's
this example of a polarization study that Upturn just released. Upturn is this great
nonprofit organization. And what they were studying was how far does your dollar go to generate
clicks on advertising? And is it fair across the aisle? So what they did was they said,
okay, here's a bunch of money. And let's say we have a left-leaning message. And we try to
advertise that to a left-leaning audience. Well, it's cheaper to put a left-leaning
message and have it be reaching a left-leaning audience than it is for a left-leaning
message to reach a right-leaning audience. Same thing is true on the right-leaning, obviously.
It's cheaper for a right-leaning message to reach a right-leaning audience.
So in other words, polarization is part of the business model. It's a polarization-for-profit business model, because it costs more to reach across the aisle than it does to have that $1 reach your own constituency.
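To make that pricing asymmetry concrete, here is a minimal sketch in TypeScript. The cost-per-click numbers are hypothetical illustrations, not figures from the Upturn study; the point is only how a fixed budget, optimized for cheap engagement, drifts toward your own side of the aisle.

```typescript
// Hypothetical illustration of the pricing asymmetry described above.
// These rates are invented for the sketch, not taken from the Upturn study.

interface Placement {
  audience: "in-group" | "cross-aisle";
  costPerClick: number; // dollars per click for the same political message
}

const placements: Placement[] = [
  { audience: "in-group", costPerClick: 0.25 },   // like-minded audience: cheap
  { audience: "cross-aisle", costPerClick: 0.5 }, // across the aisle: expensive
];

const budget = 1000; // a fixed ad budget in dollars

for (const p of placements) {
  const clicks = Math.floor(budget / p.costPerClick);
  console.log(`$${budget} spent ${p.audience} buys ~${clicks} clicks`);
}
// => $1000 spent in-group buys ~4000 clicks
// => $1000 spent cross-aisle buys ~2000 clicks
//
// A campaign maximizing clicks per dollar is therefore structurally pushed
// toward preaching to its own side: polarization as a property of the pricing.
```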
So now, imagine just billions of dollars. Like, let's say there's no Russia, there's no one trying to manipulate the system.
We clear all the bad guys out.
Now we just have Facebook itself. Like, we don't have bad apples. We don't have bad actors.
Okay.
So we've got billions of dollars rushing into the system, flooding the system.
But now we have this weird invisible sorting function, like Moses, just like splitting the seas of society, where it intrinsically has left-leaning messages
reach left-leaning audiences more efficiently and more profitably and right-leaning messages
reaching right-leaning audiences,
and then the way you continue to advertise
is to get more and more extreme.
So it's like Moses.
It's sort of splitting the oceans of society.
And, you know, I said in the testimony,
this is really serious.
This is like Civil War accelerating polarization.
I mean, the polarization dynamics
are accelerating towards Civil War-level things.
Hashtag Civil War is coming.
Our colleague Renée DiResta says,
if you can make it trend, you can make it true.
When you're planting these suggestions
and getting people to even think those thoughts
because you can manipulate the architecture.
We're profiting.
As I said, we're subsidizing our own self-destruction
if the government doesn't say
that these things can't just be profitable.
And I want people to move from this kind of like,
oh, it's kind of fun,
that technology is just a little bit addictive
and we should really talk about that and fix that.
It's like, no, no, no.
This is an information and trust meltdown.
We've democratized the capacity
for politicians to advertise
because now there's many more people advertising on Facebook.
I would just say it's not inherently bad
that political advertising is moving online.
What's bad is that the online that we've constructed is one in which every dollar creates more division.
And so as we move from having television-based ads to online ads, every single one of those dollars makes us more divided and unable to stand together.
It's like a nuclear power plant that just had a meltdown in your society.
But that nuclear power plant infrastructure was built into 2.7 billion people's societies, especially for people like Maria Ressa, who we had on the podcast earlier, in the Philippines, saying,
Filipinos spend the most time on the Internet,
even though the speed is so horrible.
We spend at least 10 hours a day on the Internet.
We embraced it, but we didn't realize that it would demand so much of us.
We had no voice in how it developed.
The irony, of course, is that the values, American values that you built it with, have been completely turned upside down, and it has been used by illiberal forces, people who want to control the information ecosystem.
It is definitely the case in countries like that, that it is the infrastructure of their society.
Right.
This sort of dark infrastructure.
It's every dollar in creates more division.
Every dollar in, more division.
Every dollar in, division out.
Division out.
In 2020, an estimated $2.8 billion, just in the U.S. alone, is going to be spent on political ads. So that billion-dollar number isn't a pie-in-the-sky, abstract billion dollars. This is going to happen this year. It's really, really frustrating because we know that this is coming. This is what I was trying
to tell members of Congress and senators is like, sure, we absolutely want to protect against the new
disinformation threats, the new deepfakes. But why don't we recognize the hypnotic spell that our society
has been under for the last six or seven years? You know, let's say YouTube perfectly got rid of all
the conspiracy theory issues and the outrage and the hate machine. Our society has still undergone,
everywhere around the world, two billion people, six to seven years of that attention maximizing
algorithm that basically said, let's tilt the world towards the outrage, the extreme stuff.
And so you wonder, why does the world just hate each other so much?
Well, it's obviously it's the other side. It's they're the ones at fault, right?
It's always the other side, Aza.
But that's sort of the point. It's not the other side. It's the pipes.
Yeah. It's not the people. It's the pipes.
It's not the people. It's the pipes. It's all the alliteration today.
Were there things that other people said that sort of got you angry, in the sense of, like, a myth where you're just like, ugh?
I love that question.
There was one Congress member who had asked,
well, haven't we always had, you know,
manipulation and marketing?
And, you know, I have a supermarket in my hometown.
All of us go to the grocery store.
When we're at the grocery store and you're in the checkout line,
you get all these things up there that they're trying to get you to buy.
They're not necessarily, or you could argue that they're impulse items.
But then again, you could also make the argument,
that when you get home, you say, geez, I wish I would have gotten that at the grocery store.
I wish I would have gotten these batteries or band-aids or whatever.
How do you differentiate between what's harmful and what is beneficial?
And, you know, there was an answer by one other member on the panel, but I kind of jumped in and said,
well, actually, there's something really, there's many things that are very different about this.
The first is that this is infrastructure we live by.
This is your, when you talk about children waking up in the morning and you have auto play,
you don't, that's not like the supermarket, where I occasionally go there and I just made some purchases, and I'm at the very end of it, and that's the one moment, the one little micro-situation of deception or marketing, which is okay. In this case, we have children who are, like, spending 10 hours a day. So imagine a supermarket where you're spending 10 hours a day, and you wake up in that supermarket. That's the degree of intimacy and sort of scope in our lives. That's the first thing. The second thing is the degree of asymmetry between the persuader and the persuadee. So in this case, you've got someone who knows a little bit more about marketing, who's arranging the shelf space so that things at the top are at eye level versus the bottom level. That's one very small amount of asymmetry. But in the case of technology, we have a supercomputer pointed at your brain, meaning, like, the Facebook news feed sitting there and using the vast resources of 2.7 billion people's behavior to calculate the perfect thing to show you next, and not discriminating about whether it's good for you, whether it's true, whether it's trustworthy, whether it's credible. And so it knows way more about your weaknesses than you know about
yourself. And the degree of asymmetry is far beyond anything we've experienced. And you want the federal
government to control that? There's just so many days. It feels like Groundhog Day.
Like, why are we still having this conversation? I mean, Aza, you and I feel like this is, we felt like this is just obvious. In the same way a tree is worth more dead, as lumber, than as a tree, a whale's worth more dead than alive. A human is worth more if they're addicted,
outraged, narcissistic, polarized, and disinformed than if they're a human being.
It's just that simple. Let's now talk about what we do about it. And by the way, the cost of that
isn't just that we're more narcissistic, disinformed, et cetera. It's that that's kind of a meltdown
of the fabric of society. I don't think that we need to sit here and keep talking about the
problem. I hope we don't have to. I really want to get to a world where we do something about it.
And one of the things we talked about in the hearing was we already have existing federal bodies,
legislative bodies, regulatory bodies, agencies to care about certain areas of society.
We're simply not applying those existing laws and jurisdictions.
We just need to apply the principles and jurisdictions and concerns of the physical world to the virtual world.
So one way to deal with this would be to create a new digital agency, something like an Office of Technology Assessment that just regulates all of the harms emitted by technology, the polarization pollution, the distraction pollution, the narcissism pollution.
But that's a lot of work for one brand new agency to regulate all of technology.
Are you kidding me? That's never going to work.
So what we proposed was, what if we ask each agency, we mandate a digital update?
And just have them ask the questions that then are forced upon the technology companies to use their resources to calculate, report back, set the goals for what they're going to do in the next quarter.
Thank you, Mr. Harris. I yield back.
So each one, Health and Human Services, Department of Education, National Institutes of Health, gets to say,
Hey, we care about, let's say, kids' mental health and addiction.
Okay.
Well, here's a fact.
Unlike, say, the alcohol industry or the tobacco industry, they don't know which of their
customers are addicted.
Cigarette industry puts their product on a store shelf.
They don't know exactly who's buying it.
They don't know which people who are buying it are the addicted ones, which will use it
in lightweight ways.
What's different about technology?
Well, they know exactly which users are using it because they have to log in.
Their customers are legible.
Their customers are legible.
There's the Lessig phrase.
Yeah, and the patterns of use and the harms are legible too.
Yeah.
So they actually can't claim that they don't know.
They know exactly how many teens between 10 and 14 years old are actually using it after two in the morning.
They could report back on a number there.
They know exactly how many people are addictedly checking more than 100 times a day.
They know exactly how many kids are posting photos and then deleting them within one minute because they didn't get enough likes because they can actually see that pattern of use.
And imagine that the NIH or whoever would have jurisdiction over that area could say, hey, Facebook, it's your job to report to me, because I care about that.
It's my area of purview, about how many people have that problem.
And then each quarter, your job, Facebook, is to reduce that number.
And every quarter, I have power and authority to make sure that you're doing that.
And we can fine you if you either lie about it or you're not making the reductions within some reasonable mode of effort or outcome.
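As a sketch of what that kind of reporting could look like, here are the two metrics named above computed over a hypothetical usage log, in TypeScript. The event shape, the after-2-a.m. window, and the 100-checks threshold are all assumptions for illustration, not any platform's real schema.

```typescript
// A minimal sketch of harm metrics a platform could compute from its own logs.
// The UsageEvent shape is hypothetical; real systems would differ.

interface UsageEvent {
  userId: string;
  age: number;        // known to the platform from account data
  timestamp: Date;
  action: "open" | "post" | "delete";
}

// How many 10-to-14-year-olds were active after two in the morning?
// (The 2-to-6 a.m. window is an arbitrary illustration.)
function lateNightYoungTeens(events: UsageEvent[]): number {
  const users = new Set<string>();
  for (const e of events) {
    const hour = e.timestamp.getHours();
    if (e.age >= 10 && e.age <= 14 && hour >= 2 && hour < 6) {
      users.add(e.userId);
    }
  }
  return users.size;
}

// How many users checked the app more than `threshold` times?
// (Assumes `events` covers a single day.)
function compulsiveCheckers(events: UsageEvent[], threshold = 100): number {
  const opens = new Map<string, number>();
  for (const e of events) {
    if (e.action === "open") {
      opens.set(e.userId, (opens.get(e.userId) ?? 0) + 1);
    }
  }
  return [...opens.values()].filter((n) => n > threshold).length;
}
```

A quarterly report to the relevant agency would then just be these numbers over time, with the mandate being to drive them down.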
There are actual legislative proposals to deal with some of these things. Yeah, I want to hear the fast list of what are the legislative proposals to deal with some of these things, Tristan. You know, Senator Markey has something called the KIDS Act, which is the Kids Internet Design Safety Act. It tackles some of the design features that are out of bounds, things like autoplay, quantifying the popularity of people, push alerts, comeback emails. Comeback emails, for those who don't know, like, here's a good test.
For the tech companies who, you know, are in the world of addiction and have those kinds of problems, the easiest thing they could do to prove that they're not trying to addict people, and that they're not digital drug lords, is they could stop user resurrection, or what are called comeback emails.
That's like, you know, you ever notice you go on vacation for a week, you come back after a week, and you have way more emails from Facebook or Instagram or YouTube than they normally send you?
That's because they were watching.
They're like, hmm, why did this user go offline?
Why aren't they using it very much?
I know, let's seduce them back like a drug dealer and get them coming back.
The easiest thing they could do to prove that they're not addictive-by-design companies is they could just not do comeback emails. There's also rules around commercialization and marketing and the amplification of harmful content, because YouTube for kids has been a real cesspool of this kind of really dangerous stuff.
I believe Senator Markey was also involved in the original Children's TV Act.
And this is just about bringing back a lot of those kinds of protections.
But for kids, there's also COPPA 2.0, which is the Children's Online Privacy Protection Act, bumping the underage threshold from 12 to 15 years old. Because underage use is a big problem; we talked to Jonathan Haidt about that.
It's really the younger teens who have, especially are vulnerable to instant, constant rating by other people.
The feeling of being rated by other people constantly is a horrific thing to expose young kids to.
And nothing that we signed up for.
I always have this analogy, which is imagine between you and your friends, you took bets to decide how much money your friends are going to make in the next year.
And you could also take puts against them.
I think you're going to lose your job this year.
I'm going to buy a bet against Tristan.
And just imagine how that would warp your friends group.
Like, it would really change the dynamics.
That's also happening to our children.
It's kind of, like, instant grading and quantification of exactly how much people like or dislike you.
I'm so glad that I did not grow up in that time now.
Yeah.
It's hard enough being a teenager, right?
I mean, it's hard enough negotiating your identity and feeling the self-worth and figuring out who you are.
To add this on top is just like a war zone.
It's like a psychological war zone.
If I was to ask a congressperson one question, I think it might be: imagine if you took a stance against Facebook, something that Facebook didn't like.
And you're coming up for re-election, and Facebook quietly downranks you.
Not a lot, but just enough that your opponent wins.
Would you even know?
I wouldn't know.
Yeah.
That's sort of the point is that the only thing stopping complete capture of our government is our good faith in the leadership of these private companies.
And that's an untenable place to remain for long.
Even China doesn't want to allow Facebook into the country because it says, no, we look at what
Facebook's doing around the world and it's kind of dismantling the fabric.
We're not going to let that in here.
I think about TikTok, China having a direct conduit to control and own the culture of our youth.
Right.
And if you just rewound the clock and you said, cool, actually China's going to subsidize and write
all the textbooks for our kids, we'd be like, hell no, we're not going to allow that.
But because it's an app that people are opting into, somehow it gets a free pass.
We do not have digital borders.
Well, exactly.
I was saying during the hearing, you know, it's like we've protected the physical border.
We left the digital border wide open.
Imagine a nuclear plant that said, we're not going to actually protect the nuclear plants from Russian cyber attacks.
Well, this is sort of like Facebook building the information infrastructure and not protecting it from any bad actors until that pressure is there.
We actually let them build the infrastructure for our children's development, which is TikTok.
Right.
It's like, you know, how they go to these countries and they build the ports, and then they get them in debt, and then they owe China. We would never let them do that in this country, at least. And yet we're letting it happen over and over again, and it's threatening the viability of the U.S. government and, like, our culture, as a culture that can outcompete closed authoritarian regimes. Because right now, an open society means a distracted, narcissistic, outraged, disinformed society, because we left the door wide open for manipulation and these business models, and a closed society is a protected one.
Yeah. What were the Congresspeople's responses?
Like I use this, this is really important.
I use this metaphor in briefing some of the members of Congress
that there's kind of like these micro-targeted zip code size Pearl Harbors
happening every single day.
But you don't know because it's this sort of silent weapon
that only hits those people.
It's like those military weapons.
You know, like there's these sonic weapons that the military has
where I can point it at you and I can actually turn up the volume
and it'll be a piercingly loud sound,
but only you will hear it.
And if I move five inches away to your right, you can't hear anything.
Can't hear anything.
That kind of weapon is essentially what Facebook micro-targeting is, right?
Except instead of doing it from a distance of 200 feet, I can do it from across the world.
And I can point this information weapon at 100 conspiracy theorists and say, I'm going to dose this information to you.
And then now I go walk to the representative of Nebraska or of Massachusetts.
And I say, by the way, Representative, did you know that Russia is targeting the U.S. military veterans in your zip code?
And they'd say, no, I never heard about that.
and then you tell them all this information.
They're like, that sounds kind of like a conspiracy theory
because I've never heard about that.
But it's because I can keep micro-targeting your people.
And Russia, this is a real example,
Russia's been found to be going after U.S. veteran military groups on Facebook
and sowing disenchantment about all the spending and all these pointless wars and all that. You could say, well, it's real.
These are actual real facts.
But the fact that there's all this manipulation going on
is not commonly known.
And the representatives don't know this is happening.
Well, who knows the surface area of where this is happening? Who has the best access to both monitor where the harm is happening
and where you would distribute the mass public awareness campaign? Yeah, it's the platforms themselves.
Right. So why wouldn't they do it? And one of the examples I think I gave in the hearing is,
you know, when there's a breach like Equifax and your data gets breached, there's law that
forces you to notify each person who was breached. But when there's an information operation that targets you as a veteran, and we later discover it and take down the accounts, Facebook doesn't go back to the people who had been affected and say, by the way, all these groups that you joined, this group and that group, and those posts and those posts, that was all an information operation.
And the very presence of that, simply showing up often and often and often, would first be a deterrent for Facebook because now they would have a reason to not want to have to admit to people how often this is happening.
And second, it would obviously create a kind of cultural immune system, where we would all be aware that this is happening much more often than we previously knew. Therefore, we wouldn't have to wait for representatives in the U.S. government in
Washington, D.C., to have to know about it because we would all know about it in a bottom-up
decentralized way, which is much more powerful. We have this concept of an ad blocker, which is a big, you know, whitelist or blacklist for things that I don't want to see. And I really wish there was essentially an ad blocker, a big whitelist or blacklist, for, you know, coordinated inauthentic behavior. I want people like Guillaume and Renée and all these brilliant researchers to be able to put into a centralized place the kind of content which is propaganda, and then have it run in my browser, run on my phone, and just block it, or at least highlight it in a different color, so that we don't have to rely on Facebook getting it all right.
Yeah. And that's really a good example of where a company like Apple is in the best position to do this, because they're kind of the fiduciary, the agent protecting us and the limits of our minds from this whole world of manipulation that's sitting out there. So in controlling the operating system and the apps and the notifications and the things that are coming in, you know, they can act as a better steward and fiduciary agent, asking not just, who wants my attention, but who's going to help with our values? And that's the relationship I'd love to see Apple step into.
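As a sketch of how that ad-blocker-style idea could work mechanically, here is a minimal client-side filter in TypeScript that checks each post's author against a shared, researcher-maintained list of flagged accounts, and annotates rather than hides. The list URL, data shapes, and field names are all hypothetical.

```typescript
// A minimal sketch of the "ad blocker for coordinated inauthentic behavior"
// idea. The endpoint and data shapes are hypothetical illustrations.

interface Post {
  authorId: string;
  text: string;
}

// Fetch a shared blocklist of flagged account IDs, analogous to the filter
// lists that ad blockers subscribe to (hypothetical URL and format).
async function fetchFlaggedAccounts(listUrl: string): Promise<Set<string>> {
  const res = await fetch(listUrl);
  const ids: string[] = await res.json();
  return new Set(ids);
}

// Annotate rather than delete: the "highlight it in a different color" option,
// so users keep the context of what was flagged.
function annotateFeed(
  posts: Post[],
  flagged: Set<string>
): Array<Post & { flaggedAsInauthentic: boolean }> {
  return posts.map((p) => ({
    ...p,
    flaggedAsInauthentic: flagged.has(p.authorId),
  }));
}

// Usage sketch (hypothetical list URL):
// const flagged = await fetchFlaggedAccounts("https://example.org/cib-list.json");
// const feed = annotateFeed(posts, flagged);
// The UI could then tint flagged posts instead of silently removing them.
```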
Yeah. Is there something, Tristan, that you wish you had said in the hearing that you did not?
Yeah. I mean, I opened by saying, I'm here because I'm really concerned. We're actually at the last turning point, kind of an event horizon, where we either
protect the foundations of our information and trust environment or we let it go away. And when I say
something extreme like we're having a trust meltdown, it sounds extreme, right? It sounds like an
exaggeration, and you're not being specific. And I didn't really get a chance in the hearing to
defend why that was true. So here's what I think is a good way to describe it. Back in the 1990s,
newspapers thought that they were in the truth business.
The product that they were selling was truth.
Craigslist comes along on the internet, and they realized, oh, crap, we weren't in the truth
business.
We were actually in the classifieds business.
And then suddenly they had to go through this whole rejiggering to say, okay, well,
how do we make this business model work again?
And they kind of leaned more on advertising.
Okay, so now they're leaning on advertising.
And that's their support.
Okay.
But they still believe, well, no, we're in the truth business.
We're selling truth.
Okay. But then Facebook and YouTube and Twitter come along, and they realized suddenly, oh, we weren't in the truth business. We were in the attention business. And, you know, whether it's the Wall Street Journal or Fox News or the New York Times, we're all having some kind of expensive process to generate what product? Ultimately, human attention. Right. Yeah. Okay. So now wait. Let's imagine you've got this big black box in front of you. Black box A. And let's say it's the Wall Street Journal. Inputs coming in. You've got a bunch of money. And then you've got to pay all these journalists and these editors. You've got to pay people to do fact checking. You've got to
interview witnesses. You've got to pay people to double-confirm things. You've got to pay people
to make moral decisions about what goes on the front page, and let's do the word counts and let's
update this and that. You know, you've got to pay advertising people to sort of say, well, what ads do
we want that are actually along with our values and how do we do this right? And, you know,
what's going to make a lot of money? Sure, absolutely. But let's also make some other decisions.
That's a lot of manpower, human power, effort going into producing what ultimate outcome? A few articles that generate a certain amount of human attention.
Okay, now let's look at how the tech industry, black box number two, comes along.
And they say, let's say this is Twitter.
And I don't mean this because I think Twitter diabolically thought this would be true,
but from a business perspective, this is kind of what happened.
Instead of paying all those journalists and human editors,
those $200,000-a-year, $100,000-a-year salaries to do that work,
what if we got each human being on Earth to be narcissistically posting their news, their commentary, and photos of their cats and dogs and breakfast, and they'll do that work for free.
We don't have to pay them because we just show their friends the number of followers they have,
and now they'll actually just addictively come back and want to get as much attention for themselves
as possible. But what's really happened is that they've become kind of like the information
or attention gig economy workers. They're like the Uber drivers, but they're driving around
attention for us. So essentially, we have become the gig economy workers in the attention economy.
That's right. Each of us is free, unpaid labor, doing the work for free.
So it's actually worse than an Uber driver because we're not getting paid.
That's right.
We're not even getting rated.
But we are getting rated, exactly, by our friends, not even for the quality of the effort
that we provide.
And so now you zoom out, okay, and you imagine that, let's say, Black Box A, the expensive
news creating process, used to be the process that generated our information environment,
right?
It went through some gatekeepers.
I'm not saying it's perfect, by the way.
I'm not trying to romanticize it and say it didn't have problems.
There's lots of problems.
There's narrative control, gatekeeping, unethical moral decision making, a whole bunch of stuff, power asymmetries, okay. But there was
a process that generated our information environment. But then imagine you zoom out and you do like
the Indiana Jones swap. And so instead of the first black box generating the information environment,
now you have the second black box generating our information environment. So now instead of
quality researched, investigated, reflective, thoughtful, historically contextualized
information. We have hot takes, cynical commentary, breaking news, all caps, outrage, because you
win for a totally different set of reasons when you're operating in the second black box.
And that becomes the basis of our entire information environment. So the problem statement is not
that we have these bad deepfakes or we have this bad disinformation. It's that we've replaced the
entire information ecology with bullshit. That is the thing I really want people to get when I say there's an information trust meltdown. But it's not,
it's actually, it's not that we've just replaced it with bullshit. Sometimes it is good
information coming from your friends. That's right. And so that makes it really hard, because we've entangled and intertwined the two. Simultaneous utopia and dystopia. All the voices that didn't used to get heard got to be heard, but all of the outrage, narcissism, conspiracy theories, toxicity also won. And
it's so intertwined. And if you look at it as a kind of zooming out to the population dynamics,
what percentage is the brilliant stuff? How many people are experiencing brilliant Twitter or beautiful
Twitter or loving Twitter or deep, reflective, amazingly better sense-making Twitter than we had
before versus how many people are experiencing angry Twitter, narcissism Twitter, outrage
Twitter, conspiracy Twitter, and especially polarization Twitter. At the very least, what we know
is that polarization wins in that world. And that is so dangerous. The polarization trend is the thing that I think is also getting traction on Capitol Hill.
Everyone's worried about it.
No one wants to live in a world where you can't talk to people.
And that's why we're doing this podcast, in my view. We can't just say, let's stop the new bad information. We actually have to wake up, like in the Matrix, popping up through enough layers. We actually have to go back in time and pop through, layer by layer by layer by layer.
We've been under this spell, this illusion of everybody hates each other.
We can't agree.
It's not true.
We've just been under like six and a half, seven years of automated attention.
And all it knows is Alex Jones tends to work really well.
So it gets 15 billion views.
So all of that, we've been simmering and marinating two billion people's minds, like in the pot.
And it's been marinating in there for like six years.
And now you pop up and you say, this is the real world.
Well, it's actually because this illusion became delusion.
Because the weird beliefs that we've been holding in our brains started showing up in votes, started showing up in not vaccinating our kids, started showing up in shootings because of hate speech on the internet being amplified.
And now we say that's normal.
Oh, those are just people.
That's just a mirror.
But it's not a mirror, as you say.
It's a fun house mirror.
But again, how do you reboot from this?
I really think it's about people seeing this process, clarity of understanding.
You can see the thing that's happened and you can stand above it saying, oh, this is why all that other stuff down there was running through my brain for everyone, all of us, me included, right?
Clarity, in some sense, is exactly like a breath of fresh air, in the sense that as you lose oxygen, you also lose the ability to know that you're losing oxygen. Your mind becomes fuzzy and vague, right? And clarity is like a breath of air where you actually get to see again and you have your full perceptual faculties. Exactly. And you have a limited amount of time in which to act before you lose it again. Right. That's really well said. I actually have
a question for our audience, which is: you'll have listened to the podcast, and if you watch the testimony, I'm super curious what questions we missed or what questions you would ask. Because this will not be the last time that Mr. Harris
goes to Washington and would love to, in a sense, crowdsource really stellar questions to get
to the heart of the problem. Send us your ideas on Twitter, or you can send us emails or even voice notes at undivided@humanetech.com. The other reason why I'm asking our audience for
questions is, I think it's in a sense on us as technologists not to repeat that meme of government's too slow or too stupid and doesn't get it, but rather to ask the question,
how can we upgrade the capacity of our government? Right. And so per, you know, E.O. Wilson,
we have paleolithic emotions, medieval institutions, and godlike technology. Humane technology
is we have to embrace our paleolithic emotions, upgrade our medieval institutions,
and have the wisdom to wield godlike technology. And the way that we upgrade medieval
institutions is we have to invest in them. And there's great organizations out there like
Tech Congress, which is actually placing technologists in government. And I've met many of them
and they're super talented. And there's a lot of people in government who do get it. But there's
also a lot of people in government. And so there needs to be a lot more education out there.
And so if you listeners are interested in that, there's a huge need for more talented
technologists to go into government and to help educate. It took me a long time to retrain my mind
because I've been in the startup game for so long that it'd be like, oh, here's a problem in the world, I bet the best way to solve it, the only way to solve it, is to make a product. I had to replace that with a different set of playbooks. And so if you're thinking about developing an app as your next
thing, why not think about finding a place, like in TechCongress, in Congress, in government,
that you can apply what we know about how technology actually works to the protection side.
I mean, I was really grateful to see this study by Upturn. You know, it's hard to do this work
and different academic departments and different groups. EU DisinfoLab, StratCom DC, Graphika, Harvard Kennedy School. I mean, there's like so many different nonprofit, civil society
groups that are doing the research that powers the kind of hearings and insights that we're trying
to spread. And just want to put a large thank you out there to all those groups, especially this
month upturn for this polarization study that I think is instrumental for people to see the
harms that are intrinsic to the business model. And the thanks I'd like to give is to Rebecca Lendell, actually, on our team. She and Randy Fernando in the last year have
really built the organizational capacity for our team. And it's gone from feeling like it's just
five of us just grinding. Flying by the seat of our pants. Yeah, to actually starting to have
enough resources that we can respond to emails, get back to all the people that we need to get
back to and tackle this problem. So it's, yeah, just a huge set of appreciations to Rebecca.
I love this habit of gratitude. And I also feel incredibly grateful to the two of them and that it
gives us the capacity to work much more with the whole set of people that are working in
this space. For those listening who work on these topics, it's not that we don't want to answer
your email, it's that we've just been so overloaded. And so we're really excited this coming year
to help build much broader coalitions and take this problem on with everyone holding our hands
together.
Your undivided attention is produced by the Center for Humane Technology.
Our executive producer is Dan Kedmey and our associate producer is Natalie Jones. Noor Al-Samarrai helped with fact-checking. Original music and sound design by Ryan and Hays Holladay.
And a special thanks to the whole Center for Humane Technology team for making this podcast possible.
We want to share a very special thanks to the generous lead supporters of our work at the Center for Humane Technology, including the Omidyar Network,
the Gerald Schwartz and Heather Reisman Foundation, the Patrick J. McGovern Foundation,
Evolve Foundation, Craig Newmark Philanthropies, and Knight Foundation, among many others.
Huge thanks from all of us.