Your Undivided Attention - The Dictator's Playbook — with Maria Ressa
Episode Date: November 5, 2019
Maria Ressa is arguably one of the bravest journalists working in the Philippines today. As co-founder and CEO of the media site Rappler, she has withstood death threats, multiple arrests and a rising tide of populist fury that she first saw on Facebook, in the form of a strange and jarring personal attack. Through her story, she reveals, play by play, how an aspiring strongman can use social media to spread falsehoods, sow confusion, intimidate critics and subvert democratic institutions. Nonetheless, she argues Silicon Valley can reverse these trends, and fast. First, tech companies must "wake up," she says, to the threats they've unleashed throughout the Global South. Second, they must recognize that social media is intrinsically designed to favor the strongman over the lone dissident and the propagandist over the truth-teller, which is why it has become the central tool in every aspiring dictator's playbook.
Transcript
I remembered getting 90, 9-0 hate messages per hour, and I went to Facebook, and I said, I think I need help here.
This is Maria Ressa, arguably one of the bravest journalists working in the Philippines today.
And they said, just go ahead and report it.
And I thought, 24 hours in a day, 90 per hour, even if it only takes me two minutes to report every single one, it is impossible.
It should not be my responsibility.
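The arithmetic behind that impossibility is worth making explicit. A quick sketch, using only the figures quoted in the episode (90 messages per hour, roughly two minutes to report each one):

```python
# Why manual reporting cannot keep up, using the figures quoted above:
# 90 hate messages per hour, ~2 minutes to report each one.

MESSAGES_PER_HOUR = 90
MINUTES_PER_REPORT = 2
HOURS_PER_DAY = 24

messages_per_day = MESSAGES_PER_HOUR * HOURS_PER_DAY
hours_of_reporting_needed = messages_per_day * MINUTES_PER_REPORT / 60

print(f"{messages_per_day} messages per day")
print(f"{hours_of_reporting_needed:.0f} hours of reporting needed per 24-hour day")
```

Reporting the messages one by one would take 72 hours of work every 24-hour day, three times the hours that exist, before doing any journalism at all.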
Now, Maria is not the type of person who shrinks from responsibility.
When a terrorist organization abducted her journalist colleagues, Maria herself negotiated for their release.
But for this particular threat, Maria didn't see how she or anyone in the Philippines, for that matter, could get a handle on it.
Those 90 threats lobbed her way every hour were so much bigger than her inbox and a sign of how quickly Facebook's influence had spread and taken hold in the Philippines.
No organization in the world has ever worked that way.
I opened the Jakarta Bureau for CNN, and it took me seven months of negotiations to be able to open that bureau for the interests of both CNN and the Indonesian government.
So I think fast growth, this exponential growth that tech has enabled, came with a cost.
We were the first to feel that cost.
Maria and the citizens of the Philippines are still feeling that cost today.
And if history is any guide, we will all pay the price.
Unless we act quickly to fight what Maria first saw on Facebook and then later watched spill onto the streets, into ballot boxes, and even into the highest chambers of government.
She's taken her arguments to the heart of Silicon Valley and worked directly with the inventors of social media, pointing out not only its flaws, but how it could be so much better.
She may be one of social media's most powerful critics.
She's less known for her underlying optimism and her fight to make these tools better.
I don't want Facebook, Twitter, YouTube.
I don't want them to go away.
They have to stop digging in their heels.
They need to jump in and fix it because they can.
Today on the show, Maria Ressa, founder of the media site Rappler, takes us down into the streets of Manila so that we can see clearly this global threat and think about how to reverse it.
I'm Tristan Harris, and I'm Aza Raskin, and this is Your Undivided Attention.
I graduated from school in 1986, when the words "people power" came out of the Philippines, and that fundamentally traveled, not just in Southeast Asia, but to the rest of the world, right?
It sparked these people power movements. And the beginning part of my career was exactly about that, covering these transitions of governments from authoritarian, one-man-style rule to democracy. And it is always chaotic. The pendulum swings wildly like this, right?
But if I pull out big picture in my career,
I covered Southeast Asia moving towards democracy.
And then the hard part is in the tail end of my career.
I'm starting to see it swing back.
And even worse, the accelerant is technology.
We have governments now all around the world that are coming together.
It's like a dictator's playbook.
It's in play.
And the technology is the accelerant that is giving them more power, because they have a scorched-earth policy and they don't care how they maintain power.
This has to be stopped.
You guys here in Silicon Valley have that power.
And I see such a reluctance to exercise the power.
Even though your decisions, the decisions of tech platforms, their values for free speech, have actually caused a lot of these problems.
I'm not saying I'm not for free speech, but what we've seen on the platforms is free speech used
for free exponential hate.
Yes, it incites hate.
And we also know that online violence leads to real world violence.
I mean, when has this stuff, when has this ever been okay?
It never has been.
So why would you want to create that world?
That's actually why I find your story so fascinating.
It's because it starts with this appreciating technology
as an accelerant for democratic forces
and the excitement of more democratic media.
And Rappler, which I'd love for you to explain more for the audience, as a sort of news site and a vehicle, as one of the sort of first social media news sites.
Yeah.
Rappler started with 12 people.
Within three years, we grew to almost 100 people.
Wow.
And the technology allowed us to, at that point,
become the third top online news site in the Philippines within a year and a half.
So you went from nothing to being the third top news site in the Philippines.
Yes.
Reaching, and the Philippines has about 100 million people. Now it's probably about 110 million people. And for the fourth year running, we Filipinos spend the most time on social media globally. According to the 2019 numbers of Hootsuite and We Are Social, Filipinos spend the most time on the internet. Even though the speed is so horrible, we spend at least 10 hours a day on the internet.
10 hours a day. I actually remember that when we met at a conference, the Stratcom conference,
in D.C., strategic communications, and, you know, lots of State Department people there and disinformation researchers there, and the graph. You stood out. Well, you stood out to me. I think we
spoke after each other. But I actually, I have the picture on my phone. I remember that the thing
I was most called to take a picture of in your presentation was this graph of time spent on social
media and that the Philippines was as the number one country. In fact, you've even said that the
Philippines is the Facebook nation because it has, what is it, 90-something percent?
97% of the population is on Facebook.
It's now 100%.
Now 100%.
And Facebook has actually said that we are ground zero.
I mean, of course, I felt like the canary in the coal mine, right?
Because I celebrated this.
I watched it.
I watched.
It's like oxygen to me.
So, like a titration experiment, I knew when something was slightly off.
And we felt it leading from the campaign of Duterte, of then-Mayor Duterte to the presidential
elections to after the presidential elections. And the weaponization of social media really didn't
happen until after he was elected president. And let's back up just to explain the concept
of free basics. Because I think this is really important for people to understand the impact of
social media in developing markets. Because what is free basics? Free basics means that when you get
your cell phone, Facebook is built in and it is free, right? You get an automatic, a curated version of
the internet immediately. Meaning that you can have, like, online
access. It's not just that the app is free, but now data is also free. Data is free, right? So for most
Filipinos who can't afford to pay for data and there's a lot, this opens a whole new world where
in the past a lot of people spent time on the internet because 10% of our population, 10 to 12%
live overseas. The largest revenue earner for the Philippines is overseas Filipino workers,
the remittances home. And so what they do is 70% of the population would access the internet
through internet cafes.
This is before free basics.
Once free basics came in, no need.
That's why everyone is on Facebook, right?
Right.
Free.
And so the key to free basics
is that Facebook makes it free.
And free basics was a distribution business strategy
for Facebook, correct?
So let's talk, it's Facebook and the Telcos.
And the Telcos, right?
The two, the duopoly that we had at that point, right?
Both telecommunications companies carry free basics for Facebook.
We weren't as smart as India, and we embraced it.
What happened in India?
In India, the population civil society pushed back against free basics,
and Mark Zuckerberg had to deal with it.
They don't have free basics in India.
So one way of looking at this is here comes an American company,
and it sort of takes over the public sphere,
but it doesn't take over any of the institutions that help moderate the public sphere.
And in fact, there's really no recourse for somebody inside of the Philippines
to now change the infrastructure that everyone is now communicating on.
So I think there are two main themes in what you just said,
which is the first is that the system that we wound up using.
It is a form of colonialism because it was given to us,
but really created for a Western audience with the values of a Western audience.
Obviously, the Philippine constitution is patterned after the United States', so we have similar values, which have never been reflected in the strength of our institutions.
So that's the first, is that we came in and we walked into this.
And for me, understand at the beginning, I thought it was incredible because nothing was moving in our government.
So what was the Philippines?
We have weak law enforcement, endemic corruption.
We have the same problems.
So to a degree, the technology that was brought in that was created here in Silicon Valley seemed like a lifesaver.
The second point, though, is that we embraced it, but we didn't realize that it would demand so much of us. We had no voice in how it developed. The irony, of course, is that the values, American values, that you built it with have been completely turned upside down, and it has been used by illiberal forces, people who want to control the information ecosystem. News journalists have two purposes: we distribute the news and we are the gatekeepers of facts. This is what we have agreed on. This is the social
compact, the social contract that we've had. And in taking over when Facebook became the world's
largest distributor of news, its algorithms left behind the gatekeeping functions. The responsibility
functions, the journalistic ethics standards. There is a notion of responsibility. There's a notion of
standards of practice. There's a notion of being a responsible gatekeeper, not just a gatekeeper,
but a responsible gatekeeper. And that you're held liable for that. And I think this is part of the
problem is that the illusion that was created here in Silicon Valley is that you can have this
intense, immense growth without any responsibility and that you can say, hey, it's someone else's
fault, but it isn't. In the end, you created this. And so as a news organization, the business normally
is headed by someone else because that's the push for growth. And the editorial, the gatekeeping
role is headed by the chief editor, whoever you want to call it, right? And it is always a clash
between the business and the purpose, the primary purpose. Whether social media platforms want
to admit it or not, they are now in charge of the public sphere. When you're distributing
information, which is what flows through the networks of Facebook,
you can't actually distinguish fact from fiction.
They don't want to have that responsibility.
But journalists have evolved over time to have this set, and it's very complex.
It takes a lot of experience to make these very, very difficult calls, and whatever call you make, you're responsible for that, right?
What's an example of some of that discernment and moral weight?
There isn't some easy calculated machine answer.
It's not a scorecard.
It's not pros and cons, utilitarian, net cost, benefit.
But give people a sense of the moral weight of some of those tricky decisions that you're making.
I'll give you two examples.
The first is there was a coup attempt in 2005 in the Philippines, an ongoing one.
The Philippines is the only place where the soldiers were gossiping so much that everybody knew a coup was going to happen the next day.
And I had a reporter.
So I was handling ABS-CBN, which is the largest news group.
I had a reporter embedded with the group that was going to mutiny.
against the government, rebel against the government.
It's the scout rangers.
And they were telling our reporter, turn on your cameras, and we will come out and start the coup.
And we were like, no, you come out first and we will turn on the cameras because you don't want to be the trigger for a coup.
Right.
Who makes that call?
So that's one.
That was a very easy one.
We decided not to.
And you know what?
It never happened.
it didn't actually happen.
These are the checks and balances, right?
So it's, when you're talking about real world consequences, there are a lot of minute-to-minute
decisions that determine what reality is going to become.
And that comes from subtle, like, world cultural knowledge about what does a kidnapping mean?
What are the kinds of events?
What are the trigger points?
What are the things that you have to be watchful for?
Let's imagine someone just says, oh, let's just put that on Facebook.
Or let's do it with WhatsApp and let's add a flashy title.
and then let's make it easy, so you can one-click share that to 25 other people without even hitting any other button.
If we think about an algorithm that's enabling that kind of spread, now you have something amplifying it through a recommendation system, it doesn't know what the meaning of the word kidnap is.
All it knows is there's this article that everyone seems to quote-unquote like, and they like it a lot, and they keep liking it and sharing it.
Right.
And without the knowledge of the dynamics of this culture, this society, this checkpoint, this military that you would need to call.
Those kinds, all that local knowledge is gone, and it's just left in the product of automated machines.
Wait, can I ask you a quick question on that?
Why can't the social media platforms, just when it hits a certain point, get an alert and have a human look at that?
Aza here. These are really serious issues, and we wanted to take a minute to really reflect on what Facebook should be doing.
It was just at Vanity Fair, where Sheryl Sandberg was interviewed by Katie Couric, and Sheryl said the most viral stuff, they fork it over to fact checkers. But, you know, Katie asked Sheryl this profound question.
She said, well, how many content moderators do you have?
She said 35,000 moderators.
That's up from, I think, 10,000.
They have 35,000 moderators for 2.7 billion users,
which is about a third of the population of planet Earth.
I mean, the math doesn't work out.
If you imagine a really big city, a metropolitan city like Los Angeles. How much do they spend on security, police, right?
Just trust and safety effectively.
They spend 25% of their budget on security.
And Facebook, in their just recent, you know, announcement, only a week ago in October 2019, Zuckerberg and Sheryl announced that they're spending more money on trust and safety than they did in all of their revenue when they went public as a company in their IPO.
Good spin.
So it's good spin.
It sounds like a lot of money, right?
They actually spend more on trust and safety than all of Twitter's revenue in a year. Okay. So that sounds like a lot. Sounds like they're doing everything they can, but you have to actually put it in context, because as a percentage of their revenue, they're only spending 6.5, I think, percent of their revenue on trust and safety. So they're
spending one fourth on safety compared to the city of Los Angeles. And if they're running a two billion
person city, is that enough? The one simple thing Facebook could do is why aren't they quadrupling the size
of their trust and safety budget? Because they're right now underspending by four times.
They're underspending by four times.
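Those proportions can be checked in a few lines. A rough sketch; all of the inputs are the figures quoted in this conversation (35,000 moderators, 2.7 billion users, roughly 25% of a big city's budget on security versus about 6.5% of Facebook's revenue on trust and safety), not audited numbers:

```python
# Facebook's moderation and safety proportions, using the figures quoted above.
# These are conversational estimates, not audited data.

users = 2_700_000_000          # roughly a third of the planet
moderators = 35_000            # the number Sandberg cited

users_per_moderator = users / moderators
print(f"~{users_per_moderator:,.0f} users per moderator")

city_safety_share = 0.25       # Los Angeles: ~25% of budget on security
facebook_safety_share = 0.065  # Facebook: ~6.5% of revenue on trust and safety

underspend_factor = city_safety_share / facebook_safety_share
print(f"Underspending by roughly {underspend_factor:.1f}x versus a big city")
```

That works out to one moderator for every ~77,000 users, and roughly the factor-of-four underspend described above.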
Maria's coming to Facebook and saying, like, I am getting hammered.
My country is getting hammered and you guys are doing nothing.
Right.
So what should Facebook do?
Like right now, today.
Yeah, yeah, yeah.
There's three things they could do.
They could turn off custom audiences for political ads.
You can't do micro-targeting.
They could turn off look-a-like models.
So you can't do: I found this group of vulnerable minorities, and here's 100 user IDs, and these are all the minority groups in the Philippines. And now I can do a look-alike model: give me a thousand more.
Like, that's a dangerous tool.
Let's turn off look-alike models.
The third thing is having them delete all third-party data that they've collected on users,
so everything they've bought from other sources to do that targeting.
I mean, one thing they could do, I think we were talking about this before, is just increase the friction that it takes to share, right, so that you have to really care about sharing something.
Because often, if you give the human brain the chance to catch up to its impulse,
Like, it'll do the right thing.
That's what our prefrontal cortex is all about.
It's helping us do the harder, righter thing.
And our technology should be creating space for us to do that.
And I know that there are people inside these companies who agree that the real place we went wrong
was on making one-click sharing the basis of the way that social media works,
which is basically you participate by playing in the attention fame lottery,
by playing in the attention casino.
And how much money would it take to make that implementation change?
It's basically two lines of JavaScript code.
Yeah, exactly.
So content moderation policies, it's one of the things that's extremely frustrating to me
because only a tech person could think like this.
You know, journalists wouldn't try to tell you how to take things down based on a list of things because you can't.
Well, this is the moral weight versus automation distinction.
Are we doing it with automated criteria or the kind of discernment, judgment, moral weight,
the heaviness of that process that a human mind has to weigh.
But my point there is that before you can go to atomizing it,
you had better have understood it first
and fed the machine a lot more information
and let people continue to do it. Because, sorry, and I know I'm getting upset, but content moderation without values: the guy who took down the napalm girl, Nick Ut's photo, he was a Filipino, right?
Could you explain that example?
People may not be aware of it.
So in Norway, there was a newspaper that posted as part of their story, the picture of the
Napalm girl.
The photographer was Nick Ut.
He's based in L.A., and it was taken down by Facebook because it was against content moderation
policies.
And what was it? Naked. The girl is naked. She's burning. And this woman survived that. This is iconic, this photograph.
And yet, because the only criterion is naked or not: naked, check, done, right?
So David Kaye, who's the UN Special Rapporteur for Freedom of Expression, he says, you know, well, why don't we use the UN Declaration of Human Rights as a framework for content moderation?
Because that means you need to move away from these types of lists and automation and making people act like machines because we don't.
And the systems in the real world that we've created have so many grays that need to be evaluated if you have real power.
And tech has real power.
It determines the reality we live in now.
And by pretending like it is a game
where you can make a list, atomize it,
this is why we have dystopia.
Yeah.
On the other side of that, it's, you know, the atomization of news, meaning that in your news feed on Facebook or on Twitter,
you'll see something truly horrific, followed by a cute video of a cat,
followed by a random post about somebody eating broccoli, back to something truly horrific.
And because they're all mixed together, like our brains just can't handle it,
and it just becomes normalized.
You shut it down.
You don't have time to feel the emotion that you're feeling right now.
Or do you take it seriously?
Because I think what people don't get, and I know this is so personal for you, Maria,
I get why it's emotional, but I think people need to understand a little bit better the human cost at the other end of this. So what's the cost of
this not being moderated appropriately? Let me first pull out macro and then go to micro.
We've now elected leaders who in many different parts of the world, if you go by Oxford
University's computational propaganda research project, just a few weeks ago, they said that
there are now 70 countries around the world where cheap armies on social media have pushed back democracy. So we have elected populist-style leaders who are using the democratic processes
and, again, using that to turn it upside down. So that's on the macro end. We have no meaning.
I just add one thing there, which is often you'll hear the phrase that social media companies,
their business model is advertising. That's not exactly right. It's that they've built a lever
to monetize changes to our beliefs and our behaviors and our attitudes. And they're getting
better and better and better at that. So it's not that these bad actors are abusing this system
in a way that was unintended. They're using it exactly as intended. Yeah, I agree. And I've evolved
over time, right, because we're still partners of Facebook. We're partners of every single social
media platform. But let's call it what it is. This is a behavioral modification system. And it isn't
just about looking at your patterns. It is meant to intervene at the right
point to change your behavior. And the people who have used it the most, it's our political leaders.
I mean, again, let's go back: Cambridge Analytica. Who are the guys behind it? Steve Bannon, funded by Robert and Rebekah Mercer. You know, these guys are deeply connected. Let me not walk into
American politics, but let's talk in the Philippines, right? The partner company in the Philippines
is also headed by a Duterte ally. So I think the last part of that is how do we prevent
humanity from unleashing the worst of humanity. It has been through things like the Bible,
Princeton's Code of Ethics, the principles, the Declaration of Independence of the United States,
these codified agreements of values that we have, right? Where is that for the Internet? I'm still
waiting for that. That's on the macro level. On the micro level, I know this firsthand. I'm just a
reporter. I lead teams in war zones. And I know how to protect my teams.
when the gunfire is coming from this side and it's coming from this side.
I know how to protect it.
But in this world today, this is far worse than any war zone you can be in because it is personal.
It comes to you when you wake up and before you go to sleep.
And everyone sees it, right?
And it is meant to, it's psychological warfare.
It's asymmetrical warfare.
A person cannot stand up against an onslaught of information operations.
Real world impact on me is this kind of, you know, this,
bottom-up social media astroturfing that is meant to tear down the reputation, the credibility
that I have built up as a journalist over the last 30 years. The messages that are seeded there are now mimicked by the government, President Duterte, top down, a year later.
It actually would be helpful for you to describe quickly astroturfing and cheap armies.
Sure. Astroturfing means you take the idea that Rappler is CIA. And in this one, it was a mass-base account, a woman who later became a member of the Duterte administration, campaigned for him, and is now back in the administration, right?
So she headed social media for the presidential palace.
She writes in her Facebook page, which has, you know, five million followers.
Rappler is CIA, question mark.
It's seeded.
It's the very first time that idea.
It's like fertilizer, right?
And then repeated over time, a lie said a million times becomes fact in social media.
This is what Goebbels said, who worked for Hitler, yeah.
Right.
Or Renee has this wonderful line, which is if you can make it trend, you make it true.
And here in the Philippines, there are great resources, a large amount of resources that are pushed onto this.
So then once this is astroturfed, think about it: it's fertile ground.
A year later, the same message comes out of President Duterte's mouth.
And he doesn't just say it at random, not even at a press conference.
It is his second State of the Nation address in July of 2017, where he accused Rappler of being owned by Americans.
And, you know, I was covering the event and I immediately tweeted, Mr. President, you're wrong.
But about a week later, we got the first subpoena for the lawsuits.
And then this is where the real world kicks in because the weaponization of social media was followed a year later by the weaponization of the law.
Within five months, in January 2018, the first case, the government tried to shut Rappler down.
And in a little over a year, we had 11 cases filed against me and Rappler.
I've had to post-bail eight times in a three-month period.
This year, I was arrested twice in a five-week period, including coming from, I'm getting on the same flight tonight, right?
Coming from San Francisco, I got off the plane the second time I was arrested.
I was picked up as soon as I got off the plane.
So.
At the Philippines airport?
Yes.
I didn't even get through immigration, you know.
So, yes, the real cost.
All of this is meant not just to harass, but to shut us up, because they've filed cases against, like, my board, for example, right?
A former president of IBM sits on my board.
He now has a case, a criminal case against him.
And when you file cases against boards, the business and the editorial, business wants to
shy away from these things, right?
editorial, we keep doing the stories.
We must because obviously the administration doesn't want the stories out.
But, you know, I joke for every case, I get a few awards.
I would give everything back in these awards to just have a functioning democracy.
We're just trying to do our jobs.
But it's the people, it's the journalists at the front line in the global south.
And, you know, when Mark Zuckerberg spoke in Congress in 2018, he said it would take five years, still thinking like it's the same paradigm, that
you need to have artificial intelligence to be able to do this. Hello, wake up. Because in the
global South, every day that no action is taken means someone dies. I don't want to go to jail.
But every day that no action is taken, those chances increase. You mentioned you got, you get
hate, sorry, 90 hate messages per hour. So I'd love for you to talk about, just for a moment,
what do we mean by that? Yeah, I mean, look, I was naive at the beginning because in July
2016, that was when we began to see the weaponization of social media start to happen. And now
coincidentally, that's when this brutal drug war of the Duterte administration began. The UN now says
that at least 27,000 people have been killed in this drug war since July 2016, right? That's an incredible number. But the first casualty in our war for truth and our
battle for truth is the number of people killed because the government has parsed it. And they
continue to threaten any journalist who tries to report what the real numbers are. So July 2016, the drug war happened, and I had one team, only one team, that was going out
every night. An average of eight people, eight bodies killed every night, right? So I knew
something was happening, something bad. And this is the government calling anyone they want to
a drug lord, a drug dealer, and going after them on the basis of that justification. It was like being on a list. You can be put on the list because your village chief doesn't like you, but here's a list that's given to the police, and they knock on your door
and people wind up dead. The police claim that these deaths are because people fired back,
even if they have no gun.
We were extremely naive and began on our Facebook page, hashtag no place for hate.
And that's when I began collecting data.
And I realized at that point that this is systematic.
And we started by looking at how 26 fake accounts, we proved, because I don't trust the machine,
we proved it in Excel sheets that these 26 fake accounts could actually influence up to 3 million other accounts.
So there's your exponential spread.
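The scale of that amplification is easy to underestimate. A rough illustration of the numbers Maria cites (26 fake accounts influencing up to 3 million others); the hop-and-branching model below is an assumption for illustration, not Rappler's actual methodology:

```python
# 26 fake accounts influencing up to 3,000,000 others, as documented above.
# The hop/branching model is illustrative, not Rappler's Excel analysis.

seed_accounts = 26
reached_accounts = 3_000_000

amplification = reached_accounts / seed_accounts
print(f"Each seed account reaches ~{amplification:,.0f} others")

# If content spreads over k hops with branching factor b, reach ~= seeds * b**k.
hops = 3
branching = amplification ** (1 / hops)
print(f"Only ~{branching:.0f} reshares per hop over {hops} hops are needed")
```

With three hops of resharing, fewer than fifty reshares per hop is enough to reach millions, which is why a handful of coordinated accounts can dominate a national conversation.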
And once I understood that, then we began looking.
We collected the data.
I gave all of this to Facebook in August of 2016 because I wanted to do a story because
this is huge, right?
And the three people I spoke with were alarmed, but nothing was done.
And I waited a month.
I waited a month and a half.
We came out with our story.
It's the propaganda war, weaponizing the Internet.
And it's a three-part series.
I wrote two of the three parts.
As soon as it was released, that's what triggered the 90 hate messages per hour.
So you release this story, this propaganda war story.
And this is after you were, because you didn't get a response from Facebook.
So there you are.
You report this to the three people working for Facebook in Singapore.
You don't get a response.
I think you said later that two out of the three left their jobs.
They're no longer there.
No longer there.
So you publish this propaganda story.
And then what happens?
That's when I got pounded on my own accounts, hate messages, 90 of them per hour.
How did I get to that number even, right?
Yeah, what's that like for you?
I mean, because 90 is just, first of all, just coming in, your phone's dinging.
I mean, or you're seeing, like, what are you seeing?
So at the beginning.
More than one ding a second, right?
Yes.
Yes.
It's insane.
Well, so when it was happening, I was watching it.
And at the beginning, I tried to respond to everyone, because I think those were the principles that Rappler was built on: you engage.
So, like, what kind of message would you get and what kind of response would you give?
I mean, some of them.
I tried to respond to the ones that weren't threatening me, to the ones that were half rational.
But in the end, no one was responding back.
It was really a pound, pound, pound, pound, pound, pound, right?
But they went from, like, you're horrible, you're wrong.
Those are the really tame ones to you should, to rape threats, to murder threats, to I know where you live.
I'm coming to get you, private messages, very coarse, very crass, hitting you in ways,
I suppose that they felt, you know, again, to me it was, it was alarming, but I think the impact
on me was, did I do the right thing? Is my data correct? I went back. So at one point I was
responding. You doubted whether the data was correct in the propaganda story, which is sort of the whole point of propaganda, to make you doubt, right?
So I did, and I'm pretty thick-skinned, right?
But I went back and I said, this is correct.
So I went.
And then when I kept trying to respond, I realized it was impossible to respond to it.
And then I just started counting.
How many times?
And then I took kind of the most creative ones.
One I remember was so funny.
It was like your mother should have swallowed the sperm or something.
It was like, it was gross, but it was, you know.
I mean, it's hard to deal with the fact that, you know, you have to touch the grossness. My managing editor actually said, you know, stop, get off social media, because
for two weeks it did have an effect on me. What was that effect? Yeah. It's like PTSD, I guess,
you know, because it makes you doubt yourself. So I was going back over it, going back over it.
This is hate spam. It's attention spam. It's epistemic spam. You know, one solution is you build a
spam filter, which is the direction we went with email: people still send spam, but we built the AI so smart that it actually detects it.
And I think they can certainly get better and there can be bigger investments there and that should be done.
But the other side is you put a cost to sending email.
It goes to Bill Gates' idea of an email tax back in the days of spam.
And if it costs you something, the real cost falls on the spammers, because they send so much email,
it would cost them a lot of money.
Hate messages tend to come from trolls who are sending lots of hate messages to lots of people.
And so for you to live on social media hating on the world would cost you
a lot of money. And the people who send just a few emails, it doesn't cost them as much money,
so they can do it. Now, you're not going to like it because it means that now it's not a totally
equal playing field and it's all free. But you have to ask, what world do I want to live in?
And I think saying, even if it's a virtual currency of, I have so much attention I can take
up in other people's lives, I'm going to allocate that much more carefully if I know that when
I share something, I'm going to be thoughtful about it. And you can imagine that if it ever does
get identified as hate speech, it should be three strikes you're out, right? I mean,
there's ways of creating counter incentives that at least create, you know, there's shadow
banning, there's holding tanks, there's like temporary bans, there's ejections from communities.
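The counter-incentives discussed here, a per-message cost plus a "three strikes" rule, could be sketched as a toy model. Everything below (class names, budget sizes, strike thresholds) is illustrative assumption, not any platform's actual mechanism:

```python
# Toy sketch of the "cost to send" idea: each account has a fixed
# attention budget, every message spends from it, and confirmed hate
# speech triggers a three-strikes ban. All numbers are illustrative.

class Account:
    def __init__(self, budget=100):
        self.budget = budget   # attention credits per period
        self.strikes = 0
        self.banned = False

    def can_send(self, cost=1):
        return not self.banned and self.budget >= cost

    def send(self, cost=1):
        # Sending spends from the budget; returns whether it went through.
        if not self.can_send(cost):
            return False
        self.budget -= cost
        return True

    def flag_hate_speech(self):
        # "Three strikes, you're out."
        self.strikes += 1
        if self.strikes >= 3:
            self.banned = True

# A troll blasting 90 messages an hour burns through the budget fast,
# while an ordinary user sending a handful barely notices the cost.
troll, ordinary = Account(budget=100), Account(budget=100)
sent = sum(troll.send() for _ in range(90))
print(sent, troll.budget)       # prints: 90 10
ordinary.send(); ordinary.send()
print(ordinary.budget)          # prints: 98
for _ in range(3):
    troll.flag_hate_speech()
print(troll.banned)             # prints: True
```

The point of the sketch is the asymmetry: the cost is negligible for normal use but ruinous at troll volume, which is exactly the property the email-tax idea was after.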
I think the main thing is it shouldn't come from algorithms.
And so the real question for our time is how do we scale human judgment and how do we keep
human judgment local to human situations?
One of my favorite examples of taking a small group of people and scaling them up for
moderation.
For moral judgment.
For moral judgment, comes from the Huffington Post,
where they had this system. I actually don't know why they don't have it anymore.
So, Huffington Post had these paid moderators, right?
These ones who were like the super moderators, the ones who were on their staff.
And they were moderating, oh, that's not good, that's not good.
This is good. This is good.
And then what happened is they had all these users, just the readers.
And so as the readers were marking what they liked and didn't like, they built this clever system,
where the readers who ended up making the same moral judgments as Huffington Post paid staff
invisibly got promoted to basically be...
They'd get badges. They'd get badges.
that would go from, like, I'm a level one moderator, I'm a level two moderator, and eventually,
you know, you hit level three, and all of a sudden, your choices as a reader were as powerful.
Like, you could flag and you could take down comments, but you never knew whether what you were doing
actually matched what the paid staff were doing.
And if you built up your reputation and
then you started to abuse it?
Well, it would just automatically notice that, statistically, you're doing things the paid staff
was no longer doing, and so those powers would be revoked.
This is just such a brilliant way of taking a small group.
of people and their values and scaling it to a community judgment. Yeah. It's a great
example: essentially the Huffington Post values, which they're imposing top-down, saying these
are the values we want and what we want to be on our platform and what we don't want on our
platform. But then finding organically, you know, which people are mirroring the values of the
people who are at the top saying, oh, let's just have the system that automatically promotes them
when they align and automatically disempowers them when they start to go the other way. And it's a way
that, like, you can imagine if there were many HuffPosts, right? And it's sort of like there are many
Facebook groups that each community can come up with its own values and then have a
community that jumps on board and enforces them with very little overhead.
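The promotion mechanism described above can be sketched in a few lines: score each reader's recent verdicts against the paid staff's, promote when agreement stays high, revoke when it drops. The class name, thresholds, and window size below are illustrative assumptions, not HuffPost's actual implementation:

```python
# Hypothetical sketch of a reputation-based moderation system: readers
# whose verdicts statistically match the paid staff's get promoted
# through moderator levels; diverging verdicts revoke those powers.

from collections import deque

class ReaderModerator:
    # Agreement rate needed for each moderator level (illustrative).
    LEVELS = {0.0: 0, 0.7: 1, 0.8: 2, 0.9: 3}

    def __init__(self, window=20):
        # Only the most recent verdicts count, so reputation can decay.
        self.history = deque(maxlen=window)

    def record(self, reader_verdict, staff_verdict):
        self.history.append(reader_verdict == staff_verdict)

    @property
    def agreement(self):
        return sum(self.history) / len(self.history) if self.history else 0.0

    @property
    def level(self):
        # Highest level whose agreement threshold is currently met.
        return max(lvl for thr, lvl in self.LEVELS.items()
                   if self.agreement >= thr)

reader = ReaderModerator()
for _ in range(20):
    reader.record("remove", "remove")   # consistently matches staff
print(reader.level)                     # prints: 3

for _ in range(10):
    reader.record("keep", "remove")     # starts diverging from staff
print(reader.level)                     # prints: 0
```

Because promotion and demotion both fall out of the same rolling statistic, no one has to manually police the moderators: abuse shows up as divergence and the powers revoke themselves, which is the "very little overhead" property being described.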
My reporters were attacked in the same way, one who was just in her mid-20s at that point, right?
She was getting hammered as much as I was because she is the presidential reporter.
She was following President Duterte.
And shortly after that, a few months after that, because she also continued
to report and ask tough questions of the president, we were banned from palace coverage.
She and I, even though I don't cover the palace. President Duterte said it was his decision.
That ban went to every single Rappler reporter, to every single private event that the president
is part of. We've now challenged this at the Supreme Court, and we've had other news groups,
other journalists, join the challenge at the Supreme Court. But again, is the judiciary
captured? This is a question I ask myself all the time, because by the end of President
Duterte's term, in two and a half years' time, he will have appointed 13 of 15 Supreme
Court justices in the Philippines. So this is why we wanted to have you here. Your message is
not heard by the... we're just talking about the Philippines, and we haven't
even gotten to the full Global South. I mean, the point is that, you know, we've had other people
on this podcast, Guillaume, talking about YouTube recommendations and Renee. And, you know, we talk
about the Western countries. Oh my God, the Yellow Vest movement being amplified by Russian trolls
and these sort of Western examples. But then for each of these Western examples, we know well,
the anti-vaccine conspiracy movement, the 9-11 conspiracy, Sandy Hook, Alex Jones. These are the ones we
hear about because we have Western media that covers it, because we live here and it reaches the
umwelt of people living in Silicon Valley. But then we've just created this world where we're
impacting a bigger world than we have the awareness to see. It's irresponsible to
have power that causes harm beyond the scope at which you become aware of it. If I hurt you by saying
something that hurts you right now, looking face to face at Maria here in the studio, my nervous
system, my evolutionary nervous system, makes me feel what I'm doing to you. When you teared up a second
ago, I teared up, right, because it affects me. That's built into my nervous system. There's a
closed loop between what you feel and what I feel. If I impact 100 million people in a country I
have never visited, I can't feel that. Right. And it's easy to believe
that we can just keep doing what we're doing.
And so here we are.
You've just come from a week in Silicon Valley.
You know, we have a lot of people from policymakers and media
and people who are product managers, designers, leaders at these technology companies.
What is the thing you want people to know and understand?
I think to the tech folks, right?
It's not us.
It's them.
It's not us.
It's them.
And by them, you mean the users?
Someone else, right?
It's like it's a bad actor.
Yes, yes.
Yes, yes.
It's the bad apples, not us.
Yes.
Instead of we are a bad apple factory.
Exactly.
They try to make it be, it's a bug.
It's a reason.
It's because we didn't do this one thing.
But it's the system, right?
So the first is if you take responsibility, you can fix it.
If you're in denial, the longer you're in denial, the worse it will get.
And the worse it eats at you when you go to sleep at night.
Yes.
Have you noticed any, like, markers of that kind of denial, or, like, the responses you get?
Yeah, some responses we often get are: it's a neutral platform, or net-net it's good.
Or we're doing a lot of good things.
One, it's not neutral.
There is no neutral platform, because the systems that created it have values, right?
And those values actually didn't fit the rest of the world.
Early this month, I cornered the Cambridge Analytica whistleblower Chris Wylie, and I asked him the role that the Philippines played.
Because obviously our data is something I hold on to a lot, right?
And he said that the Philippines was the petri dish for Cambridge Analytica.
And he said that even before Cambridge Analytica, SCL, the parent company, had been operating in the Philippines.
It makes sense.
Large population, large adoption of social media.
And what did they do?
What was the way in which it was the petri dish?
He said that in countries like the Philippines and others in the global south, this is where they experiment with tactics, tactics of manipulation.
If it doesn't work, there's no problem because there are few regulations, right?
But if it does work, they then take those same tactics.
And his word was they port it over to the United States and to the West.
We're your testing ground: where we are, you will be.
And the Philippines, I focus on that because even during the days of Yahoo, digital products were first tested in the Philippines.
We're an English-speaking nation of 100 million people.
So we're a little test case for you.
If you don't do anything drastic, we really are your dystopian future.
I think you're getting there already anyway.
But your institutions are just stronger than ours.
In the Global South, and this is something Wiley had said, right, they tested all of these things.
And he said that they didn't actually demand payment from the leaders they helped elect.
It was after they were elected,
the contracts that they got, right? Because then they have far more money to be able to give.
So it's a freemium business model is what you're saying. Yeah, a little bit, right?
Well, and I think one thing people don't understand is actually how inexpensive it is to run one of
these campaigns. This is not even that expensive. So this is where it's like Facebook is an arms
dealer for manipulation and psychological warfare. And its incentive is to drop the price to be as
cheap as possible, to enable mass chaos. Now, okay, we've taken people down a really dark,
deep road, which I'm fine with because it's the reality. But I really want to get to...
How do we get out of it? How do we get out of it? Maria, I'm actually kind of curious how you
cope with this, because I think I've learned to create psychological barriers in myself.
It's so hard to actually take this in. And I'm just curious how you deal with it.
So first, there's Rappler.
We're about 100 people and we're holding the line.
My job is to hold up the sky so that my team can continue doing its work.
We just finished a seven-part impunity series in the Drug War that won an award at the Global Investigative Journalism Network.
So we keep doing our job.
That's one.
Two, I know how fast Facebook pivoted to mobile.
They did it in two years' time.
And I was hoping...
Because their business interests were on the line.
Exactly, right?
Well, in 2019, Facebook is aiming for a $69 billion income, right?
So pivot, man, because I think that, you know, we can still recover.
The destruction has happened.
But Joseph Schumpeter said it, creative destruction.
Let's create it.
And let's do it purposefully.
I do have faith in the smarts that are there.
It's just please keep doing what you're doing, Tristan,
because you guys have to convince them of the reality we're all living through.
It is dystopia.
And if they don't do anything substantive, it will take us down an irreparable path.
This reminds me of our interview with Renée DiResta, where she said, you know, the global public square.
What Facebook says, well, we're hosting the global public square.
It's like, when's the last time you ever heard of the global public square?
It doesn't exist.
We don't have such a thing like that.
We imagine a dinner table with two billion guests at the table, and everyone's trying to actually get a turn to speak.
How do you think that's going to turn out?
I mean, it's sort of self-evident.
Yeah, throwing food at each other.
Food fight.
Yeah.
And we have to start questioning certain presuppositions like does everyone deserve broadcast capability?
And, if you are giving people increasing levels of broadcast capability, how it scales with increasing responsibility.
Because the problem right now is, you know, you can say anything and lie, and there's no responsibility
for being wrong. So if you think of technology as this global super brain that connects all of
our thoughts and posts and messages and stuff together, Twitter and Facebook right now are all
just the excitatory neurons. It's just crazy spirals of positive feedback at different degrees
of positive feedback, which of course it's going to turn into chaos and insanity.
One of the things I think about all the time is that scale itself does not scale, and that if
you sign yourself up to be the single point of failure, then you're going to have every part of the
world trying to make that thing fail in the direction that they want. Right. And so it is an
inherently brittle solution. And instead you have to move to this kind of fractal solution with lots
of different communities, with lots of different norms, because that means it's just a much
harder system to take over, because you have to do it differently in each different community.
The second thing that keeps me really motivated is I'm learning, because it does get me excited.
We're building a tech platform.
We're building a tech platform that takes that idea of building communities of action
because most of the civic engagement groups don't scale when they have a platform.
NationBuilder, Countable.
You look at these things, right?
We have millions of unique users, tens of millions,
and we want to be able to build communities of action.
So now we're six sprints into a 10-sprint build,
and we'll be rolling it out early next year.
What happens when journalists are in charge of some of those tech decisions?
It's kind of interesting.
I think that the tech platforms have to realize that the public sphere is now theirs to protect, to grow, to pollute, right?
And that if they jump in now, it is still salvageable.
There are many things that can be done.
If a little journalist, if a little group in the Philippines like us can try to do this with no resources, you know, they can.
And so I still have optimism.
And look, finally, I have no choice but to be optimistic because we use the battle cry, hold the line.
Hold the line because I'm not against government.
I am for making sure that we have the rights guaranteed by the Constitution.
We will hold the line.
So I think, you know, the two biggest battles are climate and truth, right?
But more than climate: you can't have action on climate if your information ecosystem is broken.
That's why we work on what we're doing.
It's true.
My personal biggest concern is climate,
and it's the reason that we work on this issue
is that our attention and our epistemic commons,
our truth, is the basis for all action.
And we have to protect that.
And so I hope people take seriously your call to action
and I think just recognize what is actually worth doing right now.
We want everyone
to be the moral conscience of the world,
and everyone to have their
own sense of responsibility.
If everyone walks around
with that level of responsibility,
the world gets better
because it's a distributed system.
It's a complex system.
And so you can't do it in a top-down way.
It has to be through a massive responsibility movement.
And I think that's the only thing to really ask for.
And the weird thing is as much as that might feel hopeless
when you say, okay, well, if I do that,
well, what's the big deal?
Like, I'm just going to change this one thing
in my life, is that going to do anything? And it's like, well, if you talk about it and that inspires
10 other people to take responsibility from where they are, you won't be able to see it.
We get depressed every now and then around our work. I don't get the sense or I don't know
how much this is actually making a difference. If this podcast is making a difference, if the speaking,
if the work, if the, you know, the briefings that we do make a difference. But you hear in invisible
ways that those seeds are being sown and it does make a difference. So I think it makes a difference.
I think, obviously, right.
So I think, at no other point in time... I became a journalist because of the power of journalism.
And that power. Spider-Man: with great power comes great responsibility.
You use this in your presentation, I think.
Right now, it's with great power comes no responsibility.
Right.
Nothing comes for free.
And I think this is it, right?
Things that Silicon Valley thought this great growth would happen, there's a cost to it, right?
And the cost is the world.
At no other point in time have I
seen that information is power. That adage, this time proves it. And I think that your knowledge
of technology, your audience, the inherent audience that you have here, they're the ones
who can do something about this. And I hope we don't wait until a global Bretton Woods or
Declaration of Human Rights, because by that point, it's too late for me. You know what I mean?
Like, I hope that's, and I guess you can always say, you know, this is existential, not just for me and Rappler.
It's existential for journalism, because we're attacked on two fronts: the business and the credibility.
So anyway, I still have hope.
And we must have hope because so much was done by so few here in Silicon Valley.
You need to now pull up and say, what are the values that drive us?
That's the most important thing.
And then, you know, do something
in each of these platforms.
That's part of the reason I was here.
This is it. Like, you can't just do
an incremental, this will be
better because we'll make
this one small switch in a product.
It's not that. This is
look again at the world
today. You think you can
just make it be about advertising.
You've been gamed. You've taken
the rest of the world down this path.
How do you fix it? Because if you don't
fix it, then regulations will come in and it's starting already. But when we bring in the lawyers
and the people inside, I'm hoping there's a movement inside tech, which you're starting,
which you're doing, right? Please. That's our hope and what we've seen is the strongest thing
that's changing the industry is people on the inside, creating this sense of responsibility
because you can make the economic argument to a Facebook or a YouTube: it's much more expensive
to replace a disenchanted person
than to keep a newly motivated set of people
who want to correct all these problems.
Maria, thank you so much for coming today.
It's honestly, it's such a privilege.
I admire you and your work so much,
and so it means a lot that you're here.
Yeah, it's truly, it's truly humbling.
No, guys, thank you for having me.
And, you know, I look at your work also.
Please come to the Philippines.
Your undivided attention is produced by the Center for Humane Technology.
Our executive producer is Dan Kedmi.
Our associate producer is Natalie Jones.
Nora al-Samurai helped with fact-checking.
Original music and sound design by Ryan and Hayes Holiday.
Special thanks to Abby Hall, Brooke Clinton, Randy Fernando,
Colleen Hakeas, Rebecca Lendell, David Chay,
and the whole Center for Humane Technology team for making this podcast possible.
We want to share a very special thanks to the generous lead supporters
of our work at the Center for Humane Technology.
including the Omidyar Network, the Gerald Schwartz and Heather Reisman Foundation,
the Patrick J. McGovern Foundation, Evolve Foundation, Craig Newmark Philanthropies,
and Knight Foundation, among many others. A huge thanks from all of us.