Endgame with Gita Wirjawan - Maria Ressa: Diktator yang Sebenarnya
Episode Date: March 28, 2024

A study by the V-Dem Institute revealed that by 2023, 71% of the world's population was living under authoritarian regimes, up from 48% just a decade earlier, in 2013. According to senior journalist and 2021 Nobel Peace Prize laureate Maria Ressa, this trend is not a coincidence. She believes that leaders in information technology must be held accountable, because the exploitation of lies, hatred, and shallow thinking on social media is at the core of today's democratic crises. Maria is the founder of Rappler, an independent news channel established in 2012 in the Philippines to champion press freedom as a cornerstone of democracy. Among her notable works are "How to Stand Up to a Dictator: The Fight for Our Future" (2022), "From Bin Laden to Facebook: 10 Days of Abduction, 10 Years of Terrorism" (2013), and "Seeds of Terror: An Eyewitness Account of Al-Qaeda's Newest Center" (2003). #Endgame #GitaWirjawan #MariaRessa

Get Maria's book from Periplus bookstore here: https://www.periplus.com/p/9780753559208

Understand this episode better: https://sgpp.me/eps180notes

Earn a Master of Public Policy degree and be Indonesia's future narrator. More info: admissions@sgpp.ac.id | https://admissions.sgpp.ac.id | https://wa.me/628111522504

Visit and subscribe: @SGPPIndonesia
Transcript
I stand before you a representative of every journalist around the world.
Who is forced to sacrifice so much to hold the line.
To stay true to our values and mission, to bring you the truth and hold power to account.
This is the tipping point, Gita.
Maria Ressa.
Journalist and president of Rappler,
the Philippines' top digital news site.
The first Filipino winner of the Nobel Peace Prize in 2021.
She was the first journalist to be honored with the Peace Prize since 1935.
What is happening to us is coming for you.
We now have Marcos Jr. as our president in the Philippines.
Indonesia, for all intents and purposes, has our dear friend, now President Prabowo, right?
You cannot have integrity of elections if you don't have integrity of facts, if you don't have a shared reality.
All of that has already happened.
But really, the more powerful dictator is Mark Zuckerberg of Facebook.
The minute you make the tech companies responsible for what they are feeding us, then it will stop.
Right now they're not.
It's absolute impunity.
In the short term, it is just us.
It's hand-to-hand combat.
Hi, today we're honored to have Maria Ressa, who is a Nobel laureate.
She won the Nobel Prize in 2021, and she's also been the leader of Rappler.
Maria, thank you so much for gracing our show, endgame.
Thanks for having me.
I can't help but say, Pak Gita.
But it's good to see you.
Cut out the Pak.
Look, spend a few minutes talking about how you grew up.
Being born in the Philippines, went all the way to Jersey, went to Princeton, and all the way to, you know, where you are today.
Please.
You know, I like calling myself a boundary spanner.
And this is, I learned this while I was in Indonesia, you know, from social network analysis.
A boundary spanner is part of different, many different groups.
And for me, it was, I was born in the Philippines.
I grew up in Tom's River, New Jersey.
So in the United States, my family left shortly after martial law was declared in the 70s.
And then I landed and I could barely speak English.
And then in 1986, during the People Power Revolution, that year I went back.
I had graduated college and went back to the Philippines to try to figure out who I was.
I also then in 1995 opened the Jakarta Bureau where I learned about Indonesia, which is an incredibly diverse country, really the lynchpin of Southeast Asia.
And I love that saying, you know, you think you know something, but you don't know until the first year in Indonesia you feel like, okay, this is familiar.
I know this.
and then the more you know, the more you realize you don't know.
So what I mean, so many.
I turned 60.
So I'm, you know, I'm in a good place.
And yet the world is on fire.
I think this is what I struggle with.
This is my 38th year as a journalist.
So I began in the Philippines in 1986.
I opened the Manila Bureau there
and then went on to open the Jakarta Bureau of CNN.
At the time when CNN was still called Chicken Noodle News,
I was with the network for almost two decades, the best of times.
It was the most wonderful way to learn countries, cultures, political systems, languages,
and to understand that people are really the same across,
Indonesia was very familiar to me.
It took me a little while to set up the Jakarta Bureau.
It took me a while to learn Bahasa Indonesia.
But that was my mooring, my anchor, Southeast Asia.
And I would say in the beginning part of my career,
I covered the shift of Southeast Asia from authoritarian one-man rule.
So the pendulum was released and then the pendulum swung. So I covered, you know,
all of this, 1998, the fall of Suharto. I remember when he was on the plane. I remember I was there
when the riots happened in Jakarta. I was there. Then it was like opening Pandora's
box. I mean, I was at Trisakti when the students
were shot. And then Pandora's box opened and every
week I would be traveling from one city to another city
and it felt like the violence that had been suppressed
under almost 32 years of Suharto really just popped out.
And then so much change, right? Like, if you remember
that time, at first it felt like every year we had a new president in Indonesia. And,
you know, I think even for the Indonesians, at a certain point, it was too much change.
And then there was a nostalgia.
And that I learned from both the Philippines and Indonesia.
So the pendulum, I think the beginning part of my career, the pendulum swung to democracy.
And ironically, at the end part of my career, I begin to see the pendulum swinging back.
And, you know, we can talk about all the many reasons for that.
But I think that, you know, you asked me to talk about where I come from.
I think we're made up of our values.
And our values depend on our, on how we grew up, on our culture.
I think Southeast Asia is not as radically individualistic as the United States, for example, or the West.
Right.
We place a lot of, we suppress individual needs
for the greater good. This is something I've seen all throughout our families, our regions.
So, yeah, I'm a hybrid. Maybe like you, Pak. I mean, you know, you could have
passed as an Indonesian. The first time I saw you, I thought you were Indonesian. So did many Filipinos.
Yeah, yeah. But let's jump forward a little bit. I mean, you've written three beautiful books,
right? And to what extent would you attribute that literary experience to some of the most difficult
times that you had to go through as a journalist? I'm sure you feel the same way, you know,
having been in government. But I use this phrase all the time, and when
the violence starts I'm careful with it, but you know, what doesn't kill you
makes you stronger. And the three books, what they have in common are some of the
worst times that turn out to be some of the best times, right? It's funny. The hardest personally for
me has really been the last few years, starting in 2016 here in the Philippines when my government,
then-President Rodrigo Duterte began to attack me personally.
and Rackler, the company I had formed.
And then I realized that, you know, you don't really know who you are
until you're forced to stand up for it.
My first book was called Seeds of Terror,
and I wrote it very much like a reporter documenting
what we had lived through in Indonesia.
What an incredible country, right?
Like, this is the linchpin of Southeast Asia,
incredibly diverse. It's deceptively familiar, much like the Philippines is deceptively familiar to an
American. The Philippines is a much easier fit for a westerner. You know, you can kind of get through.
Indonesia, a little bit harder. But Seeds of Terror was about al-Qaeda's network in Southeast Asia,
how it co-opted.
So this virulent ideology of al-Qaeda,
which distorted phrases like
an attack on one is an attack on all,
distorted the phrase jihad, right?
A jihad in Islam, you know, is a personal battle.
It's like in Christianity, in Catholicism,
it is your battle for your conscience,
the battle to do good.
The jihad in Islam was about the internal battle
to do good.
And yet it was distorted and used to co-opt homegrown groups in our part of the world,
in Southeast Asia, to recruit them into a global jihad.
And really the flashpoint was Afghanistan, the Afghan training camps.
So that was what the first book was about.
Jemaah Islamiyah had bases.
It was funny because the training camps were in the Philippines and part of Indonesia,
but the theater of operations was largely in Indonesia and then much later in the Philippines.
But it spread through Singapore.
They had plots in Singapore that didn't happen.
They had a cell, a fundraising cell in Australia.
So you can see Mantiqi 1 through 4.
The second book was an extension of really my work as a journalist.
I became a journalist because I had long known that
information is power. If you don't have the right information, you cannot take the right steps forward.
And it's part of the privilege of being a journalist, whether it was in the United States,
in the Philippines, or in Indonesia, or in India, right? Like so many different countries around the
world, and you realize that what holds us together is our shared reality, the information
that we all agree to. And it's a common thing, because in the end, government
is just a way of organizing our societies, right?
So it doesn't matter what kind of government you have,
but you agree, for example, that a red light means stop.
That's simple.
And that if you cross that, that that is against the law, right?
So this is kind of a very basic idea of rule of law.
My second book was from Bin Laden to Facebook.
And it took forward, sorry, I jumped in, but it took forward the idea
of how the virulent ideology of terrorism spreads through societies.
And in Indonesia, I learned person to person.
But there were already the beginnings of the Internet, but not really.
It was really person to person.
And I met a lot of the, in quotes, terrorists, right?
Abu Bakar Ba'asyir, whose phrase I love, very much like bin Laden.
He said, you know, I make many knives and I sell many knives,
but I'm not responsible for what happens to them.
You know, this was the Bali bombing, Amrozi, the smiling Bali bomber.
I mean, so many things there.
So from bin Laden to Facebook, took it into the Philippines
when the very first YouTube video of a Filipino speaking in Arabic,
telling everyone in the world to come to the Philippines for jihad was released on YouTube.
Right, and that was
2009,
so not that far away.
and I had come back
I had decided to leave CNN
at
the end of 2004,
right around the tsunami actually
and then come to the Philippines
and head the largest network here
ABS-CBN
and at that point
from bin Laden to Facebook
were a lot of the ideas of
how information
cascades work. It was also, I didn't
realize until later, the radicalization.
Because the radicalization, the growth of extremism, which I tried to track in counterterrorism,
is now happening in the political world, which led to the third book.
And the book is called How to Stand Up to a Dictator.
It was released in November of 2022.
The same week that ChatGPT was released by OpenAI.
November.
Yeah.
So that book is really tracking the growth of journalism through my experiences, but then beyond that, the impact of technology on the information ecosystems globally.
I talk about, you know, the title is how to stand up to a dictator.
What were the two dictators that I was referring to?
One was the most powerful man in the Philippines throughout our history, Rodrigo Duterte at that point, because it took roughly six months to
crush our institutions and give him all power after he was elected.
But really, the more powerful dictator is Mark Zuckerberg of Facebook,
because at the time I was writing this book,
for six years in a row,
Filipinos had spent the most time online and on social media globally.
So we were like a petri dish for all the experiments,
that could be done on social media.
And that corruption of the information ecosystem
has changed everything.
And what we've seen, you know, in 2016,
I used to say that, you know,
what is happening to us is coming for you.
And you, at that point, I was in the United States.
And of course, January 6, 2021 happened, more violent than
anything we'd had in the Philippines.
But I also find it fascinating that we now have
Marcos Jr. as our president in the Philippines, and Indonesia, for all intents and purposes,
has our dear friend, now President Prabowo, right? So it's, are we...
Full circle.
Sure. Right. Right.
No, you know, we've grown used to this notion that democracy has been or is being
equated to algorithmic amplification. And you've been very passionate about this issue.
And I'm kind of disillusioned with the idea that the democratization of information was supposed to translate to the democratization of ideas, but it hasn't.
Right?
And how do you think this will reshape the future leadership in any dimension, whether it's in a house of worship, any kind of social institution, academia, policy, politics,
entrepreneurship, what have you.
We're seeing this tremendous tendency to sensationalize
based on the asymmetry of information,
based on, you know, you've used this phrase,
massive operationalization of misinformation.
We call that disinformation.
Talk about that, Maria.
Wow, let me begin with this:
I think of the information ecosystem like a river, right?
It's a river flowing downstream.
And what we see downstream,
let's think about it like the pollution of a river.
And there's a factory upstream that is spewing,
that is polluting the river.
And then what we're doing is we're here downstream,
like picking up a glass of water,
cleaning up the water,
and then dumping it back in the river.
That is downstream, right?
That's what fact checking does, content moderation.
What we need to do is to go all the way up to the factory and shut it down,
so it stops polluting the river, so that we can actually have clear public discussion,
debate, listen to each other, so that we can have what is necessary for a democracy.
So let's talk about the design that brings it there.
In 2018, MIT came out with a study that said that on social media, lies
spread six times faster than facts, because facts are really boring, right?
Journalists spend our entire lives. This is my, like I think I said, 38th year, right?
So it's been almost 40 years. We spend our entire careers trying to learn how to tell stories
to keep your attention. Now we're fighting against social media platforms that literally
their goal is to keep you scrolling. And in order to keep you scrolling,
they will feed you lies six times faster, six times faster.
Lies will spread.
And then what we saw in the Philippines is that if it is laced with fear, anger, and hate,
it spreads even faster.
That's globally.
That's what we've seen now.
So the incentive structure was turned upside down when it became a profit motive.
Right. That's one part. The second part is, in order to get to the business model of this,
as former trade minister, you would know this, right? Like in the end, it's the business model.
And the business model is surveillance capitalism. We didn't even get that name until 2019.
This means that these tech companies can take your data, combine it, right? If you're on the platform,
everything you post, say you have 500 posts on Facebook or on X on Twitter.
It will take all your posts and your relationships and anything else you may do on the platform,
quantified in data, and build a model of you that knows you better than you know yourself.
Why do I say that?
Because essentially, they will use a much faster computing model to be able to clone each of us on the platform.
So stop using the word model and let's use the word clone.
Once we are cloned, AI, so this is our first contact with AI,
AI then comes in and takes all of our models, our clones,
and puts that into a database.
This is the factory.
That is then sold.
This is the massive database used for micro-targeting.
Your weakest moment to
a message is sold to a company, that's called advertising, or to a country, that's called
information operations, your weakest moment to a message. This I began calling insidious manipulation
because, one, we as individual people don't know it's happening. And in the Nobel
lecture in 2021, I compared it to an atom bomb exploding in our information ecosystem. Because
literally we are like Pavlov's dogs, right? Because of A/B testing, which is what tech companies will do:
you know, they'll try this versus this and see which one keeps you scrolling longer.
So take one algorithm, for example, that is the same. Every single social media platform uses this,
which is the algorithm for growth, right? So Meta, what's the algorithm for growth? They found out
that if they recommend to each of us friends of friends,
that we will click on it because they're friends of friends
and grow our network and in the process grow their own network.
But that Friends of Friends algorithm is literally the algorithm
that polarized society.
So whether it's in the Philippines or in the United States,
so you have Duterte or you have Trump,
both of these in 2016.
In the Philippines, we didn't debate our facts.
But in 2016, when President Duterte won,
if you were pro-Duterte, that friends-of-friends algorithm moved you further right.
If you were anti-Duterte, you moved further left.
And over the years, again, replace Duterte with Trump, with Orban, with Bolsonaro.
That gap grew wider.
That's the polarization.
Sorry, there's a lot of stuff to say.
So this, I began to call this the corruption of the information ecosystem.
Because when you begin to reward lies, I mean, would you reward your child every time that person lies?
Every time your child lies.
What kind of person will your child become?
That is what has turned the value system of the world upside down.
And it is not a coincidence that as of January 2023, so this was a year ago, we're
waiting for the new results to come out.
72% of the world is now under authoritarian rule.
That is up from 60% in 2022.
So look at that gap.
And then 2024 is an extremely critical year
because about half of the world will be voting this year,
including the largest democracies.
You're in the United States.
The U.S. will vote in November.
But the polarization and the radicalization of all of us on these platforms is really cause for alarm.
Do we still have agency?
Or are we being insidiously manipulated?
And I think the last part is it doesn't end with that first contact of AI on social media.
In November 2022, generative AI was rolled out publicly, right?
ChatGPT was the first mover.
When it did this, this is large language models.
They thought, OpenAI thought that when they rolled it out,
that about 100,000 people would download and help them train their model.
But it became 100 million people training that large language model,
which essentially scooped up all the content on the internet.
That means the bad, right?
Whatever was there on social media already,
incentivizing the worst of human nature
was then pulled up into the large language model of OpenAI,
and into all of the ten major models out right now.
And the problem, of course, is that this now gives,
it can be the industrialization of fantasy, of fake.
I even hate to use the word fake news
because the other thing that's happened the last few years
is a flattening of meaning, right?
Democracy in China doesn't mean the same as democracy in Indonesia or in the Philippines or in the United States, right?
These are different.
All the nuance has been sucked out of the public sphere.
I'm okay with different interpretations of what democracies ought to be like.
What I'm not okay with would be the following:
the lie that travels six times faster than truth
and the continuation of this polarization of conversations, right?
And I go back to the earlier question,
what sort of hope do we have for ushering in a better kind of leadership in any dimension?
And using your metaphor of the atomic bomb,
you know, since then we've had seven decades or more of non-proliferation.
Yep.
Right?
What would be the equivalent of non-proliferation in the context of whatever we're seeing on social media?
At the rate that this has not been regulated, at the rate that the regulators don't understand what needs to be regulated and how it needs to be regulated.
And at the rate that, you know, you've referred to, you know, this term called colonization,
where there is an accumulation of value or wealth at the expense of the majority.
What's the hope for non-proliferation in the context of, you know, this asymmetry of information?
This is the tipping point, Gita.
I mean, this, right?
And part of it is because it was like a silent atom bomb.
Right? When the atom bomb was dropped, 140,000, 144,000 people were killed. So the devastation was clear.
This is, you know, and I'll use again a phrase that Osama bin Laden used, this is death by a thousand cuts.
And so, you know, we're like slashing and bleeding out, but each cut is small enough that we don't pay attention to it.
And yet the impact overall is huge. I think you mentioned two things, right?
So the abdication of responsibility for protecting the public sphere comes in two places.
The first are the big tech companies that are making significant amounts of money,
like incredible amounts of money.
And then the second abdication comes from democratic governments, because they have not protected us.
And I go to a government like the United States, for example.
Because what the big tech companies will say is, you know, but if we don't develop it, then China will.
That's like, it doesn't even compute, right?
So you're going to drop the atom bomb, which of course the U.S. did, but you're going to drop the atom bomb.
But now if you compare it to that time period, we've had like hundreds of atom bombs.
We would have already destroyed it.
Our information ecosystem has been destroyed.
It's hard for any government to govern well at this point in time.
because what has been destroyed is trust.
And the kinds of insidious manipulation that is happening,
information warfare, right?
I mean, look, again, not a coincidence that there is war again
in a way that we haven't had in a very long time
because Russia, for example, began information operations in 2014
first to seed metanarratives that it used to annex Crimea.
And then that was in 2014.
And then those same metanarratives were used by Russia to invade Ukraine itself in 2022.
We've seen something similar in the Philippines, kind of the changing of history in front of our eyes.
And it isn't just happening in the Philippines.
I see it happening in the United States, right?
Where history, I mean, remember the South, the Civil War in the United States,
and how that is also being fomented again.
And Black Lives Matter,
how that was actually being pounded on from both sides,
Black Lives Matter being targeted on both sides
by information operations from outside the United States.
So insidious manipulation, that's first.
Why can't leadership do this?
In the past,
a good leader takes conflicting sides and brings them together, bridges the gap.
Today, it becomes even harder and at times impossible.
I talked about Ukraine.
Now we've got the Middle East, right, in ways that would not have been possible before.
And that war in the Middle East, what's happening in Gaza, is now polarizing generations:
the older generation, I mean, let's talk about the United States alone.
There's a gap between the older generation and Generation Z,
which is seeing both real violence on social media,
but also propaganda on all sides.
And they are being, our emotions are being played.
And frankly, I mean, we should have stopped when South Africa asked for
an end to the violence, because you're talking about almost 30,000 people killed.
How can we stand by and watch all of this?
Let me not go into Gaza now, right?
So the second thing that you mentioned is colonization,
and that this is a little bit easier to deal with,
because in the end, these algorithms, the code that is now ruling the world,
was created in Silicon Valley and in China.
So these are the two, but it originally was the United States.
And then who did the work, the content moderation, to clean up the machine? That came from
the global south. That came from the Philippines, from India, from Indonesia, from Africa, right?
This is Christopher Wylie, who really said that colonialism never died. It just moved
online. And then the harder part here is that the very workers who now have to,
A, be exposed to the violence, and B, clean up the large language models for the global north, are coming from the global south.
So we're twice colonized.
It's an information system that we need to fix.
And what's the hope?
That's your question.
The hope is that governments, democratic governments, wake up before it's too late.
I laugh when I say this, but the EU, the European Union, has
won the race of the turtles. It has put in place the Digital Services Act, which kicked in,
fully implemented, last Friday. I was at the Munich Security Conference when that happened,
right? So all of those measures are in place. But even as that kicked in, the big tech companies
didn't address these issues. What they did is they leapfrogged to large language models.
Right. So I think that's our problem is that you need to protect people, whether it's the Better Business Bureau in the United States or whether it was FDR's New Deal, right? The governments actually need to clean up the public sphere. And the difficulty in this is that the big tech companies argue that they stand for free speech when in the end,
that is a lie, because free speech is being used to stifle free speech.
When you have an attack coming at you and saying a lie a million times on social media, it becomes a fact.
And this is the problem of the breakup of our shared reality.
Yeah.
Well, it's really not free speech.
It's paid speech at the rate that you're susceptible to this clickbait syndrome.
And what does that do? There's a degradation of journalism, right? Like the incentive structure is upside down; it's actually ranged against good journalism. So you have, you know, several things happening at the same time. The business model of journalism is dead. This is advertising: the ROI is cheaper if you go to micro-targeting. And yet that is manipulative and is killing information you need. So the first is the business model is dead. The second
is that news is distributed on the same social media platforms.
The world's largest distributor of news up until last year was Meta, was Facebook.
And last year they began choking traffic to news organizations.
This is a problem with all the elections coming up.
So instead of fixing the design of the platform,
they just decided to take away news.
I mean, we on these platforms are more susceptible to disinformation,
to getting lies
spread to us, by design.
And then I think the last part is
if you're degrading or commodifying news,
then what happens?
How are you going to get your information?
Because even with large language models,
with generative AI,
no one is going to be able to tell you
exactly what's happening at this moment in time
except real people, journalists.
And if that doesn't survive,
then it's a hop, skip, and a jump to the destruction of democracy as well.
Because for journalists, our standards and ethics should push us.
We try to hold power to account.
Not always successfully, right?
Because some of our countries in Southeast Asia, where corruption is endemic,
it flows into every part of society, including journalism.
But right now, it is this tipping point moment where we need to help it survive.
So last thing on this is that, you know, in 2019, I decided to stop being a punching bag and then to try to do something about this.
So we created something called the International Fund for Public Interest Media.
It is to go to governments that have ODAs, overseas development assistance funds.
And we pointed out to them that only 0.3% of ODA goes to independent media.
If we can just raise that to 1%, that would become a billion dollars a year.
So in the first year, we raised maybe $50 million,
and we've now started giving out the grants,
particularly for the Global South, to help independent media survive this time period.
That's not a cure for everything because we still have to do so much more.
But the other part is governments.
I feel sorry for anyone in government today who doesn't want to
lie, who doesn't want to instigate fear, anger, or hate. It is difficult. And I think, you know,
there's a reason, again, why the kleptocrats are coming together globally and changing our
world for the worse. I don't want us to lose our faith in humanity because the incentive
structure of the distribution of information is the worst of humanity. It triggers the worst of
who we are. But in the end, I've seen so much good in the worst of times.
Yeah.
Right? So we have to find our way back to that.
I want to test the following observation, if not hypothesis. The first one is
the internet, right? The internet was supposed to be the greatest democratizer, right?
it did democratize information.
It didn't democratize capital.
It didn't democratize ideas.
It didn't democratize a lot of things that could have led to the creation of public goods for the many.
Right.
And as a result of which, empirically, we're seeing rising Gini coefficients,
you know, economic inequality, social inequality, cultural inequality, value inequality, right?
We've seen a divergence of values in many places around the world, both in the global north and the global south.
Then we're confronted with this thing called AI, which in my view, it's not a challenge with respect to software.
It's a challenge more with scalability.
When one talks about scalability, it's an engineering undertaking.
When one talks about an engineering undertaking, it requires capital.
So it's only going to further elitize and unequalize what the pre-existing unequal landscape would have been.
So where is this all going to end up?
Right.
We're talking about basically having to accumulate trillions to fight against this behemoth that's got the economic wherewithal,
the technological wherewithal; that's into trillions of dollars.
Yes, into trillions.
I mean, you pointed out the right thing, and this is exactly your wheelhouse, which is, in the end,
it's the economics, right?
And particularly post-COVID, the rich got richer, the poor got poorer, the gap has
widened all around the world, and that is fuel for the fear, anger, and hate.
Right, and this all fed the same thing. But the first question really was: were our economics working?
And, you know, this kind of market-driven economics, was it working? And in many
instances we were actually seeing that it wasn't. Especially if you look at the
S&P 500, you know, it's driven now by tech, right? In 2023, tens of
billions of dollars were invested into generative AI, generative AI with massive environmental costs, and it
still doesn't really have a purpose yet, right? It's quite speculative, but because it is money-driven,
it is driven, it's power and money, right? So in that sense, you're absolutely correct. And then
let me use the right word to see how that connects to our governance structure.
It is the rise of illiberalism, I think, that connects to that, partly because the information
ecosystem as designed then was able to fuel the anger and hate.
This is why former President Trump has such a solid base of support.
It is why the far right is also rising, right?
because those real world inequalities are also rising.
So that's connected.
You talked about artificial intelligence.
You know, artificial intelligence is neither artificial, nor is it intelligent.
It's a nice little phrase, but in the end, this captures what has already been created, replicates it.
It's, you know, garbage in is garbage out.
It doesn't create new things.
And much as it's fun to use ChatGPT or Claude or Gemini, whatever we want to call these things, they still hallucinate.
Right.
The best uses, and this I do have to say, have been in the medical field and in CRISPR technology.
Because if you look at DeepMind, which was bought by Google, it's not just the machine behind the search.
It's also the machine behind synthetic biology.
When CRISPR technology was discovered,
governments realized that we cannot play God,
and they put guardrails in place relatively quickly.
But we did not do that in information,
which manipulates our emotions,
which changes the way we look at the world,
which changes the way we vote, right?
Emotions is the way in.
So the solution for all of this, right?
Like, what do we do?
I think the first thing is, and this is what we've been trying to push forward, is something we came up with a year or two ago now, Dmitry Muratov and I.
Dmitry is the Russian journalist who heads Novaya Gazeta.
We were the ones who won the Nobel Peace Prize in 2021.
We did a 10-point action plan for what we were seeing, right?
And that has now been signed by over 300 Nobel laureates, civil society groups all around the world.
and it comes down to three buckets.
The first is to stop surveillance for profit.
This is a business model.
It's not just extractive.
It is corrosive.
It steals our data and insidiously manipulates us.
So that's the first.
For big profits for the tech companies, right?
So that's stop surveillance for profit.
The second is stop coded bias.
We talked about this a little bit, about code from the global south. Well, with coded bias in our virtual worlds right now, if you are a woman,
LGBTQ+, if you're Filipino or Indonesian, if you're brown, if you're marginalized in the real
world, you are further marginalized online, because it is coded into what rules our world.
So stop coded bias. And the third is journalism as an antidote to tyranny, right? Because
for all intents and purposes,
tech has never come out
with a set of standards and ethics,
the values it lives by.
So for the journalism in Rappler,
we've now built our own tech.
We're building our own
Matrix-protocol chat app
that will protect our community
from insidious manipulation.
But let me end that last part with this, right?
In the end,
this extractive model, this isn't the first time this has happened in history. Human beings have been commodified before, in the age of industrialization. And I remember this from history books when I was still in school. The first thing of human beings that was commodified was labor. That was during the age of industrialization, when you had the robber barons, you know, the Carnegies, the Rockefellers, when they started forming factory lines.
What did we have? Exploitation of people.
Child labor, sweatshops.
And what happened?
The government stepped in and stopped it.
Unions were formed, right?
We're just slow right now because what has been commodified is our emotions, our attention.
And you're already seeing the impact on this generation.
Gen Z, right?
Well, it took a long time for the U.S. Surgeon General to come out with a report, but he finally did in May
last year. And he pointed out in that report that teenagers now have higher levels of suicide,
higher levels of depression, higher levels of sleeplessness. And hand in hand with that,
he pointed out that there is an epidemic of loneliness. Right. Behavior that is
designed to be addictive is robbing us of happiness.
And so what generation are we creating now?
And the last part, of course, is that the world is at an existential point for climate change.
How can we solve climate change issues if we don't have the information to do that?
That's part of what has pushed back: are the revenues of big tech companies worth all of this?
Absolutely not. Do we have the power to change it? We do, but civil society is bottom up.
Journalists are under attack all around the world. Governments do have the power to do this,
but it's a tough time because we're extremely polarized, marginalized with real world violence now.
In a way we've not seen since World War II.
You know, you've talked in previous conversations about the negligible amount
allocated by social media platforms for content moderation purposes.
And I don't think that's been taken note of by the regulators or, you know, a large chunk of civil society.
What sort of hope do we have?
And I want to take you to some of the prescriptive ideas that you've articulated before with respect to, you know,
investing in the long term in education, in the mid term in legislation, and in the short term in journalism.
Talk about these.
I would say more than journalism, because news organizations are extremely weak right now.
We don't have, I mean, look at the United States at the beginning of 2024, right?
The L.A. Times closed its Washington Bureau.
You have, you know, so many journalists have been laid off now.
So I would say in the short term, it is a whole of society approach.
A phrase I've heard in Indonesia: it is time to stop being a user of the tech and become a citizen again, right?
Because our responsibilities as citizens don't stop with voting.
It isn't just your vote, which now is manipulated, right?
But it is about, I guess this is it.
This year in particular is the moment where you have to figure out what is important to you.
What are your values?
Are you going to stand up and fight for democracy?
Or are you going to be okay, right?
Because sometimes, I mean, we've seen this in our countries.
Democracy is chaotic.
It's too much work.
And so they'd rather, I've seen, you know, wish
for a strongman ruler, because then you don't have to think about it.
But hand in hand with that means that if you don't get a voice in how your world is run,
then if you get targeted, you have no one to run to.
Right?
It's a tough one.
But again, this is the moment we're in.
So yes, in the short term, a whole of society approach, we've seen this.
We need to mobilize beyond elections.
And the first step is really to demand guardrails around tech.
It's interesting, because at the Munich Security Conference, Microsoft led the big tech companies into an agreement to stop the industrialization of generative AI.
Well, that just means that they want to mark what is created by large language models.
But they didn't say anything about the design or
the distribution. And that remains. So that's the first. I think we need to organize and demand better.
The second one is our educational system. In almost all of our countries, history is shifting,
is being shifted, because how many more flat-earthers believe this now, right? Like a very,
very simple thing in the United States. But let's talk about something simple, a document that came
from Meta. During the 2020 elections, there was a very, very small change in code, and this is shifting to
bundled invitations, bulk invitations. That is, if they allow bulk invitations,
you grow the platform faster, but you also then have exponential growth
for all the groups.
So what did it show?
This document showed that after the 2020 elections,
when Meta took away the guardrails
it had put in place for the elections,
they then allowed bulk invitations again.
And there was a little-known group called Stop the Steal
that grew at 400,000 a day.
Wow.
So the violence on Capitol Hill
was directly connected to the
growth of these groups, whose planning led to the chaos and the violence on Capitol Hill.
So these are, I mean, but it isn't the first time we've seen this, right?
Meta and the UN itself: Marzuki Darusman went into Myanmar.
And he came out with a report that said that this platform enabled genocide.
Yeah.
That was in 2018, right?
Yeah.
Again, we have the evidence and we haven't done enough.
But legislation, yes, Europe also has an AI Act,
but that hasn't stopped the experimentation on real people in real life.
And then the long term, of course, is the education;
right now we're just teaching our people to distrust everything.
Look, I'm with you in the sense that the Europeans are winning the turtle race, right?
Yes.
I'm aware of time here.
The reason, in my view, as to why the Europeans are winning the turtle race
is that they have not been a big beneficiary of the software game, right?
Whereas the Chinese and the Americans have been the biggest winners,
and this tendency or propensity to cultivate more than to question,
I think, is a structural risk to humanity.
Yeah. Right. So how do you infuse the right political culture in the household, in the places of worship, in the social institutions, in the office, in the schools, and the nightclubs that you go to so that you have the right conversation, you have the right discussion, you have the right discourse, so that, you know, we can achieve whatever you've suggested that we ought to achieve to get better?
Oh my gosh.
Well, first you brought up.
It's tough, man.
Yeah, yeah.
You brought it up.
So here's the other thing that happened, right?
Because the incentive structures of our communications, of our public sphere, are bringing out our worst.
We've also seen a rise in cults globally, right?
Faith becomes incredibly important.
But you have to, I have faith in the goodness of human nature.
So this, again, right?
So, starting last year:
I was a wayward Catholic. I've become a better Catholic now because Pope Francis pulled
together 30 Nobel laureates to begin to look at this problem. We need faith. And faith groups are
incredibly important, but cults exploit it. So we've seen this growth in different parts of the
world, right? And oftentimes, the far right is connected to religion, like a very extreme
form of religion. Anyway, so what you said, how do we bring it forward? This is where civil society
plays a large role. And I would really say this, especially in the United States, right? We need to
decide what world we want, because we are on the back foot. We're playing defense against
tech, very wealthy, powerful, and moneyed companies. Again, I compare this sometimes to tobacco companies.
Tobacco companies knew a decade before the public knew that what they were selling was
killing people, and yet that wasn't pushed out. Except the problem today is that the tobacco was killing
one by one, you know, it was killing one by one; the tech is killing at scale, right? Genocide. This is a problem:
it exacerbates the fault lines that were already there in society and tears us apart even further.
So what can civil society do?
I think this is what we are all trying to figure out, in the same way that the Western world, NATO, came
together against Russia in Ukraine.
That was certainly one of the things we saw at the Munich Security Conference.
this is the time to draw your lines, right?
What are you willing to accept?
And I guess what's the incentive to do that?
Because this is the moment.
And for me, it's been really clear for a while.
I no longer have an editorial role in Rappler, right?
I run our tech and our business.
And then I actually look far more at global policies.
I think it's also strange to me that the heads of news organizations are not involved
in the policy debates that influence not just the information ecosystem but every
part of society.
So hopefully we all realize it. The problem is that what we're talking about, civil
society, moves at the pace of human comprehension and human empathy.
But what we are fighting, technology, moves at the speed of light.
And so it's been a hard catch-up, but more and more, I mean, look, we began talking about this, showing our data in 2016.
It's 2024.
In the Nobel lecture in 2021, I said that the United States needed to reform or revoke Section 230 of the 1996 Communications Decency Act.
That's a very simple thing, because the minute you make the tech companies responsible
for what they are feeding us, then it will stop.
Right now they're not.
It's absolute impunity.
They can bring in, you know, they can fill the house full of terrorists and it is okay.
Maria, I know you got to go, but I got two final little questions.
The first one relates to how you dealt with the most difficult of times, in terms of how you were so resolute in being able to draw the line,
in being able to embrace the deepest of your fears, in terms of, you know, your ability to weather the mob.
Briefly talk about those.
I think, Gita, that's part of what getting older is like, right?
You know, Indonesia had taught me so much, and reporting had taught me so much: you have to make split-second decisions without all the information.
And then you have to course correct as you go.
But when you're making these decisions, what guides them?
It's your values.
And I think that's what I tried to do in "How to Stand Up to a Dictator": I realized at this moment, because I was heading a news organization, that if I buckled, we buckled for our entire country, not just the industry.
And so I felt there was a lot at stake.
And then it becomes just coping mechanisms.
You know, one, I wasn't alone.
I had three amazing co-founders.
In many ways, I was very well placed to be the one targeted,
because I had, you know, in working with CNN for two decades,
many of my former colleagues who knew me personally and my work
were now heading news organizations in different parts of the world.
So it was easy to actually say it;
they were the ones who actually showed
that injustice was happening in the Philippines.
But I think beyond that is that I'm old enough that I've lived a life of no regrets.
And I didn't want to look back a decade later and realize that I had, like, frittered my values away.
I've thrown away what I believe in because it was more comfortable because I was afraid.
So, you know, in the book I talk about how there are two things:
embrace your fear and then plan for it.
So you embrace whatever it is you're most afraid of.
And we did, me personally and my organization.
And then we workflowed the worst fears.
When you workflow it, and in the case of Rappler,
we drilled, we drilled it.
We did quarterly drills.
It takes the sting out of it.
And we can move forward.
Like in Rappler, for example, one of the drills we did is a shutdown.
Because who knew that I would get arrested in the middle, well, it was 5 p.m., right?
So I was shocked when that happened, and that's when I realized, oh, my God, this is not the old world that I lived in.
So I just, I keep track.
So what's the worst thing that can happen?
And what will I do if that happens?
If you have a path forward, then you can hold the line.
Then you can stand your ground.
Because in the end, what is happening to us, and this is tech-enabled: here's what our
Constitution says.
These are our rights.
It felt like there was a bulldozer trying to make us voluntarily step back to give up a big
chunk of rights.
And what we did is we just linked arms and said, no, we're standing firm.
That's what civil society needs to do.
That's what we need to do to uphold these rights.
If we don't, we just have to be aware we will lose them.
And the world moves in a completely different direction.
Final one.
What makes you happy?
Oh, my God.
What makes me happy?
All the cases getting dismissed.
I mean, you know, I had 11 criminal charges at one point. Criminal charges po.
I was like, oh my God.
Don't call me po.
Oh, sorry, sorry. So, you see, that's part of our culture.
But philosophically, what makes you happy?
Oh, my gosh, so much. Like, I'll say, the worst of times also brought the best of
times, because in the worst of times for us, we thought we were alone, but, my
God, I saw the goodness of human nature, right? When I was first arrested, there was a woman
doctor who had worked with the UN and she came to where I was being arrested and, like, stopped
me as the police were bringing me in and said, I'm coming with you. I'm your doctor because she knew
that the most vulnerable moment is when you're stripped down, right, after you're arrested
and you go through a medical checkup. But if you have your doctor with you, then you're safe;
you have an extra layer.
I mean, I remember her because she died soon after.
But there were so many moments like that.
Look, I'm getting emotional.
Seeing old friends, you know, I don't know.
In Indonesia, this is the funniest part, because someone like Tom Lembong,
who I knew before he joined government,
was the voice of Suharto in
all of the live coverage of Suharto in 1998, and of Megawati.
He was my simultaneous translator.
And for him to go on and have your post, right, to become trade minister,
was also fascinating to me.
Getting older, seeing progress makes me happy.
Okay, but the last one, please get rid of all of my cases.
I didn't do anything.
That will make me very happy.
Maria, you've been great.
and so kind with your time.
No, thank you.
It is good to speak with you again,
and I might definitely be in touch for other things,
because I think, you know,
this year becomes critically important.
Absolutely.
Thank you so much.
Thank you very much.
Same.
That was Maria Ressa,
the leader of Rappler.
Thank you.
This is Endgame.
