The Decibel - Introducing: Machines Like Us
Episode Date: May 21, 2024
In the last few years, artificial intelligence has gone from a novelty to perhaps the most influential technology we've ever seen. The people building AI are convinced that it will eradicate disease, turbocharge productivity, and solve climate change. It feels like we're on the cusp of a profound societal transformation. Fifteen years ago, there was a similar wave of optimism around social media: it was going to connect the world, catalyze social movements and spur innovation. It may have done some of these things. But it also made us lonelier, angrier, and occasionally detached from reality.
Few people understand this trajectory better than Maria Ressa. Ressa is a Filipino journalist, and the CEO of a news organization called Rappler. Like many people, she was once a fervent believer in the power of social media. Then she saw how it could be abused. In 2016, she reported on how Rodrigo Duterte, then president of the Philippines, had weaponized Facebook in the election he'd just won. After publishing those stories, Ressa became a target herself, and her inbox was flooded with death threats. In 2021, she won the Nobel Peace Prize.
As novel as AI is, it has undoubtedly been shaped by the technologies, the business models, and the CEOs that came before it. And Ressa thinks we're about to repeat the mistakes we made with social media all over again.
Transcript
Hi, everyone. Today, we're bringing you an episode of a new podcast from The Globe.
It's called Machines Like Us, and it explores the ways that artificial intelligence is changing
our world. It's hosted by McGill's Taylor Owen. He talks to the lawmakers, scholars,
and entrepreneurs who are shaping the future. The first episode is a conversation with journalist Maria Ressa.
She's the founder of the online news site Rappler, and she was awarded the Nobel Peace
Prize in 2021.
Hope you enjoy it.
So if I had asked you what you thought about AI two or three years ago,
I'm guessing you wouldn't have had strong feelings.
But now AI is everywhere.
Your office has a new AI policy.
Your kid wants to use ChatGPT to write an essay.
Or maybe you're worried about AI taking your job.
And the one thing that's become really clear to me
is that there's a ton of uncertainty about where this is all going.
But if you talk to the people building AI,
you won't hear this uncertainty at all.
They're convinced AI is going to solve climate change and cure cancer,
that it'll mean the end of menial jobs
and the start of a new era of economic prosperity.
In some ways, it's a really exciting vision of the future.
And I think parts of it are probably accurate.
But this isn't the first time we've been told the tech was going to solve all our problems.
And the last time I checked, we still have some.
Hi, I'm Taylor Owen.
From the Globe and Mail, this is Machines Like Us.
So before the AI boom, I spent more than a decade studying social media.
And by now, we all kind of know that social media has made us angrier, siloed us from one
another, and occasionally untethered people from reality. But in the early days of social media,
people were really optimistic. Silicon Valley was going to connect the world,
catalyze social movements, and spur innovation. This show is going to be about the future, but I don't think we can fully
understand the moment we're in without considering how we got here. And few people understand that
trajectory better than Maria Ressa. Ressa is a Filipino journalist and the CEO of a news
organization called Rappler. In 2021, she won the Nobel Peace Prize for her unflinching coverage
of Filipino strongman Rodrigo Duterte. Now, back when she founded Rappler in 2012,
the Philippines was the most online country in the world. And Ressa thought social media was
going to empower citizens and revolutionize her journalism. But that's not exactly how things turned out.
After she became the victim of an online harassment campaign,
Ressa became one of social media's most powerful critics.
And now, she wants to talk about AI.
Maria Ressa, welcome to Machines Like Us.
Machines Like Us. I like that. That's interesting.
So the last time we spoke in 2020, you were under the threat of life in prison.
Yeah.
Also under a different leader of the Philippines.
So are you in the clear now?
Can you just update on that?
And what's it like under Marcos?
Oh, wow.
So all of these legal threats began in, oh, I would say 2017.
That followed the year of intense online attacks, right?
And then the investigations began.
There were 21 investigations
and then 11 criminal charges for me.
It was Amal Clooney who then said,
oh, that's like 103 years in prison total.
When we spoke, I had just been convicted,
I think, for cyber libel.
Out of all of that, there are two criminal charges left. One is cyber libel, which, since I'm technically traveling, I'm not supposed to talk about. That, though, can send me to jail
for at least seven years. And then there's a second case that could still potentially shut
Rappler down. That's at the Court of Appeals in the Philippines. For me, it means I still have to
file for court approvals to travel. Every time you leave the country?
Every time. Every time I leave the country. It's a good reminder, right, how quickly you can lose
basic rights.
What is it like?
So let's just say under the Duterte administration, we were in hell, and now we're in purgatory. We've been acquitted in most of the cases. There are two left. I hope that, you know, we get
justice soon because this does eat up your resources, your time, your money, your energy,
right? But in general, in the Philippines, it's better than it used to be, a little more chaotic.
Civil society has free rein.
But, you know, Duterte's daughter, Sarah Duterte, is the vice president who ran with Ferdinand Marcos Jr., who won overwhelmingly in our May 2022 elections.
What a flashback to the past.
I mean, it's just remarkable.
I mean, but the other part is, like, it made me realize that it's not just us,
right?
I went to Charleston, and I looked at, you know, the monuments to the Confederacy.
I went to different parts of the South where the statues were taken down.
In America, the losers don't go away, right?
There are these narratives.
They have their own establishments, organizations.
And what the internet has done, what social media has allowed to happen is that they have been able to come together.
And our shared reality now
is fractured. And you spent the last few years raising awareness about how social media might
be contributing to that very fracturing, how it could be polarizing us and making us more
radical even. And now with AI, we're seeing another round of technology that could make some of these problems worse.
And I'm wondering what you make of this latest generation of technology.
And even what you make of the people who are building it.
I've become far more cynical because I think what I used to dismiss as naivete now is quite dangerous for the world. And generative AI, you know, it's funny,
it makes me crazy. Well, first, let's say that in How to Stand Up to a Dictator, the book that I
wrote, there were two men that I actually focused on, the dictator, right, like Duterte. But the bigger one was Mark Zuckerberg, right, tech,
because his power is global in scope. And it makes me crazy that there was absolutely no
responsibility for it. The new round of people, if you look at it, these are similar companies,
they're the same companies, American companies.
Yeah, yeah.
If you look at the large language models, right, which is what generative AI is, large language models in some cases, largely in the medical field, work, right?
Which creates the promise and the-
Which creates the promise, but they jump into it.
They jump without any evidence.
When you talk about the information ecosystem,
it's a whole different ball game
from doing generative AI for radiology, right?
Which is very specific.
These are different things.
And you said something really insightful
about Zuckerberg in your book
that it wasn't a personal flaw.
He was a manifestation of the iterative software development process, which I thought was so striking.
And I wonder if we're doing that with AI, too.
This, like, ChatGPT 3, 3.5, 3.5 Turbo, 4: we're just iterating rather than pausing and thinking and then deploying.
Is it the same thing happening there?
Absolutely.
I think we're making the same mistake all over again, right?
But think about it like this.
As they iterate, why are they not responsible for the harms?
Of the previous iteration.
Not just social media harms, right, which we're still struggling to make them responsible for that. But of the harms that are happening now,
there's an AI startup called Replika,
which is offering you, like, you will have a constant companion.
Like, if the first generation AI and social media
weaponized our fear, anger, and hate,
this one is going to weaponize our loneliness.
And this is where governments, again, cannot abdicate responsibility.
They cannot cede their power to regulate to the big tech companies.
During COVID, for example, why didn't the drug companies just go to a town square and test out their vaccines?
Oh, on this side of the town square, I'm going to test vaccine A. On this
side, I'm going to test vaccine B. Oh no, vaccine A people died. So sorry, we'll go with vaccine B.
These harms that are happening now, they're already being documented, and yet no one is responsible. So the first step is accountability, right? Every day that democratic governments do not exercise their power to regulate, they continue to cede it to big tech, now in the age of generative AI. If they fall for the lobbying, this idea that they don't understand it: they don't have to understand it. What they have to prevent are the harms. They need to step in and do this now before it gets worse.
And it took us a decade to figure out what sorts of regulatory tools and levers might help with
social platforms. Why do you think it took so long? Why did it take so long?
I mean, I think these are big, complicated companies that touch on so many aspects of our
lives and are embedded in our society in such profound ways that there was a lot of nervousness
to take them on and to take those levers. But like for social media, we now say like the Digital
Services Act says you have to do risk assessments, right, on your product before you launch them on
society. The Canadian Online Harms Act does the same.
Should we be doing that same kind of thing for AI?
Does that question need to be asked?
Right?
This is what I mean that these responsibilities, you must protect the public sphere.
Right?
What is happening right now that nothing has been done yet, right? As of January this year, there's an academic paper that actually came out and said that
57.1% of the internet is now low quality content, meaning that it is LLM generated or LLM translated,
right?
So everything.
So in my Nobel lecture in 2021, I called social media toxic sludge. Now what's going to happen is that you are going to have a virtual world full of not just toxic sludge, but people will not know what's real and what isn't. And that will destroy trust even more. Societies that don't, I mean, I've said this so many times:
Without facts, you can't have truth.
Without truth, you can't have trust.
Without these, you can't have democracy.
Yeah, right?
Like, think about it: journalism is going to die in this age.
So if, as of January 2024, 57.1% of the internet is low quality content, what happens when people
tune out, when they distrust everything? That was actually what the Russians wanted to do, right?
How are you going to get, 2024 is an election year, how are we going to get people to care
and understand that despite the crap they are wading through, sorry,
that this is the moment when we must organize ourselves,
our own communities, to stand up for the values and the principles that are critical.
I'm looking to Canada.
Like, Canada can pave the way in a better way.
I know there's a lot of doom and gloom,
but what does a world without manipulative tech look like?
Can we not make it better?
Well, I mean, despite everything you've lived through,
you remain the most optimistic person I've ever met.
And you are still optimistic about technology.
So how do you look at these sets of tools, AI, all these things that are developing,
and see them not just as a potential vulnerability, but as a tool for us to use
to ensure things like the reliability of information, the survival of journalism, the integrity of
information?
So how could that work in your view?
Well, we have to build it in practice, right?
So Rappler has been, we've been a punching bag since 2016.
I've personally been a punching bag, right?
90 hate messages per hour can screw with your head.
Oops, sorry.
But look, what we learned, if you're a digital news site, there are only three ways you get traffic.
It's direct, it's social, or search, right?
And when we came under attack in 2016, I realized how there was no integrity in social.
So we focused on search.
So Rappler is roughly 60% search traffic.
Still, and it still works.
Well, here's the problem, right?
So last year, Meta began choking traffic to news globally, right? Well,
you don't have the stats in Canada anymore because it stopped. But what Meta did, like
January to June last year, a company like The Guardian lost more than 75% of its referral traffic from Meta. This is all around the world, anywhere from a 50% drop to a 90, 95% drop in traffic from the world's largest
distributor of news.
That's what Meta used to be.
They just arbitrarily decided, I don't like news.
I don't like being held accountable, so I'm not going to distribute it.
Or it's not of commercial value to us at this moment.
Therefore, we will throttle it down.
Despite the role of facts in societies.
And then finally, search, right?
What happened when AI, when ChatGPT walked in in November 2022, when they rolled it out,
they began an arms race. And so now you have 10 different large language models. When you look at
all 10 of those, none of them are transparent in what data they fed the machine. Stanford did a
study that showed this,
right? They all failed in terms of transparency. Having said that, once Search Generative Experience really kicks in, that will kill search traffic to news sites. So what do we do? We started building
our own tech, right? And building tech for the public information ecosystem that ensures integrity of information, that should be government's job, frankly.
But it's not doing it, right?
They outsource this to private companies driven by profit.
So this is why we're where we are.
But in Rappler, I was like, people need information.
People want to be in a space where they're not manipulated,
where we're not killing each other, right? So we rolled it out very quietly. It's a Matrix protocol
chat app. It's open source. It's secure and encrypted. And we are not manipulating you with
any algorithms of distribution, right? So it's funny, when I was building it, my CTO would say, Maria,
you have to personalize this, personalize this. Every time you personalize, you tear that person
away from the public sphere. That's a design choice. It is. And because you've built this
Matrix protocol chat app, which, as I understand it, is like a cross between a news app and a
community message board. I mean, you're now the
one making these design choices, not just living with them. That's the critical part. We always
built our own platform, but we outsourced to social media. I will not outsource to AI now.
We're partners with everybody. I want to understand what they're doing. But with the
Matrix protocol chat app, not only are we able to bring our people together, and it is civic engagement in the true sense of the word, there are other things we can do now. We can do, for example, something that we did before the Duterte administration, Project Agos, which is where you have your community helping during times of crisis, like typhoons, right? It's an experiment in motion. We rolled it out quietly last Christmas.
It's working in the Philippines.
It will work for us.
We will be prepared for our elections,
but we can do this globally.
Well, and like you were,
despite your energy and the success of Rappler,
you are still a smallish private organization
building and experimenting.
Is there a, what, how should governments be thinking about their place in democratic technologies or the development of AI itself?
It's kind of like building roads, right? Private companies can build roads. You have public-private
cooperation in it. But this is not just roads, right? This literally touches the hearts
and minds of every person. This is the reason why information warfare can hit at the cellular level
of a democracy, because we've allowed private companies to do this. How should governments
be thinking about it? Look, I think the first step here is for governments to realize they do not want an Elon Musk, a person who has no accountability, determining whether
Ukraine will be able to fight back against Russia, right? Fight back against a Russian drone. Will they
have Starlink? So I think the first step is democratic governments cannot abdicate their
responsibility to protect their citizens. That's the first step. You have to own it. You cannot
outsource it to these big tech companies, and you must limit their powers. Otherwise,
every day that governments do not act, they lose more and more of their power.
So govern and build.
Yes.
I mean, build in the public interest.
Yeah, because in the end, when we do this collectively, right?
Like right now, every news organization is going to need distribution.
But I don't understand this.
And I've now dealt with so many governments.
And they feel like this is beyond their comprehension.
Or maybe they believe the tens of millions of dollars in lobbying that the tech companies
do.
They can't possibly build this.
Maybe they don't want to, but then what they need to do is they need to protect.
And empower the people who are building in the public interest in meaningful ways.
Right.
If they're not going to do it.
Right.
If they're not going to do it, limit them so that they're not experimenting in public, and make them responsible for the harms that they've created.
You've said many times we're in the last moments
of democracy potentially this year.
What do you mean by that?
And why is it such a perilous moment again right now?
You know, in the Nobel lecture I said, if you don't have integrity of facts, you cannot have integrity of elections.
If we as people are being insidiously manipulated by Russian, Chinese, Iranian information warfare.
And you're seeing this play out now in Gaza, right?
And the target of this, the target are the kids. And I've seen this now on TikTok, for example, right? They're trying to unite the far left and the far right by hitting anti-imperialist narratives for kids.
Who is they? So I can't attribute yet to which country it is, but it is, let's say we're seeing it happen in the United States, we're seeing it happen in the Philippines. I'm a target. That's the reason I know, right? It's so ironic that I am both CIA and a communist. I just don't understand how that's possible.
You're not the first to wear those hats. But this is, you know, what they're doing is, if our young kids believe
that, you know, America is so broken that you shouldn't even vote, well, that's an interesting
choice, isn't it? In 2021, I said 2024
was going to be a tipping point year, because half of the world votes, more than 60 elections in 50
countries around the world, and our information ecosystem is corrupted. We are being manipulated.
How do we make our choices? Why did violence happen on January 6 in the United States, on January 8 in Brazil, right? Like, this is not a coincidence. It is by design. So this is it. This is the year we need to take our agency back. And I continue to appeal to those who have the power to change things right now, which are the big tech companies and the people who control them: have enlightened self-interest. Do you really want short-term profit
over the death of democracy? I mean, we're already a quarter of the way through the year,
and we've had a few elections already. And what gives you hope that we can make these sort of big
changes this year? Poland! That's an easy one.
So like, let's just look.
This is why like immediately in the short term, you have to appeal to the people themselves.
Civil society, civic engagement is what will take us through this.
And Poland is a perfect example, right?
Like, there was a right-wing government that should have won, right?
They were all set to win. But then the government in Poland passed an abortion law
that was so brutal that it brought women and youth
out on the streets, and they voted, right?
So we are, let me say this,
we are democratically electing authoritarian-style leaders,
except in countries where citizens feel their back is up against the wall.
There's a tipping point there, you think?
You have to walk into the real world, right?
You have to walk into the real world.
And if we're just fighting on Twitter, that's not enough.
It's a waste of energy.
But when our rights are taken and we feel threatened.
And it's happening all around the world, right?
So I think that's part of it.
But we need to do more.
We need a whole of society approach.
But again, the tech has changed so fast.
The tech has leapfrogged regulations and is still unaccountable.
America has no meaningful regulations.
This is the most unregulated industry, right? How insane is that?
I mean, it boggles the mind.
Oh, last, sorry, on the personalization part, where I was saying that every design choice for personalization tears apart the public sphere, right? When you do that, and every
person, let's say we're in a room full of 25 people, and you give every one of those 25 people their own reality, what they
want, right? That's not a room where we're all together. That's an insane asylum. This is the
world we're building. If you're listening to this, pretty please, get up off your chair,
talk to your family and friends. This is the year that matters.
So I want to back up a little bit for a minute, because as we've been talking about, the social media era has definitely shaped where we are now with AI.
And you really saw some of the harms of that technology before a lot of other people.
So I'm really curious how you were able to do that.
Earlier in your career, you spent a lot of time reporting on Al-Qaeda in Southeast Asia.
I'm wondering what you learned through that, and how did that shape your thinking on the social media that would come later?
Indonesia has the world's largest Muslim population. I was there
when 9-11 happened, right? And my first book is really about how Al-Qaeda hijacked homegrown movements in Southeast Asia and co-opted them into this global jihad, right?
So I think what it showed me is the process of radicalization.
I went back in history to the Afghan war.
This is where they were radicalized.
They were in training camps there.
Then they moved back to their countries. It's like bringing a virus back. And I was talking about
viruses before COVID, right? Because this is a tipping point.
Before social media. You're talking about social networks before social media.
Yeah, because also social network analysis allows you to be able to see how ideas and emotions spread through groups of people.
What's fascinating, though, is that process of radicalization that turns a young Indonesian into a suicide bomber, well, it's now been unleashed into the political system, into the mainstream, right? So when you're talking about radicalization today, it's radicalization
in politics, where it has become a gladiator's battle to the death.
You signaled that too: it wasn't just the individual, the most extreme version of it, which is an individual youth, a young man being radicalized to become a suicide bomber. But you
also looked at political mobs and political violence. And you have this amazing quote about the emotions that drive political violence.
You say the force of the mob destroyed individual control, giving people the freedom to be their worst selves.
And like we're seeing that now too, right?
Globally, right?
I mean, certainly in the United States, January 6th.
I mean, coming from how people are radicalized, they radicalize
in groups, right? And one of the first things that happens is they cut out families, right? So
the group becomes your identity, right? So what we have unleashed in our information ecosystem is
this kind of radicalization that doesn't just polarize by design. It literally also encourages violence.
And I think that's where your experience having lived through those repeated moments of ideas catalyzing into violence,
that experience in history becomes so important because we haven't necessarily experienced that in the same way.
When I saw it happening online, I was afraid.
Like Joseph Conrad looking at the abyss, right?
Because you had seen the kernels of it through other mediums.
And when it starts, it is so hard to stop.
That's it, the momentum, right, of the mob.
And the other thing you lived through that time, though,
that I want to touch on is a transition of the medium of journalism.
So it struck me, reading through your sort of time at CNN, that this was when satellite phones and CNN's global reach were a phenomenon.
Like, we forget just the degree to which that changed our collective perception of each other in the world. I should say it changed how those of us in the West viewed certain events globally, which we didn't have access to in the same way before that.
But that transition from global cable news through to the internet and social media, you were on the front lines of.
How did that shape your understanding of what journalism is, its relationship to the medium?
I would say up until 2016, I loved technology.
One of the best parts of working for CNN at that time period
was we were one of the 12 test bureaus.
So any new tech, we would test.
And we were live from everywhere. But I lived through
that transition. When we had a reporter, my team would have two weeks to do a story. You can land,
you talk to people, you understand them, and you have two weeks and you come out and you have a lot
of stories, right? I went through all of that.
It's so interesting that we now, or for a time, derided parachute journalism. But it was two weeks. Now, what are we? Like, we're not even parachuting. We're just tapping into streams. I mean, that seems like a luxury.
Well, the other part now is that you don't know who you're getting the information from, right?
Like, and especially in the age when you can now create video, create audio, right? You can make it up. So I would say the
other part is that I lived through this time period, and I guess this is the reason why I
know gatekeepers are necessary. The gatekeepers are legally responsible for the public information
ecosystem. News organizations are funny things
because we have a set of standards and ethics, right?
So it's funny to hear tech companies now say,
we can self-regulate.
Well, standards and ethics are only the first part,
but we are also regulated by law.
And that is the biggest mistake, I think,
that democratic governments have made, because big tech has moved to a place where there are far more negatives than positives built into the design of these platforms.
But you did then have a chance to start your own media organization. And what strikes me about that moment: you return to the Philippines, you started a media organization, and you went all in on Facebook.
Oh, yeah.
Like, when you write about your enthusiasm, it's like, I would have worked for them.
Well, at the time-
So as you know, at the time I was studying the relationship between platforms and publishers, and sort of saying, like, you know, some of you guys might want to be a little careful. And then here you are, like, all in, like, emotional testing of content.
Why were you so enthusiastic about the promise of Facebook at that time?
After six years with ABS-CBN, which is the largest news organization,
it was the news organization that former President Duterte
took the franchise away from, gutted. Anyway,
after six years, I realized that the new power was really the internet and social media.
I jumped into it because I was hoping that the Philippines could use technology to jumpstart
development. Like I could imagine, you know, so this would have been,
when did we begin doing this?
2011, we began.
We built a website.
But if Facebook's search had been better at that point in time,
I may not have built a website.
Thank God it wasn't, right?
Because Facebook at that time, it wasn't making money yet, right?
But it was perfect for civil society.
It was perfect for journalism.
Particularly in a country that was the most online country in the world at the time, which is remarkable.
So one of the things that I had hoped was that I could imagine Facebook being used in a whole lot of different ways.
Most of my career had been spent on fighting corruption.
Right. And it's top down.
But what if, and this is something I tested with ABS-CBN, the largest network:
We began using citizen journalism and it was working in our elections. There are laws, but people violate the laws.
But when you have a photo of a mayor violating the law
and that's sent to the largest network in the Philippines,
it has an impact, right?
So I could see all of the, what participatory journalism,
that's what the academics would call it,
what participatory journalism would look like.
And then I realized
that we can do far more. But then what happened, right? I think it's just greed at the companies,
and it's not just Facebook, Twitter followed, YouTube followed. And look how quickly that
happened. When we started Rappler in 2012, I built an IP satellite OB van, a little mini studio.
So we were going live from different places. We spent, so an OB van for television is about
a million dollars. Our satellite van was $100,000 to build. So we did this, but look how fast Facebook Live came in
and YouTube went live.
Even that becomes obsolete almost instantly.
We rolled out our OB van in 2013,
the end of 2012.
And by 2014, Facebook Live was starting.
So it's two years.
I was hoping that we would have,
it's just the rollout of technology.
And it happened so fast.
Yeah.
You were at the front lines of it.
I mean, it was fun.
If it could have stayed just a little slower,
it would have been more positive, right?
I think the pace of technological change
is beyond our comprehension.
But some people comprehend it.
So you comprehend-
You mean the people who make money at it. Well, yes. So the people who are building it certainly comprehend the power,
although they may not understand the consequences. The impact of what they've built. You understood
it in terms of the power to do journalism differently, to mobilize civil society,
so on and so forth. Duterte understood it. And so that's what, so before we get to where we are now opened an office largely for sales in 2016,
right before April.
Our elections were in May, right?
Do you think they taught him how to use it?
Oh, I know they did.
Okay.
Right?
The same offer they were giving to Trump
and so on and so forth, right?
Exactly.
We will help you use our tools to speak to the public.
And I actually warned my friends inside, guys, do you really want to do this?
Like, are you a player in politics or are you communications?
Right. You don't have standards and ethics on this.
But, you know, this is where like the team from Facebook came to Manila and offered their services to the politicians who took it.
Duterte's team, right?
But I think before that, even, we're a social media country. By 2017, 97% of Filipinos on the
internet were on Facebook. And today, 100%. It's 100%, right? So they came in leading up to the
2016 elections. But the problem is, like, we saw this turning into a mob as we were leading up to it.
So by April of 2016, leading up to the May elections, one young student stood up in a discussion in a university with then-Mayor Duterte and asked him about the extrajudicial killings.
Almost immediately, a Facebook page was created demanding his death, the student's death.
He was doxxed, and his parents contacted us.
And I was like, oh, my God. We came out with an editorial: don't turn social media into a wasteland.
So, was Facebook complicit?
I mean, let's talk about Myanmar, right?
Where their own investigation, they sent a team there.
The UN sent a team to Myanmar, and they both said that this platform,
Facebook then, played a role in the genocide that happened in Myanmar.
Did it play a role? Yes, absolutely.
Did it play a role in the Philippines? Oh my God. In 2017, when I did get to talk to Mark Zuckerberg,
I was, I still believed. I did drink the Kool-Aid. Facebook could have been-
But you still were optimistic at that point, even despite having seen this-
The worst.
The worst possible use.
Yeah, you still seemed optimistic in 2017.
Because it's such power. I think part of the problem is that the tech, the engineers and the tech guys who are running this, don't know communications and news. They don't understand the cascading failures they put in motion when they decided to reward lies and hate.
Part of what happened after you sort of released these investigations, came out really powerfully talking about these harms, is we started focusing on the worst of the worst.
And I think we spent a lot of time doing that. Election interference, use of social media for political violence, and in some cases,
genocide. There were all these really discrete, bad things. And we spent a lot of time talking
about them. And they're real. Yeah. I wonder now, if you look at social media, where we are now,
like my feeling of it is that yes, those things are probably still present, and I think we should pay attention to them. But the overwhelming feeling is just that, like, our media ecosystem feels cruel, mean, divisive, unreliable. And these aren't individual bad people or hostile threats. This is like an ecosystem problem. So how do you characterize where social media is now?
I don't think so. First, the design of the tech companies, the platforms that connected each of us
turned our values upside down because the incentive is for the worst of humanity, right? If you lie, if you spread fear, anger, and hate,
you are rewarded.
That incentive structure is really critically important.
Russian disinformation. Yuri Andropov, former KGB chairman and former head of the USSR, said, "Disinformatsiya is like cocaine. You take it once or twice, you're okay, but if you take it all the time, you're a changed person." Right. The chemistry of your brain changes. Now imagine that we are all cocaine
addicts now. I mean, I hate to say this coming from a brutal drug war, right? But we are changed
as people. I feel it. Don't we all? When you turn on social media, your brain, you feel the response you're getting. Addictive, dopamine highs, right?
Anger, emotion.
Yes.
And what's the incentive here?
It's just for our attention, but we don't get anything back.
These are empty calories that are just making us fat.
Sorry, hateful, right?
It is what it is. Yes, I agree with everything that you said, because we lived it.
If you live on social media alone, and it has corrupted every part of our virtual world
of the internet, you will not believe in the goodness of human nature. That's horrific. You
know, I always use this example of like, there was a cartoon when I was growing up where, you know, you're the kid and you're trying to do the right thing. And there's
a devil on your left shoulder, an angel on your right. And the angel is telling you,
do the right thing, do the right thing. And then there's this devil that's telling you,
do it, do it, do it, do it, do it. You know, what social media did is it gagged the
angel and flicked it off your shoulder. And then it made the devil grow and gave it a direct line
into your nervous system. And that is the societies we've created. It isn't a coincidence that, you
know, by January last year, this is V-Dem's report, 72% of the world is now under authoritarian rule, right?
We are being changed in different ways. And I would love academics to study it, right? There
is the change in us as people. The US Surgeon General finally wrote his report in May 2023
about the impact on kids, but it's not just kids. It's all of us, right? But that's
the personal level.
The kids conversation is in so many ways a proxy for our own concerns about what we're doing to ourselves, right? We're placing it on kids, but it's a bigger phenomenon, clearly.
I don't know why we're not talking about humanity, which is the next level. So the next level, like groups: we started talking about groups, we behave differently in groups. So societally, we behave differently.
You're given permission to be part of a mob. You drop your inhibitions. I mean,
look at the gender disinformation that attacks women. People like leaders like Duterte or Trump
give you permission to be as misogynistic as you want to be.
Well, and that's changed our politics in clear ways.
Who speaks, how they speak, who engages, who doesn't.
I mean, it's changing the character of our politics in a real way.
And the chilling effect.
Absolutely, without question.
Women who are opting out, right?
Because the attacks against women are off the scale.
I think we've been pushed back decades.
And then finally, the last one where I wish there would be more academic studies is emergent human behavior.
Like biologically, what happens when you are constantly on a dopamine high, when the synapses of your brain, which are supposed to go straight, constantly turn right?
So there's that.
Like there's an evolutionary effect.
But beyond that, what does that mean for our species?
Because I think this tech, this thing we carry, the cell phone we carry around with us everywhere, it's transforming not just our systems of governance, the way we deal with each other, but us as a species.
Yeah.
And you don't stop.
Are you holding up okay through all of this?
I joke that I'm not going to sleep until the end of 2024. But, you know, yes, not so good. But, you know, I feel like, like in the Philippines when we came under attack by the government, when you look back a decade from now, this is what I told our team: we want to know that we did everything we could to meet the challenge of the moment.
This moment matters now for the world.
So you may have heard Ressa mention a company called Replika during our interview.
There's an AI startup called Replika, which is offering you, like, you will have a constant companion.
Like, if the first generation AI and social media weaponized our fear, anger, and hate, this one is going to weaponize our loneliness.
So for the next episode of Machines Like Us,
I spoke with the founder of that company,
Eugenia Kuyda,
to talk about AI companions
and why she thinks her tech
doesn't exploit loneliness.
It might actually solve it.
It's not realistic to expect people
to just abandon the tech
that they're so into right now.
So we have to solve it with tech. That, I think, is kind of my ultimate realization over these
years that we will have to build even more sophisticated tech that will bring us back
together.
Machines Like Us is produced by Paradigms in collaboration with The Globe and Mail.
The show is produced by Mitchell Stewart.
Our associate producer is Sequoia Kim.
Executive producers are Kathleen Goldhar and James Millward.
Our theme song is by Chris Kelly.
A special thanks to Matt Frehner and the team at The Globe and Mail.
If you liked the interview you just heard, please subscribe or leave a rating or a comment. It really helps us get the show to as many people as possible.