Your Undivided Attention - Facebook Goes '2Africa' — with Julie Owono
Episode Date: September 2, 2020
This summer, Facebook unveiled “2Africa,” a subsea cable project that will encircle nearly the entire continent of Africa — much to the surprise of Julie Owono. As Executive Director of Internet Without Borders, she’s seen how quickly projects like this can become enmeshed in local politics, as private companies dig through territorial waters, negotiate with local officials and gradually assume responsibility over vital pieces of national infrastructure. “It’s critical, now, that communities have a seat at the table,” Julie says. We ask her about the risks of tech companies leading us into an age of “digital colonialism,” and what she hopes to achieve as a newly appointed member of Facebook’s Oversight Board.
Transcript
I mean, we just learned out of nowhere that Facebook was about to launch a subsea cable.
That's Julie Owono, and she's laughing because that subsea cable, which seemed to come out of nowhere,
will actually be one of the longest cables in the world.
At 37,000 kilometers, it will encircle nearly the entire continent of Africa.
In fact, Facebook is calling the whole cable project 2Africa,
and all of this came as news to Julie, even though she's an expert on how the internet can reach and reshape nations in the global south.
It's mind-boggling to imagine that Facebook is going to launch very soon this subsea cable,
which is great.
I mean, I'm happy for them.
But what type of discussion did they have with governments?
What did they trade with them?
As Executive Director of Internet Without Borders,
Julie argues that expanding Internet access is not as simple as running a cable into a country.
It raises thorny questions around sovereignty.
Most of the infrastructure is located within territorial seas.
To build within the territorial sphere, you need
to ask the sovereign for authorization.
And once governments grant authorization, what might they ask for in return?
Did they ask you to have direct access to the infrastructure?
That's a big question.
And if they did so, what guarantee do you offer that human rights in general and rights of
the users will be respected?
These are not academic questions.
In 2019, governments in Africa shut down access to the internet on 25 separate occasions.
That's up 50% over the year before.
So this cable is not just a cable.
It's a vital piece of infrastructure, and like the railroads and streets of the continent's colonial past,
it's being built by a consortium of Western commercial interests, in partnership with local governments,
with almost no say from people on the ground.
We don't know what's happening within an infrastructure consortium.
We have really no idea.
We often talk in this podcast about how Facebook has become the new virtual infrastructure for running a society.
We live, we communicate, we develop our identities, we see each other through these private
technology platforms, but we haven't talked about how technology companies are colonizing the
physical infrastructure as well. And Julie warns that if this continues, we are sleepwalking into
an age of what she calls digital colonialism. So it's critical now that communities
have a seat at the table. I fear that if we don't seize that opportunity, the internet will definitely
become a tool of repression in places that desperately need freedoms and democracy. But unlike the
colonialism of the past, we can still reverse this trend by giving people like Julie a seat
at the table. And she just got a very big seat at Facebook's table.
As a member of the Oversight Board, I saw this as an opportunity to bring the attention of
the platforms on things that they pretend they don't see.
Today on the show, we talked to Julie Owono, executive director of Internet Without Borders
and newly appointed member of Facebook's Oversight Board, which has been likened to a Supreme
Court for content oversight decisions at the company.
The board hasn't actually met yet, and its first meeting will be in the fall of 2020.
The problem is so obvious. Everybody talks about the fact that there is disinformation,
there is hate speech in many places in the global south and particularly in Africa.
But there has been very little change on the part of the companies,
and particularly in this case, Facebook.
So I saw this as an opportunity to call their attention to the problems.
I'm Tristan Harris.
And I'm Aza Raskin.
And this is Your Undivided Attention.
I actually like to say that I am the product of the internet that I would love everybody to have access to.
When I started blogging on a platform, which is called Global Voices Online, it was back in 2010.
I was a bit frustrated that when we talked about cyber development in the Global South and particularly
in Africa, we focused a lot on English-speaking Africa, forgetting the rest of the continent.
So yes, I thought it was important for me to bring in that voice.
And basically, it changed my life.
From being a random immigrant in France, especially a Black, young woman, I suddenly realized
I could have access to platforms that could bring my voice to
people I thought I'd never reach and make sure that the issues that I think are important
are visible. That's how I started working on Internet Without Borders, with this aim of
how can we make sure that the next person like me also has access to that internet that helps
them to change basically their reality at the individual level, but also change the world for the
better. Could you sketch just briefly how you came to be working on the problems you work on?
like your background is fascinating. Cameroon, Moscow, Paris. You see the world from a very
different perspective than I certainly do, and I think most of our listeners. So I'd just love
to hear a little bit of that. Sure. So I'm leading an organization. The aim is to defend
freedom of expression online, among other human rights. We have been focusing a lot on the issues
of internet shutdown. So when governments decide to shut down access either to the whole internet
or to social media websites such as Facebook, Twitter,
but also messaging apps such as WhatsApp.
And what we saw was that initially when this trend began,
most governments, usually repressive ones, would say,
we need to shut down, for very diverse reasons, to be very honest.
But many of these governments understood that they could weaponize
the problems that the platforms have created,
and particularly problems around hate speech and disinformation,
to further justify that they need to censor.
So basically saying since Facebook, Twitter and all these other platforms are not doing anything
to deal with these problems in our country or in our region, Africa, in particular, because
that's a region we work a lot on.
Since these platforms are not doing anything about that, well, we have no other choice
than to censor and suppress access to Facebook, Twitter, and others.
So, yes, we've been working a lot on internet shutdowns and tying them increasingly
to the problems that the platforms have created.
Oftentimes with our work here at the Center for Human Technology,
we seem to be tackling two competing dystopias.
There's the Big Brother, 1984, shut it all down,
censorship dystopia, where we shut down things,
we shut down what you can and can't say.
Then there's this other dystopia,
what we call the Aldous Huxley problem of Brave New World,
where we give people so much information,
so much triviality, so much noise,
that they don't know what's true,
and we just, everyone gets caught in a loop of amusing themselves to death.
And there's these sort of two ends
of the spectrum. And what I find interesting is in the government shutdown scenario, it mixes the
purpose of the shutdown, which is, hey, there's so much noise. No one knows what's true. I got an idea.
Let's shut it down to shut down the Huxley dystopia. But it's actually enabling the Orwellian
dystopia because it's exactly during those shutdowns that extreme actions are taken by the
government or people don't know what's going on in terms of human rights abuses, things like
that. I would just love to maybe get a little bit deeper into that. Yeah, the issue of having
access to too much information is really an interesting entry point. The first thing is
a lot of repressive governments have been completely disrupted by just what happened with the
internet. They were not prepared. Increasingly, governments are using some of the problems created
through an unfettered access to information and the lack of regulation and moderation, as we've
just discussed, to justify that, yes, we need to go back to a time when we had only one source of information,
because that's more security, that's more stability, that's no violence, that's, you know,
and it increasingly speaks to people, honestly.
I got involved in doing the research that I'm doing currently on the link between increased
hate speech on social media platforms and weaponization by governments who further shut down the
internet.
I got interested into that because my work became very difficult, honestly, advocating against
internet shutdown.
But I remember in places I would go and say, hey, it's not good to shut down the internet.
And people would tell me, yeah, but we don't want hate speech.
So we'd rather have that instead of having hate speech, which I totally understand.
It's a point that should be heard too.
And that's how I started working on hate speech and how to help platforms see, or see better, hate in places they don't know.
It's really dangerous and we need to continue to work against that.
One of the core principles for designing technology that is humane,
one that we talk about, is that those who are closest to the pain should be closest to the power.
And with Silicon Valley's obsession with scaling, like blitz scaling,
it has never been easier for the person with the most power to be the furthest from the pain.
That is, we're designing our systems to be maximally inhumane because we're designing systems
that people need, that are unsafe, that they're then forced to use. I was sort of surprised
to learn this: Facebook and Google are building physical infrastructure to bring online, you know,
the 1.3 billion people in Africa who aren't online. What is their responsibility as they
do this? Because they can't argue that they're not on the ground to bring the infrastructure.
They are on the ground. So what have you seen and what's the responsibility?
That's the awesome question. They are totally responsible.
I mean, what I like to say is it's not possible that you want the profit, but you don't want
the political responsibility that comes with that.
That's impossible.
If tomorrow other people, other groups get killed in one of these countries, just like what
we saw in Myanmar two years ago, people are going to come at you, Facebook, whether you like
it or not, and you're going to be held accountable.
I mean, people will ask you to account.
But even if you don't want to be politically responsible, you will have to be, because
these governments, they don't want you to contribute to the beginning of a genocide. Nobody wants
that. So they're going to shut you down, definitely. That's the risk. And that's what they are
already doing. And while they are doing this, we also should be aware that in front of that,
there are other companies, either Chinese or Russian, and mostly Chinese companies, to be honest,
that come in there and that propose alternatives that are also interesting to these users.
The risk, of course, is how to make sure that Facebook and also Google don't interpret this responsibility as, we have to side with the oppressor because we have to make sure that our infrastructure is protected.
That's precisely where it's important to work with organizations there, not only digital rights organizations, because they're not plentiful, but traditional human rights organizations, consumer organizations and many women's organizations that
have been around for 50, 60 years, that know the country.
So work with them and make them aware of what's happening.
It's mind-boggling to imagine that Facebook is going to launch very soon this subsea cable.
And most of the infrastructure is located within territorial seas.
To build within the territorial sphere, you need to ask the sovereign for authorization.
What did they ask when you asked for that authorization?
Did they ask you to have direct access to the infrastructure?
That's a big question. And if they did so, what guarantee do you offer that human rights in general
and rights of the users will be respected? So to ensure that this guarantee is out there,
we have to make sure that communities have a seat at the table. What does the seat at the table
look like? Because I'm thinking about the civil rights audit that just happened for Facebook.
They had a seat at the table and Facebook just sort of shrugged. So in your best possible world,
like what does that actually look like, to have a seat at the table, to have that inform the product?
Yes. So let's start with the issue of infrastructure. Having a seat at the table when we talk about infrastructure is making sure that the consortium, which will basically manage access to the infrastructure for service providers in each country, has a seat for civil society organizations, or at least a human rights organization, and is more transparent, because they're not. We don't know what's happening within an infrastructure consortium. We have really no idea.
I did research about why Internet access was so expensive, like really expensive, seven to ten years ago
in Western Central Africa when in France, for instance, it became way cheaper.
I mean, it cost almost nothing to have access to the Internet.
And what we found out was that there are consortia that manage access to the infrastructure,
but for service providers, so usually telcos.
And on these consortia, you would usually have the companies and
organizations that put in the money for the cable, for the infrastructure. You would have government
representatives and you would have some other private sector representatives. And there was no report
on what they were doing, really no information. That explained why when you went from
Senegal to Gambia, the difference in the cost of access could be multiplied by 10, 20 for no
reason when they had access to the same, exactly the same infrastructure. So
we think it's the same thing with all this new infrastructure that is being built.
How to make sure that beyond the cost, because the cost is not the problem anymore,
but other issues, how to make sure that if a government wants to shut down internet,
there are certain procedures before that becomes even possible.
But to do that, you need transparency, which we don't have.
So a seat, definitely, within these consortia.
That would be the ideal scenario.
When it comes to having direct access to product in an ideal world, to have this connection between product teams at companies and grassroots organizations, well, again, there's the issue of transparency.
We have been working with companies on this problem of hate speech.
When I say working with them, I mean trying to alert them.
We never know what happens to our reports.
I have no idea.
We just know we report it.
That's right.
On the other hand, we don't really know whether or not what we're doing is, you know, efficient.
We think it is because we do see some differences, but we certainly don't have the same means of
a measurement that companies would have. They would know better whether or not there has been
an increase or a decrease in hateful discourses on platforms. And for now, they're not willing to
give up on these issues of data. Because honestly, privacy is not an argument, I'm sorry, especially
when we talk about potential genocides. I'm not even exaggerating. I'm scared even to use that
word, but that's true. We're running, in general, this grand psychological experiment on what happens
when you plug three billion people into an automated attention information sorting system that
just says what gets the most clicks. And no one's ever run that experiment before. And it's kind of
an unsafe experiment, especially when I think you enter into countries where, you know, not only
you're designing for the assumptions of, you know, what it looks like to go to work and speak with
people in San Francisco, California, or Silicon Valley, but in Africa, I know there's
something like 1,500 to 2,000 different African languages,
and you only have the capacity as a company to do, let's say,
content moderation in a handful of languages.
I believe Facebook only has something like 20 major languages
or something like that that they do fact-checking for.
And so if I'm Russia or if I'm Cambridge Analytica
and I want to go into your country,
and I can just say, well, let me go into a country
where I know Facebook doesn't have the fact-checkers in those languages.
Now I can sow misinformation in exactly the known blind spots where I know the companies don't
have the resources to do the safety check.
So I'm just curious how you think about this, because I know in your work you've talked
about digital colonialism, and I'm just curious how you think about those things.
Yes, you mentioned the case of Russia.
So increasingly, Russia is using the African continent as a proxy to target voters in the
United States. And they're doing that by exploiting resentment with regards to the history and
particularly the history of colonialism and imperialism, especially in Africa. And in that history,
Russia at the time sided with some of the independent fighters against former colonial powers.
The Russians have really understood that this resentment happens to mirror the resentment felt
here in the U.S. or in Europe against racism, institutional racism.
That's why it's important to have a very acute knowledge of these dynamics.
It's important to know what groups are being politically weaponized against each other
because that's what's going to be used later on,
not only against the populations in that particular country, in that particular continent,
but it plays out between nations that are increasingly antagonists.
So on the one hand, Western nations, and on the other hand, Eastern ones, you know,
who want to play a role in what's happening on the African continent, on the destiny of the continent.
But what we have worked to do, and particularly I have worked to do a lot,
is explaining the intersections of all the problems and saying,
maybe have a look at what's happening, I don't know, in South Asia right now,
and probably you'll have an idea of what may happen in a few years to come in Europe as well.
I have a very good, in the sense that it's very illustrative, example of what happened
in Libya. So in 2011, when there was the Libyan revolution, a French company worked with the
Gaddafi regime at the time. They sold deep packet inspection technology, which allowed the
regime to basically map and arrest all the opponents. And that same company was asked by the
French government to create a huge database of all information about French people. So before you had
different databases for different services. But after the terror attacks in France in 2015,
the government decided that they need to, well, centralize everything. And they said it would be
easier to access information, all information about individuals. And the company that was
consulted to do that was precisely that company that helped Gaddafi arrest protesters in Libya
a few years back. So, yeah, for us, it's important to think about all this. Well, there is an expression now, the rest of the world: the rest of the world is definitely a testing ground of what is going to come, obviously in a few years, here in the US or in Europe or in more developed places. So we think it's important to pay attention and, yeah, be ready.
Do you have any examples for listeners of those nuances of hate speech that might be different across the thousands of languages on the African continent?
Sure. My team and I here have been tracking and mapping what hate speech looks like in five different
countries in Western Central Africa. And what we have seen is that the dynamics are almost more
or less the same. So, a big political event or fracture. The event could be a very disputed election
or a political party that has partisans from one region of the country versus another one that has
more partisans from another part of the country, and that party is usually the one that is
ruling and ruling with a heavy fist. So that's one of the dynamics that we've identified,
which is common. Also, the issue of gender, we barely mention that, but the first way to
identify ethnic hate speech, I would say, is to look through sexist speech. Phrases such as,
oh, women from this group are prostitutes, so you should never marry them,
or women from that other group, they like to steal your money.
I'm just giving random examples.
The names will change, the country will change,
but at least the similarities, you know, guide you basically
on what type of speech and information you might look for.
We are currently working on this project of building a public database
of what hate speech looks like in some of the countries that we work on.
We think it should be public because it will inform not only the main platform,
but also others, because we don't talk about TikTok, but TikTok is highly problematic,
especially in these countries.
I was chatting with a friend, who is based in Abidjan, in Côte d'Ivoire,
and she was asking me, how come many of the videos that I come across on my TikTok
are related to people getting lynched or, you know, very violent videos?
We think that public databases will force platforms to make their current hate speech detection better.
What we also are trying to push is this idea that tech companies need to rely, and they need to
accept that they need to rely, on expertise outside of the company that knows way more
than any expert out there in Silicon Valley will know. This really struck me when I was
recently in Palo Alto to meet people from the product team of a big company. They do have some
internal people who are more or less aware, working on elections in Africa.
So they're more or less aware of the context.
I shouldn't say more or less.
They have PhDs, so they are very aware of that.
But having a PhD is not like, you know, being a journalist in, I don't know,
Kigali or Bujumbura or wherever.
You have a different perspective that's certainly valuable out there.
And when you actually work with local experts, as we call them for now,
you also empower them and make them agents of change. You know, if they understand that what they're doing is important to make sure that the platform remains healthy, well, they will inform others of what's happening.
They have newspapers. They have organizations. So it's important, basically, to step out and go and work with them, not just speak to them, because all these companies, they like to talk about all their engagement. We know that.
But in addition to speaking with them, work with them, and trust their expertise.
That's what we're telling them.
One of the issues here is that when you go into a country that might have hundreds of languages
or something like that, there aren't hundreds of newspapers necessarily or hundreds of
institutions that represent all of those different constituencies, tribes, representatives,
histories, et cetera.
But then you have this issue of Facebook's Free Basics, where they're actually building
the infrastructure.
So there's actually no way for organic local-language
competitors to compete with that infrastructure that Facebook's providing, because they've got
asymmetric resources, asymmetric power, asymmetric capacity to lobby the government.
Boom.
They're planting all the infrastructure.
Second point is on how much content is available in all those languages.
So now let's imagine Facebook goes in and they're allowing these 200 languages to, you know,
speak, right?
Well, now the 80 to 90 percent of that language's speech is now best represented by Facebook
because where else are people publishing this stuff?
there aren't, again, those, you know, 200 newspapers for all those 200 languages.
So now Facebook is actually the primary place where all that language is getting voice,
getting amplification, there's no one who can counter speak, who can say that was a rumor,
that was a conspiracy theory, that lynching thing, that didn't really happen, that video you saw,
that was a deepfake.
And so one thing I find interesting is, like, almost going back to the kind of Colin Powell Pottery Barn rule,
when he said to George W. Bush, you know, if you break it, you buy it.
And he was saying this with regard to going into Iraq.
If you go into Iraq and you go in because you want to bring liberty and freedom to the country,
but you broke everything, then guess what?
It's your responsibility.
But more so, what's so interesting to me is, and I'm sorry to pick on Zuckerberg here,
but he says, you know, look, I shouldn't be the one responsible.
Don't ask me to set a policy for all these people, because you think I know what that local tribe's language or culture is, or whatever.
But he's created a situation where 80 to
90% of that language's representation is actually happening on his platform.
He's displaced the competitors who could counter-speak against anything that's being said.
So now it is his responsibility.
So this is, I see this almost like Iraq times 1,000, because you're going into hundreds of
countries and into all these different tribes and civic conflicts, except you have no capacity
now.
But you can't say that it's not your responsibility.
And, you know, we're in this predicament where this is just the reality that we now live in.
But what are we going to do about it?
Because we all don't want this to happen.
And it seems like we have these two routes.
We either sort of shut it down completely, which is the direction,
increasingly, you're saying, even citizens are saying we should go, because they can't deal with
the amount of stuff that's on there that's false, that's just creating conflict.
But then that, just like you're saying, favors the oppressors.
How do we get out of this, Julie?
And especially now speaking to, you know, obviously Facebook starting up this content oversight board
with the Supreme Court for content and trying to deal with these issues, do you want to speak
about, you know, how both with your role there and more broadly speaking, how do you see us
finding a way out of some of these problems?
So speaking as an activist who has been working on these issues, in these places in Africa, for 10 years,
I should say that the only way forward is having more groups
who demand accountability from Facebook.
We have seen that it works.
It probably takes a bit of time, but it works.
I remember in India, you were mentioning Free Basics.
Free Basics is not in India,
despite the fact that there are still millions, hundreds of millions of people that need to get online
and that are poor. So there was an outcry and Facebook went out. But they went out and just came
to Africa where they are now, I think, in more than 30 countries out of 54. And nobody asked them
for anything. No question. So there is really, and I insist again on that, the need to, I don't want
to use "build capacities" because that's vocabulary from the development sector,
and I have a lot of criticism of the development sector. It's important to
have groups that will be able to see things critically, not only see the good. Because, of course,
if Facebook tells you they want to help connect people, connectivity is great. We're using it now,
so it's great. But what comes with that? That's what people should be educated to question, always.
And we should have more groups doing that everywhere in the world, and particularly in Africa and
the Global South in general, to hold the company accountable, to ring the alarm when they don't
deliver on democratic principles and, you know, freedom principles. And, yeah, hold them accountable for
the responsibility, whether they like it or not. I mean, now as a member of the Oversight Board,
that's precisely why I chose to join. I saw this as an opportunity to bring the attention
of the platforms on things that they pretend they don't see.
I say pretend because the problem is so obvious.
Everybody talks about the fact that there is disinformation,
there is hate speech in many places in the global south
and particularly in Africa.
But there has been very little change on the part of the companies
and particularly in this case, Facebook.
So I saw this as an opportunity to call their attention to the problems,
telling them that, yes, there are lots of issues
with Russian interference in the U.S. elections,
which I think is one of the main reasons why the Oversight Board even exists.
That's great.
But look, there are also lots of problems in X, Y, Z place,
and here's why we think your community standards are wrong.
They don't encompass the complexity of the issue that you're trying to deal with,
as they are written now.
And on top of that, they're not compliant with, you know,
international human rights law that protects the freedom of expression,
which you say you want to protect.
So, yeah, there's probably a better way to do this.
The cynic in me, when I saw the oversight board sort of being announced,
it was like, oh, this is another sort of impact-washing kind of move
because one of the themes of this conversation, I think, has been,
one size does not fit all.
Like if you're going to be around the world,
you really have to have solutions that are bespoke to the people
and the context and the history and the language they're going into.
And then the oversight board just structurally is a small thing far removed from all of those
many places. And what I think I'm hearing you say is, yes, that's true. And the reason why
you're taking the position is to raise visibility about the specific problems that you care about
rather than thinking of the oversight board as the solution. Am I understanding right?
That's exactly it. I really don't think, honestly, humanly, it's possible. We're 20 at the moment
and will be 40 when things are completely ready.
It's impossible.
40 people can't, and first of all,
we're not Facebook moderators.
We're not here to do moderation in bulk.
That's not interesting for us.
But rather what's interesting is out of one case,
so starting from one particular case,
trying to identify the issues that are at play
with regards to freedom of expression and safety
and many other important values and rights
and the interpretation we can give to that
that will help and guide further policy updates by the company
and will change the way the company sees this specific issue
that we dealt with with this case.
So it's a complement to many other things.
I mean, it's complementary.
But what I did like also about this particular initiative
is that it did bring in
the idea that platforms are not arbitrary powers, that they need to be held accountable.
So for now, in the U.S., the law won't do that for various reasons related to some legal immunities
and also other places in the world won't do that.
And honestly, I don't even know whether or not it should be good that, you know, governments
hold companies accountable because that can also be very wrong, at least in the way they
understand it for now. So it's a way to bring in a bit of checks and balances to that whole
arbitrary, discretionary, we decide what free expression is thing. I really hope it's not going to be,
and I really hope we're also going to be checked and balanced. We need to make sure that there's
always someone who can call you out when you're going in the wrong direction. I think that's
important to avoid arbitrary and discretionary decisions that are not grounded in reality
and in an interest for human rights.
I wish people understood more of the examples of just how bad some of the, you know,
the situations are because, you know, it's like when you double-click on these things
and you just get a sense of how, you know, we have 20 people on the Facebook oversight board.
We've got 3 billion Facebook users and 100 billion posts moving through the system every
single day.
More than 100 billion, according to Nick Clegg.
We've got, what, AI that's going to perfectly detect, in 2,000 languages it's never
been trained on, the right way to sort and handle that speech,
when we know that the default bias, if you zoom your eyes out and blur your eyes,
is bullies win, hate speech wins.
I think it's interesting to think about TikTok here too,
because the fact that you can just see video after video
of lynching of that other minority group that you hate,
it's like having automated machines run the information ecology
that 3 billion people depend on
to make decisions and understand whether they should feel at peace
or angry about everything constantly.
We're left with words like content moderation and algorithms
which don't speak to, this is how people wake up
and then feel either at peace or angry today.
You know, we're so far into this
that I don't know what to do now,
but I know that I'm grateful that you came on
for the podcast and that we could talk about it
for a little while.
And I just hope that this gets people more interested
and hopefully more understanding
of the hundreds and hundreds of countries
in which this is happening
and we don't know what the results are going to be
until the consequences unfold, you know,
because so many of the dominoes have been set in motion.
And the Oversight Board, content moderation, these are all frames that are retroactive, that are defensive, that are waiting for something bad to happen, and then trying to block it.
And it sort of ignores that the structure of Facebook, the structure of TikTok, the structure of Twitter is to reward us when we say things that are hateful, that get lots of reactions.
And so it's like we're getting injected with this sort of bad behavior.
It's the hate virus.
Yeah, yeah, exactly.
And I'm curious where you see hope or if you see hope for that kind of structural change.
I do see a lot of hope.
First of all, it's not because I'm on the podcast, but I really love what you guys do.
Honestly, I really like that more and more people who used to work in those companies
are speaking out against what they've seen.
And I'm also very hopeful that, I mean, what still gives me hope is, again, this possibility that exists out there and that we haven't harnessed yet as much as we should of working more together with people in different groups.
And that works as well for companies.
There will be continued dialogue and collaborations, cross-border collaboration, transnational collaboration between groups in the U.S., in Europe, and also in Africa
and other places in the global south that really do need it, critically need it. There is
a lot of room for that. So yeah. Awesome. Well, Julie, it's been such a pleasure having you
on Your Undivided Attention, and a privilege to get to meet you. Thank you. I'm a great fan
and you guys give us hope. Thanks for that. It's mutual. Great to meet you.
Your Undivided Attention is produced by the Center for Humane Technology.
Our executive producer is Dan Kedmi and our associate producer is Natalie Jones.
Noor Al-Samarrai helped with fact-checking,
original music and sound design by Ryan and Hays Holladay.
And a special thanks to the whole Center for Humane Technology team
for making this podcast possible.
A very special thanks to the generous lead supporters of our work at the Center for Humane Technology,
including the Omidyar Network, the Gerald Schwartz and Heather Reisman Foundation,
the Patrick J. McGovern Foundation,
Evolve Foundation, Craig Newmark Philanthropies, and Knight Foundation, among many others.
Huge thanks from all of us.