TED Talks Daily - The arrest of Telegram CEO Pavel Durov — and why you should care | Eli Pariser
Episode Date: September 12, 2024
Online democracy advocate Eli Pariser explains the details surrounding the August 2024 arrest of Telegram CEO Pavel Durov by French authorities — and what it means for the future of tech oversight and free speech. (Recorded live on Wednesday, September 4, 2024)
Transcript
TED Audio Collective.
You're listening to TED Talks Daily,
where we bring you new ideas to spark your curiosity every day.
I'm your host, Elise Hu.
Today, the latest episode of TED Explains,
where we take the biggest headlines of the moment
and offer clarity around what it all means,
context on why it all matters.
In this live conversation recorded on September 4th, 2024,
online democracy advocate Eli Pariser
explains the details surrounding the arrest
of Telegram CEO Pavel Durov.
Eli sits down with TED's Whitney Pennington Rodgers
to discuss what it means for the future of tech oversight,
free speech, and more.
All coming up after the break.
Support for this show comes from Airbnb.
If you know me,
you know I love staying in Airbnbs when I travel.
They make my family feel most at home
when we're away from home.
As we settled down at our Airbnb
during a recent vacation to Palm Springs,
I pictured my own home sitting empty.
Wouldn't it be smart and better put to use
welcoming a family like mine by hosting it on Airbnb?
It feels like the practical thing to do,
and with the extra income,
I could save up for renovations
to make the space even more inviting
for ourselves and for future guests.
Your home might be worth more than you
think. Find out how much at airbnb.ca slash host.
AI keeping you up at night? Wondering what it means for your business?
Don't miss the latest season of Disruptors, the podcast that takes a closer look at the innovations reshaping our economy.
Join RBC's John Stackhouse and Sonia Sennik from Creative Destruction Lab
as they ask bold questions like,
why is Canada lagging in AI adoption, and how can it catch up?
Don't get left behind.
Listen to Disruptors, the innovation era,
and stay ahead of the game in this fast-changing world.
Follow Disruptors on Apple Podcasts, Spotify, or your favorite podcast platform.
I want to tell you about a podcast I love called Search Engine, hosted by PJ Vogt.
Each week, he and his team answer these perfect questions,
the kind of questions that, when you ask them at a dinner party,
completely derail conversation.
Questions about business, tech,
and society, like is everyone pretending to understand inflation? Why don't we have flying
cars yet? And what does it feel like to believe in God? If you find this world bewildering,
but also sometimes enjoy being bewildered by it, check out Search Engine with PJ Vogt,
available now wherever you get your podcasts. And now our TED Talk of the day.
Hello, and thank you for tuning in to TED Explains, where we take the biggest headlines
of the moment and offer clarity around what it all means and context on why it matters,
especially for you. I'm Whitney Pennington Rodgers, and I'm your host for today's conversation.
Now, a week and a half ago, Pavel Durov, the founder and CEO of instant messaging app Telegram, was arrested in France as part of an investigation into the moderation of illegal activity on the platform.
Durov's arrest has raised questions around free speech, tech regulation, and the role of government in all of this.
I'm joined today by author and online democracy advocate Eli Pariser. He's the co-director of New_ Public, a nonprofit R&D lab for better digital spaces,
and his two TED Talks on social good and the internet have been viewed collectively more
than 8 million times. Hi, Eli. Hey, Whitney. Hi, thank you so much for being with us today.
Glad to be here. Well, let's just dive right in. I think first for the uninitiated, what is Telegram? So Telegram, it's interesting.
It's a kind of a riddle.
It is part messaging app, like Messenger or Signal or WhatsApp.
And it's part social media.
Like if you imagine kind of a mashup of Discord or Slack
and a messaging app, that's what Telegram is.
And the other kind of weird thing about it is that part of its claim is that it's this freewheeling space, you know, that it doesn't respond to government requests and that you can do whatever you want there. But actually, most of what goes on in Telegram is not encrypted, which means that governments around the world
can see into Telegram and all of the things that are going on there.
And what does that mean for the audience that you see Telegram has attracted?
Is the target audience a little bit different than some of these competitor apps?
Yeah, I mean, it's fascinating, and it's big.
At this point, they claim up to a billion global users. But, you know, it's this mix: the unfiltered reality and spin of what's happening in the war in Ukraine, from the Russian side and from the Ukrainian side; same thing in Gaza and Israel, where you get kind of
the unfiltered and also fabricated narratives from both sides there. It also,
you know, became big in crypto circles and alt-right circles. And in addition, it's become
kind of like a great place on the internet to talk about
and sometimes do crime.
And so, um, if you're looking, there's a great piece from Will Oremus at the Washington
Post recently, where the best way to figure out sort of the illicit ads on Facebook and
on Instagram is just to see if they list a Telegram group in the ad, because that's a good
tell that something suspicious is going on. So it's become this kind of wild
transnational place where people are, are in some cases organizing and pushing back against
authoritarian governments. In other cases, you know, doing some of the worst things on the internet.
Well, and that brings us to the charges that Pavel Durov received when he was arrested. So
he was arrested for allowing criminal activity on his platform for things like
child sexual abuse or sale of illicit drugs. Can you break down the charges and
what actually happened here? What led to his arrest?
So it's worth saying, Whitney, like there's a lot that we don't know yet about exactly what the
case is that the French government is pursuing. And I guess the way that this works in French law,
you don't sort of have to specify all of this at the outset
when you arrest someone.
You sort of say, like, these are the things that we're looking into.
So there's a range of things that were mentioned,
including complicity in storing child sexual abuse material, the most kind of serious crime,
and also kind of a failure to respond to law enforcement requests,
which is kind of part of the Telegram legend, that, you know, they are uninterested in being responsive to governments.
But in this case, it sounds like perhaps the French government made a request as part of an
investigation, perhaps around child sexual abuse material, and Telegram didn't respond or didn't
respond satisfactorily. And so when Pavel Durov, who is a French citizen, you know, came back onto French soil, he was arrested.
It's worth saying that there's one other charge that's part of this, which is a charge about sort of not registering Telegram as an encrypted service in France. And we don't totally know what that looks like,
but the people that I listened to and talked to in the space see that
as kind of different and potentially
more worrisome than some of these other charges from a free speech standpoint,
because there's a real question of when and how governments should be able to
crack down on the possibility of encrypted communication.
And we can get more into that as well.
I mean, this is not, there's sort of precedent for this.
This is not the first time that these sorts of challenges have arisen for Telegram.
I know in 2022, Germany fined Telegram for failure to comply with law there.
And last year, Brazil temporarily suspended the platform for failure to surrender data connected to what they saw as neo-Nazi activity.
So I guess what precedent do we already see here for Telegram receiving pushback from governments
or sort of having this level of oversight in other countries?
Yeah, well, I think just to zoom out,
I mean, there is precedent for governments pushing back on Telegram.
There is not precedent for the head of a tech company
that's a global tech company being personally arrested
and for a government trying to hold
them accountable. And that raises a whole new set of questions. But just to zoom out for a second,
I think, you know, it's interesting, this is all happening at the same time that Brazil has banned
X. It feels like we're at this moment where there is this real grappling around one of the most
central and hardest questions of the internet era,
which is basically like, who should be able to control our digital spaces? And is that,
should they be under the sole control of some tech dude? Should they be subject to national laws?
And if so, how should that work when they're actually kind of
international conversations? Should they be in the control of the communities that gather on them?
And so I think we're just, we're at this really interesting moment where all of that is kind of
coming to the fore. And the French government seems to be saying, perhaps is saying, once we know what they're saying, you know, hey, you can't run a company on French soil and ignore our law enforcement people.
That's not OK. But of course there are free speech overtones here, because any time a government seeks to hold accountable or start to place legal
constraints around a tech firm that's facilitating speech, you really have to wonder about and worry
about the free speech connotations. So all of that is kind of like in the mix here.
And to the point of free speech, I think speaking about Durov himself,
there's sort of an interesting thing happening where you have the organization Telegram,
then you have Durov, who has a history sort of in this space. He refers to himself as a free speech
absolutist. He, in addition to Telegram, founded what was at one point and may still be the largest
social media platform in Russia
and came under scrutiny by the Russian government for some of the content that he allowed to be
shared on that space. So how much of this arrest do you think is about
Durov and how much of this is about Telegram as an organization?
I mean, I think they're very linked.
And I would also say this is one of the places where one of the challenges we have right
now is that Telegram is a very opaque organization.
We don't know a ton about it.
We don't know a ton about the ownership structure or its incorporation or a lot of the things that you kind of want to know about an institution that was becoming important infrastructure for millions or billions of people around the globe.
And so there's a way in which all of these tech companies, you know, are sort of personalized and closely associated with their founders.
Mark Zuckerberg, legendarily, you know, has this super voting status at Meta and so has
almost unilateral power over the way that people communicate in billions of conversations
around the globe.
You know, Pavel Durov and Telegram similarly are kind of like closely linked.
And we don't really get to know how closely linked they are because we don't have a lot of the information about Telegram that we might have about other companies.
And so, you know, I think there's there's it has to be both.
But I would say, Whitney, like there is this conversation about free speech, and that's a really critical and important part of the conversation here.
But it's not the only part of the conversation. And the metaphor, maybe I'm stretching this a little bit, but you know, it's like if you're a newspaper, but your storage room is being used by people to sell
counterfeit items or whatever. And the government says, like, hey, we're not into this use of your
storage room for illegal activity. Like, that's not a free speech issue. That's like a crime and
government issue. And if you don't let the government in, then you might reasonably get arrested at some point. And so I think one of
the questions that's swirling around in all of this is, you know, how much is it a question of
free speech? And how much is it a question of a democratically elected government getting to say
what is and isn't acceptable behavior by companies
that are operating on its turf. I mean, I think about the nuance there around this, you know,
both thinking about Telegram and how they operate, but then also, you know, sort of what seems
to matter to Durov around this idea of free speech and creating a space where people can share information freely without much moderation from the app itself,
from the company itself. I guess, what are some more of the nuances around how information is
shared and people are able to communicate freely on the Telegram app that are really
important for people in understanding why this matters for us, whether you use Telegram or not, why this moment is
really important?
Yeah.
So I guess just to pull this apart a bit, I mean, number one, when we talk about free
speech, we're using language that was developed, you know, and a set of ideas that was
developed at a time where some of these dynamics were just totally unimaginable. Like, you know,
when the writers of the U.S. Constitution were writing it, the idea that someone anywhere in the world could speak to everyone in the United States, or vice versa,
in any kind of like large-scale way, certainly in any kind of broadcast way,
you know, that just would have been nearly unfathomable. And so, you know, some of the
paradigms that we have about how free speech should work and democracy should work are really just challenged by the scale and the speed and the freedom of the internet. And one of the dynamics we have here is that, on the one hand, people being able to talk to each other in a way that the government can't see into and control, encrypted speech, is really important for a democracy.
And, you know, encryption used to just be like walls.
Like I would meet you in a place and have a conversation with you and no government could see into our house.
And that was really important for people to be able to organize and fight power and fight
bad governments.
And so we want to be able to hold on to that ability for citizens to organize and to coordinate
with each other, even though we know that in some cases when people meet and have conversations, they may be talking about crime.
On the other hand, the ability to do large-scale crime anonymously and across national boundaries
is reasonably concerning. And when you talk about child sexual abuse material,
if you're posting a picture of sexual abuse
in a large group chat that is populated
by anonymous individuals from places all over the world,
that is deeply, deeply harmful.
And it's something that governments have a reasonable
reason to want to control. And if you're facilitating that to happen, I think you can
expect that you're going to get a knock on your door from governments that don't want their
children to be abused. I think that's a reasonable position for almost anyone. And so, you know,
I think the challenge here is we have an app that's doing a whole bunch of these things at
once. It's offering some encrypted communication channels, although way less than I think
its boosters like to suggest.
Most of Telegram is not encrypted, and you have to enable it in order for it to be encrypted,
unlike an app like Signal.
But you're doing some sort of really important free speech, classic free speech work where
you're enabling activism and enabling people speaking truth to power.
And you're enabling a bunch of these, you know, sort of really pernicious crimes and communities.
And the question is, you know, sort of what should we do about that? And I guess, you know,
where I gravitate myself in this is that it is important for, you know, people to have a voice
in how tech platforms operate. And the way that we do that is that we form democratic governments
and we create laws. And so in places where democratically elected governments are saying, like, hey, we don't want this kind of
behavior on our platforms, I think that, you know, my impulse is to say,
that's a really important thing for these tech companies to listen to. And there should be
consequences if they're not listening to the voice of the people, because who else ultimately
should you be accountable to? Well, and it seems like at the moment, each government addresses this sort of individually for their country.
And there is no broader oversight here for how this will affect everyone.
And as you mentioned, you know, Telegram is you're not just communicating with people just in your nation, but you can communicate with people all over the world. And so what do you think is the right approach here for how
you think about regulating this? And whether or not government should
have a say in how this is managed is, I think, a big question for a lot of people. How do we do this broadly for folks around the world
and not just look at this nation by nation, case by case?
So I guess I would say, so one of my heroes
is Elinor Ostrom.
And she was this economist who won the Nobel Prize studying
how commons are managed.
And she really pushed back against this idea that there was this tragedy
of the commons, that human beings were bad at managing commons.
She pointed to all these cases where people are actually really good at managing commons
and identified a bunch of key rules about sort of what makes it possible to manage a common area well. And I'm raising all
of this because in some ways these are, you know, information commons that we're talking about.
And so one of the things that Ostrom observed is that it's really important to kind of have this
quality of subsidiarity where the right
decisions are being made at the right level and that people have some control at a local level
of the things that most affect them. And I think that's so wise about what works in human nature
and so contrary to how we've structured our digital systems in general.
You know, you asked a question about sort of how would we govern this at a global level.
And I guess my feeling is, you know, and Ostrom said this so beautifully,
she always talked about sort of no panaceas. There's no one size fits all solution to a bunch of these problems.
You have to think in terms of structures that allow people a level
of local control. But we built this digital ecosystem, including Telegram, including Meta,
including a lot of these tools that are essentially one piece of software, one algorithm is supposed to serve, you know, 2 billion, 3 billion people,
whatever it is, around the globe. And I think that's impossible. Like, I think we're reaching
the acknowledgement that you can't do it that way. And you certainly you don't want to do it
that way. Because whether it's Mark Zuckerberg, or it's Pavel Durov, or it's Elon Musk, you know, relying on one centralized person or even one centralized entity to work through all of the nuances of how human beings should communicate in millions and millions of different places around the world, you know, is not going to work.
And so what's exciting to me at this moment is that we're also starting to see real explorations of what it would look like to do this differently.
And that means kind of allowing people ways to govern themselves at a more local level online and to take on some of these responsibilities at the right level.
Now, it's worth saying that just like with a federal government, there are some things that probably need to happen in a more centralized way. And probably, you know, there are pieces of getting rid of child
sexual abuse material online that need to happen in that way, right? If I'm starting,
it's like being a small state and you don't want to have your own army. You want the protection
of the national army. So there are some things where you really need that kind of centralized,
federalized infrastructure.
But there are a lot of these decisions that should be pushed to the edges.
And I think we're seeing the birth of a new set of communication tools
and social media tools that are investigating what that really looks like to do.
And now back to the episode.
And what are some examples, if you would share, of
these sorts of tools that are approaching this in what you would think is a better way? Yeah. So, you know, we're starting to see this with first Mastodon and then Bluesky
and some of these other sort of things that initially look like a Twitter clone,
but that are actually structured really differently so that you can govern the communities, the rules and dynamics on them at a more localized level.
It's worth saying that these kinds of decentralized or federated networks pose a whole
bunch of interesting new questions about where legal liability lies. But they do, because they work more like email,
where there's a kind of open protocol
that allows different entities to communicate with each other
rather than a kind of single closed piece of software or application.
You know, they offer much more of an opportunity
for people to kind of experiment with different ways of operating.
I think one thing that's really compelling that you've shared is that you have these platforms
that enable you to, you know, create space where you can activate your community around issues that
really matter, you know, affecting things like human rights and other issues that are just
critically important, but also allow for the proliferation of things that maybe are not so great.
So are there solutions in some of these platforms to how you can allow for one thing to happen and not the other?
Or does that still seem to be a challenge that just really remains insurmountable?
So I think both, like just to be clear, this is so, so hard. These questions
are really, really hard. And the question of, you know, content moderation in general is sort of
like a, like, root-level human culture question of what is acceptable and what's not acceptable, and does this word mean this or that. These are
really, really challenging things to get right, which is one of the reasons that I think
we need ways of doing this that are much more kind of flexible and less, you know, centrally controllable.
But I think, in terms of ActivityPub, which is the protocol that
Mastodon runs on, there's a really interesting and exciting set of efforts to build some
of these kinds of infrastructures that would support allowing someone who sets up their own server to make sure that that kind of stuff is filtered out
and they don't accidentally become a hub for something really damaging or harmful or pernicious.
So I think we are starting to see that. And Bluesky likewise has this really interesting
idea around different kinds of moderation services where, you know, imagine that you can kind of apply these filters or these layers
to a set of conversations so that you can decide, you know, maybe you want a place where
people are not using slurs or not using profanity.
Okay, apply that layer optionally at the level of an individual or the level of a community, but you don't have
to apply it at the level of the company. And I think, you know, it sounds like progress,
and that seems that there's an opportunity there for us to really see some growth in this space,
which is exciting. And going back to the charges against Durov from France and thinking about government's role in all of this.
So we're clearly still not in a place yet where we really understand how government should be involved here.
And I imagine this is obviously a space where there's a lot of disagreement.
And since his arrest in France, we've seen South Korea has filed similar charges against him.
So I'm curious if you think that this is a moment where we'll see more countries come out with even more charges.
Will there sort of be an avalanche of charges against him coming from all over the world?
And what do you think will be the long-term impact of that?
So I think it's, you know, again, it's really hard to tell because we don't have a bunch of
the critical details. And maybe we'll get them and, you know, the whole thing will feel like
an oppressive attempt by the French government to stifle a free speech advocate, or maybe we'll get them and we'll go
like, yeah, this dude should have replied to the subpoena that came from law enforcement trying to
solve a really important crime. So I think, you know, and maybe it'll be a mix of both. And this
is part of what's so challenging in these issues is, you know, when you run a platform that messily
combines all of these different attributes, you know, it's hard to tell. Is someone, is a
government coming after it because they want to stifle the conversation about Israel and Gaza,
or are they coming at it because they want to protect their citizens' ability not to be harassed or to be sold counterfeit goods
or to be deepfaked into pornography online? And so, you know, I think Telegram at this point
facilitates a lot of those things. And that's what makes it kind of like messy. And that's also what makes it unusual.
I think even X under Elon Musk
is probably more responsive
to law enforcement requests than Telegram is.
And certainly when you talk about a Google or a Facebook
or some of the other large platforms,
they've decided that they're going to obey national laws in most cases. And so, you know, this is kind of like, I think I would
hesitate to read too much into it, just because it's such an unusual posture that Telegram has taken.
And so to you, then, it doesn't totally feel like, I guess, a watershed moment in how
we think about tech regulation or, I guess, even thinking about how other
founders or CEOs of tech companies should expect that they might respond to similar inquiries from governments.
This isn't a shift in what we've already seen in terms of reaction and cooperation.
Well, I think it is indicative of this important moment about, again, who gets to decide what's going on in these spaces and what control do governments
have realistically? When you have a company that's incorporated outside your soil,
that, you know, doesn't have a business that is easy to cut off, you know, how should governments respond to that? And when it's both
a service that's allowing for some really important speech and allowing for some really
bad crimes, like what is the appropriate response? And so I think that's all coming to a head
with Telegram. I think it's coming to a head in some ways with X and Brazil. And I think we'll continue to see that. That's going to be a big part of the story of tech and government in the next decade: how does this stuff work itself out? Can we expect and require these businesses to moderate themselves? And where do democratically
elected groups of people weigh in? And P.S., also, you know, how should we think about this
in more authoritarian contexts where, you know, governments are trying to stifle speech that they don't like,
and they don't have a democratic legitimacy. And I think, you know, what's really interesting here
is that it's so gray, right, that there's, you can have governments that are interested in
both sides of these things, that it's not just very black and white, where governments just want
one thing, of course, right?
And so we actually, we shared on our social channels yesterday that we were going to be having this conversation and we gathered some questions from our community. And also for those
of you who are watching live now, we encourage you to also ask any questions you have. And we'd love
to pose those to you as well, Eli. But one of the questions we received was how do we hold people
on social responsible for the information they disseminate?
So I guess this goes beyond just thinking about, you know, the Durovs and Zuckerbergs and people who are running these platforms, but the actual users.
What are some better ways we can ensure that there's more regulation of what people are actually doing,
that people are taking more responsibility for their own actions in that way? Yeah. Well, I think, I mean, first, let me say, like, I consider myself
a fan and supporter of free speech. And part of that means like being okay with and even fighting for the rights of people to be deeply wrong on the internet and beyond wrong,
you know, gross and cringy and like saying things
that I really hate.
And so I think to some degree, that's
just part of living in a world where you're hearing from lots of other people. The problem is when speech is harmful, that's untrue, libelous,
a crime itself, as with child sexual abuse material, you know, and there aren't kind of the circuit breakers that prevent that from happening.
And so whereas pre-internet, people were always able to say terrible things to each other,
whatever they wanted, misinformation or disinformation or bad things,
they weren't able to access groups of, you know, 100,000 people, a million people
easily. And now that's easy. And so I think we have to start thinking about this in those terms,
which is that kind of with great reach comes great responsibility. And that there are probably tiers of expectation in terms of what your behavior is and what your adherence to the norms in the space are in terms of your ability to reach larger and larger groups of people.
And that if you want to reach a million people, then you need to take some responsibility and some care in a way that is different than, you know, talking to your
friends or talking to another individual. And when we think about countries that sort of have
best practices when it comes to thinking about the way they work with tech companies and work
with these platforms to ensure that things are happening safely and in democratic ways as well,
we have a question here where someone asks,
how can we ensure that tech platforms remain free
and open without the heavy hand of government interference?
And then they say, as we've managed in the UAE,
which I think is really meaningful here
because that's where Telegram is headquartered.
And I'm curious if you think that there are countries
that seem to have a better approach to this than others.
I don't know if I would ally myself in general in a policy sense with any particular country.
I think with a lot of countries, it's a really mixed bag.
And there's also so many different issues here.
So if we're talking about encryption and the ability to have conversation that's truly private,
there are some countries like Germany that are really pushing for encryption to be more part of the tech ecosystem
and even to require it in some cases.
There are other countries like Spain, just to name two different countries in Europe, that are seeing it as a
threat to the government and seeking to undermine it. And no doubt there's all sorts of machinations
behind the scenes by intelligence agencies and others to break encryption and to be able to see inside the black box that exists.
So, you know, there's a really,
but those countries then may have really different policies
when it comes to misinformation or hate speech
or, you know, doing other kinds of crimes.
So I guess we're still in this moment where
it's not like there's a country, I think, that has totally nailed it on digital speech policy.
And I think we're also in this moment where,
because there are ways in which virtual space,
we imagine it to exist beyond any given jurisdiction, but actually it does exist in particular places.
My body is in a particular place.
That means I'm subject to a set of rules in the United States.
The servers that I'm using are in a particular place, and that means they're subject to the rules of the places that they are. And so, you know, I think part of what we're experiencing is the bringing to ground kind of some of our more fanciful notions of the Internet, that it's just this thing that exists virtually.
Well, no, it actually like, you know, if you have a military, you can go like take some of the pieces of critical infrastructure.
And so what the laws are and the places where that infrastructure exists really matter.
And so there are some questions also that we received around, like, what does this actually mean for Telegram? What happens now? So do you foresee that Telegram itself will be restricted in some way as a result of these charges?
It's really hard to know how it's going to play out.
And Telegram seems to be adopting this kind of fighting posture and has certainly lots
of communities that want to see it fight.
On the other hand, I think that in the large scheme of things, the idea that you can resist all of the
world's governments in applying their laws is probably not going to work out. And I also think
there's this irony to how Telegram is created, which is, on the one hand,
it's kind of talking this talk of being opposed to government intervention in what goes on there.
On the other hand, part of the reason we're talking about Telegram
and we're not talking about Signal or we're not talking about WhatsApp
is that Signal and WhatsApp are encrypted by default. So those companies don't have any idea
what's going on on their platforms and neither does anyone else. And surely there are bad things
and crimes happening there too, but governments and civil society and other people can't look into them. And so, you know, Telegram is in this very
awkward position of essentially inviting people to do crimes out in daylight, and then at the
same time trying to stop the police from coming and disrupting those crimes. And I'm not sure
that that is going to be a sustainable model. It seems like the question also then should be around encryption and whether or not that is something that should be regulated in some way.
I'm curious what your thoughts are on that.
Again, I'm a huge fan of Signal.
I think Signal is really an exciting model of what tech done right might look like.
And part of the reason it's exciting is it's not kind of data extractive.
It's not trying to sell you as a product to someone else.
The encryption is strong, and that's kind of the core product.
But Signal, very deliberately, is not a place for large groups.
And I think that's a smart decision for several reasons.
One is, I think encryption, the ability to have a truly private conversation,
makes sense as something to protect when we're talking about a person or a small group of individuals.
When we start talking about a larger group of people, you run into two problems. One is, you know, I think from a regulatory standpoint, again, it starts to push
on these questions of like, well, if you're talking to millions of people, maybe you do have
some responsibility about what you say, and maybe that ought to be more public than your conversation
with your friend. On the other hand, when you're talking to millions of people,
it's going to be really hard to make that work in a way that doesn't leak, essentially,
that isn't visible to someone in power anyway.
And so what does it mean to have an encrypted service
that millions of people are using in one chat?
That is another set of questions.
So I think encryption is absolutely an important part of enabling people to have, not just
to have real relationships, but to fight power online.
If I'm operating a company and I'm trying to stop people from unionizing, the thing I want to be able to
do is to see into everyone's conversation so that I know who the underlings are who are fomenting
trouble. And if I'm someone who wants to create a union, I want to be able to have that conversation
in a way that the boss can't see what I'm doing. And so protecting that is really important.
But again, I think that's different than allowing these really
large scale conversations to happen with no oversight or regulation whatsoever. And that's
kind of where I would draw that distinction. And I guess I wonder for other tech companies
that are watching this, you know, what do you think are the actions that are happening,
the conversations that are happening, I guess, in those companies, the actions that might take place in the days or weeks to come, sort of preemptively, as they're watching this whole thing unfold with Telegram? Telegram, which I think has under 100 staff, reportedly, is a pretty small
operation.
Signal, likewise, I think the total budget for Signal is $40 million, which is a rounding
error for Meta, which has whole kind of like ambassadorships to each government that are making sure that everyone feels happy and
comfortable with what happens on Meta. And so I think you have this real distinction between
kind of the way that big tech is playing. And I'm sure big tech is not very bothered by any of this
because, again, you know, they've wired up a lot of these relationships on the other hand
you know if you're a smaller operator then you're really thinking about sort of how
how do i operate in this international fraught international context where it's really hard to be responsive to, you know, 120 different governments across
the world with a team of 60. And I guess, you know, the challenge there is, I think probably,
you know, this is just one of the weird things about the internet that you can reach this kind
of scale of reach, that you can be in 120 countries,
but really be unequipped to deal with rules or regulations of all of those different places
that you've leapt into. And I guess I don't, it's not clear to me that that doesn't mean that we
should actually kind of scale those things together, that you ought to be able to be responsive to the governments
of places where you're operating. And so when we're thinking about this story, so
now, Durov was arrested, he's been released, and he's now been ordered to report to a police
station in France, I think, every two weeks. So what happens now? I
guess, what can we expect to see happen over the next few days or weeks? And how long before this
story feels like it will have fully unfolded and resolved? Well, you know, I think in the near term,
we'll learn a lot more about exactly what the French government's case is and how strong that case is.
And again, I think there are a lot of people who I trust who are legal experts who say,
there's a very good chance that this is actually a pretty clear case of someone
ignoring the law, and they can't just do that. There's also a real possibility that this looks much more borderline, and even that it looks like an attempt to stifle speech that's unwanted, that isn't, you know, the sort of worst of the worst. It's easy
to say, as soon as you start talking about child sexual abuse material, then that justifies almost anything. And so,
you know, that would be the worry. So we'll find out a lot more about that in the days to come.
As you said, there's also this other question of how other governments are going to respond
to Telegram and to all of the behavior that's happening on Telegram. And then there's this question of how other platforms
will. And I guess I think, you know, what's interesting is a lot of these tools, they're
not interchangeable. And certainly Telegram has a bunch of unique aspects to it that make it
desirable for the people who are on it. But one of the things that we saw, Whitney,
that I thought was really interesting
was that when the Brazilian government shut down X,
within several days, over 2 million new people
had signed up for Bluesky, which is, again, sort
of a Twitter clone based on more decentralized principles.
And so it's not that hard to move. And one of the questions that we'll
have is, you know, on the one hand, maybe this is just whack-a-mole forever. Maybe someone's
going to create, you know, Telegram 2, and everyone's going to go there until they find that guy. On the other hand, you know, maybe this is an opportunity to gravitate toward infrastructures
that make a little more sense, are a little more kind of thought through from a philosophical standpoint
and, you know, and maybe will land in a place that is a more stable equilibrium.
And who do you see as the driver of that?
I mean, I guess in this case, there's this arrest.
But is this something that you think comes from the people, the users?
How do we initiate that sort of chain of events?
Well, I guess, you know, to me,
I mean, a lot of this does end up
being kind of group behavior
and that's what we saw in Brazil
where, you know,
why did people go to Bluesky
and not Mastodon or not Nostr
or not some of the other,
you know, sort of decentralized
alternatives to Twitter or to X?
That's where everyone decided to go. And so,
you know, some of these things really are kind of group behavior, herd behavior.
But I really think one of the lessons here is it's important to kind of be an informed consumer of the tech that you're using. And there were a lot of people who were saying without any evidence that Telegram
was the super secure place to be.
If you're turning on the encrypted one-on-one chat,
like, that is encrypted,
but actually most of what's happening on Telegram
is less secure than a bunch of the stuff
that's happening on WhatsApp and Signal.
So, you know, you kind of can't believe everything that you hear about what these platforms are,
and you have to start to really understand them. You know, it's almost like, you know, with food,
we now know that like where things come from and how they're made really matter. And
you can have something that superficially looks the same, but depending on, you know,
how it's created, how it's sourced, it's going to actually affect you in a really different way.
And so, you know, that's kind of where we are with tech right now, where I think people are realizing that they need to be more tuned into what are the values of the platform that I'm using?
What are the, who owns this?
Who and what are they trying to do?
And, you know, and how does that match with my goals?
And, you know, I think that's part of how we're going to need to do tech in the 21st century.
Well, I mean, I wonder then for folks who are watching this who maybe feel like this is an interesting story for me to watch happen, but I don't feel very connected to this. Like this is not something that actually is impacting me or affecting me in some way. And I imagine if you've stuck around
for this long, you probably are not in that camp. But, you know, I'm curious for people who really
look at this and they're like, I don't use Telegram and, you know, I'm not as
keyed into what's going on in that world. What are the big takeaways, I think, for all of us, regardless of our level of
involvement here? Yeah, so I think, I mean, the big question we should all be thinking about,
because it affects all of us, and it will affect all of us, is who should have control?
How do we want power to work in our digital lives?
And if you think it matters now,
it's going to matter more as we spend time in, you know, whatever metaverse AI stuff is cooking, right?
Like, and so I think we're in a pretty broken moment in terms of how power works online.
I think the idea that people around the globe have to ultimately submit to the changing opinions
of a few billionaires, largely centered in America, like that's a crazy way to run a
communications environment. We should probably, like, think beyond that. And I would also say that, you know,
while it feels like a big project, building communication infrastructure is something that
people and countries have innovated and done many times through history. And now might be a good time
to think about doing that again.
So, you know, we've created public media, we've created public libraries, we've created
a whole bunch of institutions that help facilitate people communicating in a democracy.
And I think if we're reaching the limit of kind of this particular freewheeling, individual-founder-driven, largely Silicon Valley moment,
maybe that's a good thing, because maybe there are some new paradigms to be discovered around the corner.
Well, it sounds like we're definitely at the dawn of a new era when it comes to these conversations. And I just really
appreciate you for taking the time today, Eli, to talk to us about all of this. Thank you so much.
I learned a lot and I hope all of you who are watching also did. So thank you for that. And
thank you to those of you who have joined us. Thank you. Thank you. Really great to be here.
That was Eli Pariser and Whitney Pennington Rodgers
in the latest episode of TED Explains.
If you're curious about TED's curation,
find out more at ted.com slash curation guidelines.
And that's it for today.
TED Talks Daily is part of the TED Audio Collective.
This episode was produced and edited by our team,
Martha Estefanos, Oliver Friedman,
Brian Green, Autumn Thompson, and Alejandra Salazar.
It was mixed by Christopher Fazey-Bogan.
Additional support from Emma Taubner
and Daniela Balarezo.
I'm Elise Hu.
I'll be back tomorrow with a fresh idea for your feed.
Thanks for listening.
Looking for a fun challenge to share with your friends and family?
TED now has games designed to keep your mind sharp while having fun.
Visit TED.com slash games to explore the joy and wonder of TED Games.