The Current - Tracking Telegram
Episode Date: May 6, 2026
When the Canadian Centre for Child Protection found what it says are images of child sexual abuse on the messaging app Telegram, it took that allegation to Britain's online safety watchdog, Ofcom. That is because Canada doesn't have a regulator to look at how online platforms deal with this type of illegal content.
Transcript
Look, you may have noticed that it wasn't just the outfits that people were talking about when it comes to this year's Met Gala, but the politics surrounding fashion's biggest event.
I'm Elamin Abdelmahmoud, and this week on my podcast, Commotion, we're talking about billionaire Jeff Bezos and his wife, Lauren Sanchez, co-chairing this year's Met Gala, which managed to upset activists and fashion insiders.
Check out the conversation that I had with fashion critics about how billionaire involvement changed the Met Gala.
You can find and follow Commotion on YouTube or wherever you get your podcasts.
This is a CBC podcast.
Hello, I'm Matt Galloway, and this is The Current podcast.
It is a major investigation into a worldwide messaging app, and the tip came from a Canadian
organization.
But when the Canadian Centre for Child Protection found what it says is child sexual abuse material
being shared on Telegram, they took that allegation to Britain's online safety watchdog,
and that's because Canada doesn't have a regulator to look at how
online platforms deal with this type of illegal content. Lloyd Richardson is Director of Technology for
the Canadian Centre for Child Protection. He's in Winnipeg. Lloyd, good morning. Good morning, Matt.
Thank you for having me. Thanks for being here. This investigation, as I say, came from a tip from your
organization. What was the tip? What were you suggesting was happening on Telegram?
So yeah, I don't want to suggest that this is the first time we've ever noticed bad activity on Telegram.
I think this is probably more related to the egregiousness of the volume and distribution of child
sexual abuse material that we see on this particular platform. So we operate Canada's national
tip line for reporting child sexual abuse material and, more generally, exploitation against children on the
internet. And for years, we've seen issues with Telegram. This particular one, like I said, was of the
level that we obviously notified law enforcement as well. But given that regulators are sort of new in this
space on the internet, we reported this to Ofcom as well.
And I don't want to get into specifics, but is this images, videos? What made it reach that level of egregiousness, as you say?
It was images and videos, and it was high volume, very high volume, both images and videos that we had not seen before,
as well as images and videos that we've seen in circulation for the last 20 years.
How are you finding this?
Well, we do have some proactive detection tools that we use.
We have something called Project Arachnid that can scan various areas of the internet for this material,
but we aren't explicitly targeting Telegram in any sort of way with this.
Some of our intelligence came through this tool to point us to a number of places on Telegram
where we were seeing this type of activity, and we decided to dig in a little bit more
on those little nuggets of information that we found.
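For readers curious what a proactive detection tool like this does under the hood, here is a minimal illustrative sketch in Python. It is hypothetical and not Project Arachnid's actual code: the general pattern is to fetch suspect image URLs and compare each file against a vetted database of digests of previously identified material. The KNOWN_HASHES entry and the URLs are placeholders; real systems use perceptual hashing and operate under strict legal and operational controls.

```python
# Hypothetical sketch of a proactive scanning loop, in the spirit of what
# Richardson describes. NOT Project Arachnid's actual implementation.
import hashlib
import urllib.request

# Placeholder standing in for a vetted database of digests of previously
# identified illegal images (the database holds hashes, never the images).
KNOWN_HASHES = {"<placeholder-sha256-digest>"}

def sha256_hex(data: bytes) -> str:
    """Exact-match fingerprint of a file's raw bytes."""
    return hashlib.sha256(data).hexdigest()

def scan_image_urls(urls: list[str]) -> list[str]:
    """Return the URLs whose content matches a known digest."""
    hits = []
    for url in urls:
        with urllib.request.urlopen(url) as resp:  # fetch the image bytes
            data = resp.read()
        if sha256_hex(data) in KNOWN_HASHES:
            hits.append(url)  # flagged for analyst review and reporting
    return hits
```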
And so you took this to the British regulator, Ofcom.
And we know that child sexual abuse material is also illegal in this country.
Why did you take it to the UK?
There are a number of regulators we've actually notified about this,
and there are a few sort of burgeoning ones in the world right now.
What's interesting is there's a space that exists between what law enforcement might take action on and the rest.
So the notion that if we notify Telegram about something, for example, about egregious transfer of child sexual abuse material on their platform,
their reaction typically to that would be to perhaps remove a group or remove an account and not do any sort of root cause analysis.
So it lives in this space of, okay, well, they're doing the bare minimum of removing something when we're telling them to,
but there's a wider problem that's not really being dealt with there.
And I think that's the area that we see regulators playing a role in
where something isn't necessarily in the scope of law enforcement
or law enforcement has other priorities.
And we need to look at areas where the law needs to be enforced in a different way,
one that goes beyond the simple "I pointed something out and they removed something."
But, in the introduction, I hinted that there perhaps isn't a regulator here
that could deal with this sort of content.
Is there nothing in Canada that would do that beyond law enforcement?
Nope, that would be it.
I mean, we have the CRTC, which is a regulator for other things, but we don't explicitly
have a regulator here for online content.
And the concept of that is really new worldwide, I would say.
How often does Telegram come up in the work that you do to protect children?
Quite frequently.
I often say that many or all roads lead to Telegram in this space, especially when it comes
to monetized child sexual abuse material.
So we'll see people trying to sell this material using platforms such as X or Instagram.
They'll put up something that looks somewhat innocuous or maybe a little bit suggestive relating to children.
And inevitably, they're going to be directing people to a Telegram account, essentially to exchange money for child sexual abuse material.
Or another example might be where we see children being sextorted.
So someone pretending to be someone else online and exchanging nude photos to try and
elicit photos out of the child. In a lot of cases, we'll see children not necessarily
approached directly on Telegram. They're approached on something like Discord or TikTok,
and then inevitably they'll be moved to a platform like Telegram where the discussion
will continue. So there's a lot of intersection that we see with bad activity related to
children with Telegram. And they move to Telegram, and we're going to talk more about this platform in a moment,
but they move there in part because privacy is key, right? Partially, that's the notion they give. But
Telegram is actually a different sort of animal.
It looks different than Signal and WhatsApp in that it has the notion of very large groups or public spaces in it,
which Signal and WhatsApp are not really designed for.
And these large groups on Telegram are, in fact, not end-to-end encrypted.
I think there's a sort of misnomer going on about that.
But when you look at a really, really large group like that or the capacity to have a large group like that,
it creates a lot more risk and responsibility for the platform, because
it creates the avenues for transfer of this type of material and other illegal things in a way that you wouldn't see on a traditional messaging app.
Telegram has given us a statement, and I want to read it in a moment,
but first, the statement that it gave the Globe and Mail in reporting this: Telegram says it has virtually eliminated the public spread of child sexual abuse material on its platforms through algorithms and cooperation with NGOs.
What do you make of that?
I would take issue with that statement. If I were to compare Telegram against any other technology company of their size, I'd say they're doing an incredibly poor job of detecting child sexual abuse material. If I go back to the original comment I made about the type of material we were finding on the unencrypted side of Telegram, we're finding known child sexual abuse material that any large technology company can relatively easily detect. So the notion that they're using, quote, world-class detection
algorithms, that rings pretty hollow for me.
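The "relatively easy" detection Richardson refers to is typically perceptual hash matching: each known image gets a compact fingerprint that survives resizing and re-compression, and new uploads are compared against a database of those fingerprints. Below is a minimal illustrative sketch using the open-source Pillow and imagehash Python packages; this is an assumption for illustration, as industry systems use tools such as Microsoft's PhotoDNA against vetted hash lists.

```python
# Minimal sketch of perceptual-hash matching for detecting *known* images.
# Illustrative only: production systems use tools like PhotoDNA and vetted
# hash databases, not this toy setup.
from PIL import Image
import imagehash

# Placeholder fingerprints of previously identified images (not real values).
KNOWN_PHASHES = [imagehash.hex_to_hash("0f1e2d3c4b5a6978")]

def matches_known_image(path: str, max_distance: int = 8) -> bool:
    """True if the image is perceptually close to any known fingerprint.

    Unlike a cryptographic hash, a perceptual hash changes only slightly
    when an image is resized or re-encoded, so a small Hamming distance
    (computed here with the '-' operator) still indicates a match.
    """
    candidate = imagehash.phash(Image.open(path))
    return any(candidate - known <= max_distance for known in KNOWN_PHASHES)
```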
In its statement to us, Telegram said, Telegram categorically denies Ofcom's accusations.
We are surprised by this investigation and concerned that it may be part of a broader attack
on online platforms that defend freedom of speech and the right to privacy.
What do you make of that?
Again, it's another sort of boilerplate response that you see, where you try to deflect what's
going on, right?
You're starting to raise the specter of freedom of speech and privacy.
To go back to what I said, this is child sexual abuse material.
I don't think there's a freedom of speech argument here.
And they're talking about right to privacy when they're talking about things that are happening essentially in a public group of sorts.
What would you like to see the Canadian government do?
There was an online harms act that was proposed.
It was controversial at the time.
It died when Parliament was prorogued in January of 2025.
There is not yet an alternative that has been proposed, although there is one coming, we are told. There was a rally last week on Parliament
Hill calling for the government to retable the bill with a focus on protecting children from
online predators. What could the government in this country do? I think resurrecting that legislation
would be absolutely critical because it has the potential to build on the notion of putting
rules in place for companies like this within Canada to be able to apply sovereign law to
companies like this who essentially operate without any. And it also raises the
possibility of building up a regulator within Canada. I think we need to be realistic about what that
looks like. This isn't a process that you can just snap your fingers and make it happen and everything's
going to be perfect afterwards. It's going to be a bumpy road, but it's one that we need to start
going down now. What should the regulator look like such that it would actually be effective in
preventing the spread of the material that you're talking about? That's a difficult question because
it's such a new topic. And it's interesting how it's sort of played out in terms of telegram in this
space because it looks funny that a charity within Winnipeg, Manitoba, Canada is giving intelligence
reports to a regulator in another country. I think that when a regulator is put in place, you need
to have these relationships with other organizations that are able to do some of that boots on the
ground or frontline work, like the work that we're doing here. So I think integration with an
entity such as ours, to be able to get these sorts of things done, is really, really important. Obviously,
it's going to have to be staffed properly. This isn't going to be an inexpensive sort of thing,
but I think that there's creative ways to be able to levy fees, etc., from some of the large
tech companies that are, you know, facilitating some of this harm. That's a possibility.
Just finally, to go back to the concerns that people had about the Online Harms Act, not what
Telegram is saying, but what many Canadians have said, that this act would infringe on people's
freedom of speech and the right to privacy. How do you respond to that?
Yeah, I think that the first act had a number of things mixed in with it. I think that child protection is a unique sort of space here. And I don't think you have some of the same sorts of difficulties you would have if you fenced online harms specifically related to children. I think there's more universal support for that. And I'm not saying that the other harms are unimportant. They absolutely are. But sometimes it's difficult to push through a bunch of
different harms all in one swath. It becomes more difficult for people to get their head around
what that looks like. But again, I think people also don't like change, right? The notion that there's
any sort of regulation coming for the internet tends to ruffle people's feathers because of the
notion of the sort of 1990s internet: everything is free, anarchy is great. We're sort of coming to
a crossroads, I think, right now, where we're starting to see the harms that have come out of this
sort of social experiment that has happened over the last 30 years, essentially.
And I think people are starting to realize that this is not where we wanted to end up.
So, yeah, I'm hopeful that the retabling of some legislation in Canada is going to receive
a lot of support from everyone across Canada because I think it's very important.
Lloyd, good to talk to you. Thank you very much.
Thanks, Matt.
Lloyd Richardson is Director of Technology for the Canadian Centre for Child Protection.
He was in Winnipeg.
Hello, I'm Emma from the Earth Rangers podcast.
If you think that animals are amazing, then our podcast is the one for you.
Join me as I travel the world to discover the wildest animal facts out there and solve nature's biggest mysteries.
If you're a fan of animal expeditions like on PBS's Wild Kratts and the BBC's Planet Earth,
or the mystery and intrigue of Carmen Sandiego, then you'll love the Earth Rangers podcast.
This is a journey you won't want to miss.
Follow the Earth Rangers podcast on Apple Podcasts, and for more information,
go to GZMshows.com. See you soon.
Tetyana Lokot is an associate professor in digital media and society at Dublin City University.
She's in Dublin, Ireland. Tetyana, hello to you.
Good morning or good afternoon.
For people who don't use Telegram, it isn't hugely popular in Canada,
something like 14% of people in this country use it.
Can you just describe what it is and what makes it different from other messaging apps that we might be familiar with?
Unlike, say, WhatsApp or Signal, Telegram is
different in that it has this combination of private one-on-one or small group chats with this
kind of feature of large groups or large channels, which are essentially more like broadcast tools
where somebody can create a large channel with tens, if not hundreds of thousands of subscribers,
and that feels more like, almost like a news feed where somebody can broadcast. But that really is
what makes Telegram different, because it combines these very private messaging features
with these large-scale channels, which I think are very different in nature.
And it's why it's become really popular with individual users,
but also media outlets, you know, organizations.
So it's a little different, I'd say, than some of those other messaging services.
Tell me more about who uses the app.
I mean, it really, I think, depends on a national context.
For instance, in Eastern Europe, and in countries like Ukraine or Russia or Belarus,
it's just used by a large percentage of the population for all sorts of things,
from talking to your friends and family to getting news updates, local news like apartment building chats,
talking to shops, online salespeople, whatever.
So it's quite broadly used.
And I think that really is very specific to particular regions, whereas in other countries,
it seems like it's more popular with kind of fringe groups or undercover groups
or people trying to organize away from the public eye.
but I think it's quite context dependent,
which is why for a country like Canada,
where a very relatively small percentage of the population
are familiar with the app,
they might have a very different perception of it
than say people where it's kind of part of everyday life.
It's often been called the dark web in your pocket as well.
How often is it used for those purposes?
We just heard Lloyd talking about some of the images and videos
that were circulating that caught his organization's
attention. How often is it used to distribute that kind of material? Well, unfortunately,
quite often, there's quite a long history, I think, of problematic content circulating on
Telegram, sometimes in these very large-scale networks or communities. You know, we've had reports,
for instance, OCCRP, which is an international investigative journalism network. They've
reported on non-consensual sexual image sharing in Telegram groups. We have examples, for instance,
from Azerbaijan where these groups target dissidents and specifically share non-consensual sexual
imagery of activists to discredit them. We've also had examples of broader networks of groups and
channels sharing non-consensual nude images of women. Some of those networks reach across more than 20
countries. It seems like Telegram is also quite popular with various conspiracy theory
groups, which don't just spread content that builds on conspiracy theories but also monetize
this content. And in general, it's also quite popular with far-right, accelerationist, and other extremist
networks in countries like Canada, Ireland, and a lot of other countries. So it has this sort of feeling
of being a safe haven for these groups who recognize that their activity may not necessarily
benefit from being in the public eye, and they see Telegram as a kind of safe space for sharing
these ideas and promoting them. There was an investigation by CNN earlier this year into men
using the app to trade information on how to drug their partners. You could apparently buy some
sort of liquid that was tasteless and odorless, and one of the people who was selling it on Telegram
said, your wife won't feel anything and won't remember anything. Telegram took down the chat group
that CNN looked into in the wake of its investigation.
But how difficult is it for apps like Telegram to track and remove this content as you understand it?
That's an interesting conundrum because Telegram claims that it has pretty advanced content moderation
technology. It uses, you know, AI-assisted content moderation. And seemingly it claims it's doing a
really good job of taking down illegal content. But it may actually not be doing such
a good job, because on the one hand, Telegram cultivates this sort of image or reputation for, you know,
being a kind of haven for free speech and privacy. And it really highlights these as kind of part of
its origin story, you know, and its values. And in a way, it's the reason why it has acquired such a
loyal fan base across the world, although among some fairly problematic communities. And it highlights
support for internet privacy as one of its key priorities. Its privacy policy, I think, actually
says that it protects private conversations from snooping third parties. Quite often, when Telegram is
accused of under-moderating or having this lax approach to moderation, it basically says, well, A, this isn't
necessarily our priority, our priority is to keep people's communications private. And then it almost
always cites encryption as the reason why it doesn't do much content
moderation. And here is where it gets challenging, because
Telegram says, well, we offer encrypted communication,
but in fact, the only part of Telegram that is end-to-end
encrypted is the secret chats feature, which is not enabled by default.
Users have to enable it, and it usually only works in chats that are
between two people. So all of the bigger communities, the groups,
the public channels, they're not end-to-end encrypted.
And in fact, the majority of channels are public.
So in fact, Telegram does actually have access to content in those groups,
and it can, if it chooses to or if it has the right tools,
moderate that content.
But we've seen all of these examples where really problematic content
and accounts aren't removed until after they're called out
or until after they're reported.
The problem with reporting, of course, is that Telegram also
doesn't do a great job of providing tools for people to report problematic content.
And it has actually faced criticism from digital rights groups such as Access Now for not having
very transparent policies for content moderation and lacking workable support channels.
So it's actually really hard to get problematic content taken down.
The founder of Telegram was arrested in France in 2024.
Pavel Durov was charged with alleged complicity in the distribution of child sexual abuse
material, and in drug trafficking and fraud on the platform. This is an ongoing investigation, but he's
been allowed to travel in the wake of his arrest. It's interesting, in part, because this is, in many
ways, the exact opposite of what happens in the United States where the tech CEOs who are in
charge of huge platforms are not held to account for the things that go on on their platform. What
did you make of that arrest? Is that a way to protect the public targeting the people who are
running those platforms? I mean, I think it's part of an attempt to protect the public. But I also
found it interesting in the French case that it wasn't necessarily that he was charged for doing
something. It was more that he was charged for not doing something. Being complicit in that.
And in fact, the French charges actually mentioned the platform being used for enabling
illegal transactions, including the distribution of pornographic images of minors, or CSAM,
but also, you know, drug possession and drug transport, and basically just saying
Telegram didn't bother to stop these clearly illegal activities.
And I think perhaps calling attention to the fact that, well, if you don't do it, you are
complicit.
If you're not doing what is being asked of you, you are actually also accountable.
It's a good argument, right?
Even following the French investigation and arrest,
Telegram has actually changed some of its policies
and actually undertaken some action to start investing in content moderation,
whereas before it was very, very lax on content moderation.
So I think it's one of the ways,
but certainly holding platforms accountable is par for the course,
and I think it should be part of the efforts across the world,
because otherwise platforms choose to basically position
themselves as intermediaries and take no responsibility whatsoever for the kind of content that
is platformed or the kind of content that is amplified on their servers, on their platforms.
But I don't think we can buy into this rhetoric.
If you're providing infrastructure for this very dangerous or damaging or clearly illegal content
and your platform provides an opportunity for this content to be shared, amplified, retweeted,
posted, then certainly you cannot be said to not be involved or not be complicit.
Does that answer the question around freedom of speech as well? Because his defenders,
in the wake of his arrest, said that this was an attack on freedom of speech. But we already
read that statement as well, around the Ofcom investigation, saying that Telegram is concerned
that the Ofcom investigation may be part of a broader attack on platforms that defend freedom of
speech. There are people, and I mean, Lloyd is one of them, who would say that if you're
distributing child sexual abuse material, you have
no right to privacy or freedom of speech.
But how do you see that?
I think this is not necessarily out of character for Telegram.
This is the defense that they give to any accusation that they're not cooperating, or not
following the law, or not taking down content that is clearly illegal.
But I think partly for them, it's because they've cultivated this reputation as the
protector of free speech and protector of privacy, which is obviously quite convenient
for them, because anytime they're criticized
for anything, they frame it as an attack on free
speech. But I think that, you know, we can
probably agree that there is a difference between
platforming resistance, protest, and dissent
and freedom of opinion and, you know,
circulating sexual imagery,
either clearly non-consensual or images of minors
that are clearly outside the law.
And so I don't think in this case,
this defense is particularly
effective for Telegram. Can I ask you just finally, the Canadian government, as I mentioned,
is working on this new online harms bill. Do you need to have coordination with governments
around the world to tackle this in a meaningful way? That it can't be just Europe. It can't be
just Canada. That everybody has to be involved in this together. I think so. In an ideal world,
we would have this coordination. I think attempts are happening to some extent because platforms are
you know, on the internet and the internet is a global network, although it's obviously
differently regulated across different jurisdictions. But I do think that, you know, if there is
a sense of the kind of internet we want to have that preserves freedoms, but also holds
people responsible and platforms responsible for harms, then coordination is required. And I think
then when countries do attempt to coordinate their efforts, that
coordination should be based around a certain shared set of values. But that obviously is a
conversation that needs to be happening actively. I don't think we're necessarily at a point
where it's happening yet. Tetyana, really good to talk to you about this. Thank you very much.
Thanks for having me. Tetyana Lokot is an associate professor in digital media and society
at Dublin City University. She was in Dublin, Ireland. You've been listening to The Current
podcast. My name is Matt Galloway. Thanks for listening. I'll talk to you soon.
For more CBC podcasts, go to cbc.ca/podcasts.
