Darknet Diaries - 93: Kik
Episode Date: May 25, 2021
Kik is a wildly popular chat app. Their website says that 1 in 3 American teenagers use Kik. But something dark is brewing on Kik. ...
Transcript
Okay, so this episode has mature content.
I don't recommend listening to this with young ears around
or on some kind of speaker where others might hear it
because this episode gets into some dark stuff.
So listener discretion is advised.
All right, so this episode is about a mobile app called Kik, spelled K-I-K.
It was made in 2010 to help mobile users chat with one another more easily.
See, back then, people who had Blackberry phones had a hard time chatting with Android users and iPhone users.
So some Canadian college kids made Kik to be able to let people chat freely no matter what type of phone you had.
Kik was an immediate hit.
Just one month after launching, they reported to have 1 million
users, and it's been growing rapidly ever since. But something happened in the Kik chat app,
which sent this app down a dark path.
These are true stories from the dark side of the internet.
I'm Jack Rhysider.
This is Darknet Diaries.
Support for this show comes from DeleteMe.
I know a bit too much about how scam callers work. They'll use anything they can find about you online to try to get at your money.
And our personal information is all over the place online.
Phone numbers, addresses, family members, where you work,
what kind of car you drive. It's endless and it's not a fair fight. But I realized I don't need to be fighting this alone anymore. Now I use the help of Delete.me. Delete.me is a subscription
service that finds and removes personal information from hundreds of data brokers' websites
and continuously works to keep it off. Data brokers hate them because Delete.me makes sure
your personal profile is no longer theirs to sell. I tried it and they immediately got busy scouring the internet
for my name and gave me reports on what they found. And then they got busy deleting things.
It was great to have someone on my team when it comes to my privacy. Take control of your data
and keep your private life private by signing up for Delete.me. Now at a special discount for
Darknet Diaries listeners. Today, get 20% off your Delete Me plan when you go to joindeleteme.com slash darknetdiaries and use
promo code darknet at checkout. The only way to get 20% off is to go to joindeleteme.com slash
darknetdiaries and enter code darknet at checkout. That's joindeleteme.com slash darknetdiaries.
Use code Darknet.
Support for this show comes from Black Hills Information Security.
This is a company that does penetration testing, incident response, and active monitoring to help keep businesses secure.
I know a few people who work over there, and I can vouch they do very good work.
If you want to improve the security of your organization, give them a call. I'm sure they
can help. But the founder of the company, John Strand, is a teacher and he's made it a mission
to make Black Hills Information Security world-class in security training. You can learn
things like penetration testing, securing the cloud, breaching the cloud, digital forensics,
and so much more. But get this, the whole thing is pay what you can. You also get access to the MetaCTF cyber range, which is great for practicing your skills and showing them off to potential employers.
Head on over to blackhillsinfosec.com
to learn more about what services they offer
and find links to their webcasts
to get some world-class training.
That's blackhillsinfosec.com.
This is the tale of two different Kik users.
First is a guy who we'll call Doctor. Some person, I don't remember their name, said that Kik was a good app to meet people on and to join some communities.
And I figured, well, why not? Might as well try it.
And so I joined.
Sounds innocent enough. Everyone can use new friends, right? So that's why Doctor joined Kik.
And by the way, Doc is like 20 something years old, and I'm just disguising his voice a bit.
Now, when you join Kik, you can do so anonymously. All the questions are soft,
like it asks for your email, but you could just put anything in there because they don't email
you any sort of registration confirmation link or anything.
And it asks for your phone number, but you can skip that too.
So yeah, when you get on Kik, you can be whoever you want
without giving any details as to who you really are.
The other guy we'll hear from we'll call Azrael.
Everybody was telling me to make an account
just to hang out and message them on
there instead of whatever we were using at the time. I don't even remember. This is called the
network effect and it's powerful. The more people who join and want to chat with other people,
the more they invite other people to join. So you're either on Kik or you're not. And if Kik
is where all your friends are, you might as well get in there and be there with them. Now, these two people, Doctor and Azrael, have entirely different stories that take them down
two different paths while using Kik. And we're going to weave in and out and back and forth
between the two of them. It should be fun. Kik is an old instant messaging app.
And you can chat, share pictures, videos. On the app, you can connect
with someone you know and send them private messages, or you can join chat rooms. In fact,
when you first install the app, it tries to show you some cool chat rooms to try joining.
And rooms have themes, like you can join a room about Pokemon or Fortnite or different regions
of the world where people live. And you can find different groups based on search preferences.
So you can use different tags and such to try and find specific groups that might share interests with what you're looking for.
And it's like, really, it's there to make good groups and start good communities.
But it is not maintained well enough, so it has its darker side.
Ah, yes, the dark side of Kik.
That's what I'm interested in.
So let's go there.
But first, let's back up and talk about the company who runs Kik.
You're going to be very curious about this eventually, so we might as well get into it now.
Kik was created by some Canadian students in University of Waterloo in 2010.
A month after launching, it had a million users.
Five years later, the Chinese company Tencent invested $50 million into Kik, trying to make it the WeChat of the West. WeChat is a massive chat app in Asia,
so Tencent saw potential in Kik. This helped explode Kik's popularity. By this point,
the company was called Kik Interactive. There was a CEO who oversaw everything, bunches of employees, marketers, a whole team of people who were working on Kik. And their team was expanding fast too, with dozens
of employees at the time. In 2015, Kik reported it had 240 million users and 70% of them were
between the ages of 13 and 24. I actually went and looked at the terms of service because I wanted to like actually know what you're supposed to do with the app.
And it's marketed for kids 13 and up.
It's actually you're not supposed to share any not safe for work imagery.
They actually have certain words blocked so you can't name your chat certain things.
Like you can't use any curse words in the naming of your chats.
That's right. Kik is for kids. In fact, if you go to kik.com slash brands, which is a web page
they use to pitch to potential advertisers, the title of the page says, Reach Teens in Their World. And it goes on to say that one out of three American teens use Kik.
And Kik loves marketing this app to teens because it's well known that teens are trendsetters.
So if teens make a social platform popular, then everyone else will eventually come too.
That's what happened with MySpace, Facebook, and so many others.
But despite it being super popular, Kik didn't really have a way to
make money. In fact, they were losing money fast. So they had to come up with a plan to make Kik
profitable. They saw that WeChat is not only a chat app, but also a payment app. You can use it
like Venmo or whatever and send money back and forth between users. Kik wanted to do something
like that, but at the same time, they saw the boom in cryptocurrencies and decided to make their own crypto coin called Kin.
In 2017, they launched Kin.
Now, to start out, Kik gave themselves tons of this cryptocurrency, and they were telling people that there's two ways to initially obtain it.
There would be a private pre-sale, and there would later be a public sale.
In both methods, though, people would buy this Kin with Ethereum from Kik.
The private pre-sale resulted in Kik selling $50 million worth of Kin.
By the time the public sale was over, Kik had made $98 million from doing this initial sale of their Kin cryptocurrency. And all that money went straight into Kik's bank account. The SEC warned them about raising money
through cryptocurrencies like this. They had to follow certain rules and it gets tricky,
but it didn't seem like Kik respected the rules. So in 2019, the SEC filed suit against Kik and Kin, alleging securities violations.
We're getting into some legal weeds here, but it seems like Kik is saying that Kin is a cryptocurrency and it shouldn't be considered a security.
But the SEC was saying, well, if you're using it to raise money for your company, you're sort of treating it like it is a security.
Kik was saying it was a currency. The SEC was saying it's more like a stock. And it's tricky because, well, yeah, it is a currency. But Kik was really using it to raise money for their company, basically treating it
like a security. So Kik and the SEC went into a fierce legal battle. This legal battle was rough
and changed everything about Kik. The team at Kik was tied
up with this lawsuit and just loved all the money they were making with Kin. So they just sort of
stopped caring about Kik, their chat app. Kin was making money. Kik was losing money. It's as if
Kik Interactive split into two when they made Kin. In fact, they made their own separate entity called the Kin Foundation, which just focused on Kin.
So in 2019, Ted Livingston, the founder of both Kik and Kin, announced that Kik would no longer be supported.
They were done with it, and they were going to shift focus entirely to work on Kin.
In fact, they took their staff of almost 100 people
down to just 19 people. And the CEO said, quote, going forward, our 19 person team will be focused
on one goal, getting millions of people to buy Kin to use it, end quote. And so nobody, I mean,
nobody was left to work on the Kik chat app.
The app was abandoned by its own company.
And Kik Interactive announced it would be shutting down the chat app in October 2019.
But then, a company called Media Lab AI stepped in.
This is a California-based company who owns several other chat apps like Secret, Yik Yak, and Whisper.
They offered to buy Kik, so it was sold to Media Lab.
We don't know for how much, but this is who owns Kik today: Media Lab AI.
And so the app is still alive and growing even today.
Now, you're going to find all this to be pretty important later, so thanks for sticking with me through this. Now, Media Lab has a history of buying failing apps
and trying to make them profitable. So what's the first thing they do when they buy Kik?
The only thing Media Labs did was the second they took over, everybody started seeing ads on the app.
If you use Kik today, you'll undoubtedly see ads everywhere.
They're in chat rooms, they're in private messages.
It's pretty much a permanent banner at the top of the app
that's always displaying ads.
And so that brings us to today.
I haven't seen recent numbers,
but the latest count that I saw
was that Kik has over 300 million users
and it's owned by Media Lab AI.
And they're likely doing as little work as they can to just make it profitable and keep it going. All right. So
you get in there, you're playing around. What do you discover when you're in there?
So what I discover is that there are many groups for many, many, many different things. And I joined a few groups, some related to some gaming stuff. And then I started noticing some people are talking about some more kinky groups, let's call it that. But I also start noticing some other groups that generally just share straight-up porn.
Ah, yeah.
A chat program which lets you make rooms about whatever you want and post pictures and videos
of whatever you want.
Yeah, of course, there's going to be porn there.
But, you know, posting porn online is typically legal.
As long as the porn is legal in and of itself, that the actors in it are of legal age in the country that they're from
and in the country that it is being shown in, and that there's no acts in it that are illegal,
such as rape. As long as that is taken off, then it is legal, to my knowledge, at least.
Okay. So how's the porn scene on Kik? I mean, so here's the thing that's weird already, right? You've got this target audience: 13- to 24-year-olds are 70% of their users. And yet there's porn channels. So it's interesting that they have that already. But yeah, how's the porn scene over there on Kik?
It is very active.
Like some of the actual social groups I joined were much less active than the porn groups.
So when you get into a porn channel or room or whatever it's called, what's going on there?
Are people just posting like photo after photo of nudity and stuff like that? Or are there videos being posted? Or do you ask people like, hey, do you have something like this? Or is it like TV, you just turn it on and you see what's there?
Yes to all of the above. It depends a lot from group to group. In a lot of groups, when you first join, you are asked to verify, because there are a lot of bots on Kik.
So to avoid there being bots and people who aren't active and such,
you're being asked to verify.
So you send a message to an admin and they tell you to send something,
maybe a video, some porn video,
so that you show that you are willing to share,
or maybe a live picture of yourself or something like that.
And then you get into the group, and, well, depending on the group,
you can be finding all of the things that you mentioned.
Some are very, very active, and people just share a lot of both pictures and videos.
Other groups are less active, and you have to specifically ask for what you want, so to speak. And that's kind of where I got my start, actually: trading, as it's called, where people have some request for something in porn, and then you try and give it to them, and then in return they give something back.
And it is videos and pictures. And that's kind of where I got really interested in it, because, well, when you've seen one pair of boobs, you've seen them all, to a degree. So the porn part pretty quickly got boring, but trading was very, very different and a lot more fun.
So this is how Doctor got started trading pornography on Kik. People would ask for certain types of explicit material. He'd hunt for it, find it, and then share it in the channel or privately to that user. In return, he'd get some other picture or video, which he'd then save so that he could
maybe give that to someone else someday. And this is all free, too. Just show up and be active,
really. But the thing is, Kik's Terms of Service strictly forbids this type of activity. I mean,
so does Facebook, too. Pornography isn't allowed to be posted on either of those platforms.
But Kik's Terms of Service says,
And the list goes on.
But the point here is that porn is not permitted on Kik.
So it just shouldn't even be there.
Yet Doctor was telling us how prolific it is on there.
With a couple searches, you can quickly find channels full of porn.
And yeah, a lot of their users are teenagers. So I guess we should catch up with Azrael again for a minute. So while Doctor was on there trading porn, Azrael was on a whole different path. He was
just hanging out in an anime channel, doing normal things that people do in chat groups,
nothing kinky or weird.
One of the chats I was an administrator for got raided by a clan. They were immoral raiders. Somebody dropped into their chat, gave them a hashtag, and they hit the chat. They jumped in, tried a few spam bots. Spam bots weren't
spamming hard enough, so they just started threatening people. They claimed to have our
IPs, and I knew that you couldn't really grab IPs without some special tools. They would join,
and I would remove them, and it was like that for a couple hours. I was
just removing account after account after account. And then they started private messaging me
pictures of gore and claiming that they would do that to me because they had my address
and all this and that. And I just didn't care. And eventually I got legitimately angry
and talked to a buddy of mine
who was into the exploitation side of Kik.
And he got me my first actual mod
that I could hit people with.
And that's where it all started, I guess.
What happened was I kind of started spotting these people that were using modified APKs.
APKs are what Android apps are.
That's just how they are bundled, and that's how they come.
And what he found were people were taking kick
and modifying the app to do different things,
pretty much hacking the app itself.
So the only exploit back then was you could turn off your read receipts
and see who was reading your messages.
Ah, right.
So when you send someone a message, it'll show you if it's read or not.
But with this modified version of Kik,
you could make it so that the messages you actually did read
show as unread to others.
And I don't count that as an exploit, considering.
But then things started ramping up, and the old owners of Kik released the source code.
And after that, the modding community with it went wild.
That's where blue mods came in, King Skull, and all that.
What are these things?
What are the mods?
What are there, bots in these things or something?
Is that what you're talking about?
Yes. So my custom APK, basically it has a few basic exploits.
I can see who reads my messages.
I can prevent people from seeing that I read messages.
I can send raw XML files, which are stanzas, and all that sort of stuff.
Hmm, interesting.
There's a whole community on Kik who run customized versions of Kik
that allows them to have different features that the normal user doesn't have.
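That read-receipt trick from the modded clients boils down to a simple idea. Here's a minimal sketch in Python, assuming a receipt model like XMPP's delivery receipts (XEP-0184), where the sender only learns a message was read when the recipient's client sends back an acknowledgment. The function names and message shape here are made up for illustration; this is not Kik's actual API.

```python
# Sketch of read-receipt suppression, assuming an XEP-0184-style model:
# the sender only sees "Read" when the recipient's client sends an ack.
# A modified client can display the message but never send that ack.

sent_acks = []  # stand-in for acks actually sent over the wire

def display(text):
    """Show the message to the user (the user still reads it)."""
    print(text)

def send_ack(msg_id):
    """Tell the sender their message was read."""
    sent_acks.append(msg_id)

def handle_incoming(message, send_receipts=True):
    display(message["body"])
    if send_receipts:
        send_ack(message["id"])   # stock client: sender sees "Read"
    # a modded client passes send_receipts=False: read, but never acked

handle_incoming({"id": "m1", "body": "hi"})                       # stock
handle_incoming({"id": "m2", "body": "hey"}, send_receipts=False)  # modded
```

The point is that "read" status is entirely client-reported, so a client the user controls can simply lie by omission.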
Now, keep in mind, this also is not allowed in Kik.
It's against the terms of use.
It specifically says, quote,
except as permitted by us in writing, you agree not to reproduce, distribute, modify,
prepare derivative works of, translate, reverse engineer, reverse compile, or disassemble the
services in whole or in part, end quote. So yeah, Media Lab doesn't want people modifying their app to make it do extra stuff like this.
But apparently Media Lab isn't monitoring for this type of activity.
So there's quite a few people doing this.
In fact, there's a guy named Blue who made a modded version of Kik and he was actually trying to sell it.
He wanted you to pay 50 bucks for it, but who pays for shit?
So this is what Azrael was seeing, modded versions of Kik.
And he was fascinated by all the extra things you could do that a normal user couldn't.
On top of that, he was seeing that there were all these raider clans on Kik
who would go infiltrate chat channels and try to grief other channels or something.
He really didn't like that his anime channel got raided, so he modded up his Kik client to help defend his channel.
That has a bot system where you can lock your chat, so that anybody who joins is immediately removed, and a couple other things like that. You can censor words, so if somebody says a censored word, they're removed. And the smart thing I did was I went and got a list of all the working raid bots and plugged them in, one at a time, into my censors list. So anytime anybody added a bot: removed. It completely prevented us from getting raided.
This was cool, to him at least, to have a sort of super-powered Kik client to do
extra stuff that other people couldn't. But it fascinated him to the point where he wanted to
know what other modded clients you could get with other features. And he eventually found one that let him send XML stanzas to other users.
Let me explain.
See, Kik uses a protocol called XMPP,
which sends messages from one user to another.
XMPP uses XML to encode and format the actual messages being sent.
So just think of XML as this language that each side of the chat
app understands, and it's how communication happens in the app. Well, Azrael's Kik client
had the ability to modify the way that XML message looked when sending a message to someone.
He could essentially inject a stanza of code, which is just like a snippet of code, into the chat,
and then this would be executed by other Kik clients that are in that chat room.
When another user would see that stanza, their client would have to process that code,
which would sometimes make their chat app do weird things.
So you could join a chat, send a stanza through the XML system,
and it was like, oh, this user was just given admin.
And then you have admin ability. Whoa, to get admin ability to any chat room you wanted?
That's crazy. You can just take over any chat room anytime. That was patched. So that isn't
a thing anymore. Okay, good. Glad that was fixed. People shouldn't be able to just take over chat rooms.
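To make the stanza idea concrete, here's a small sketch of building an XMPP-style group-chat message with Python's standard library. The element and attribute names are generic XMPP, and the group address is made up; Kik's real wire format is proprietary and not documented here.

```python
import xml.etree.ElementTree as ET

# Build an illustrative XMPP-style group-chat message stanza. These are
# generic XMPP element/attribute names, not Kik's actual wire format,
# and the group address below is a hypothetical example.
msg = ET.Element("message", {
    "type": "groupchat",
    "to": "example_group@groups.example.com",  # hypothetical group JID
    "id": "abc123",                            # client-chosen message id
})
body = ET.SubElement(msg, "body")
body.text = "hello room"

stanza = ET.tostring(msg, encoding="unicode")
print(stanza)
```

A modded client that can emit arbitrary stanzas like this, rather than only the ones the official app's UI produces, is what opened the door to tricks like the admin-granting stanza described above.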
But we're starting to see Azrael's path take shape here.
He's exploiting the Kik app to defend chat rooms.
But can you guess where this is going?
It doesn't quite seem dark yet, right?
It just kind of seems like a Nerf fight, like, oh, I've got a chat room and I'm trying to defend my chat room. It doesn't seem that big of a deal.
It really wasn't.
So where does, how does this get worse?
So I started noticing other chats were getting raided, and like, there were clans, and these clans had a large number of exploits. And I was thinking at the time, if I join one of these clans, not only will I have all of their exploits, but I'll be able to learn to the point where maybe I can pivot off Kik and help people there. Or I'll just have enough exploits that nobody can touch my chat room.
There's something funny about the way you say it, for someone just listening to it for the first time. You say clans and, I don't know, raids. It sounds very childish.
I understand.
No, all right: clan, here's what we're gonna do. We're gonna get together and we're gonna raid this channel.
Yeah, basically. The clan I joined at the time was called Raindrop, and they were focused on just, like, anti-toxicity, you know what I mean? So if one of our spies joined a chat and the chat was toxic, and like they would, say,
share somebody's pictures that didn't want them shared or the admin was just really rude and mean
and kicked people for no reason, we would go and take the chat. And so while Azriel used to defend chat rooms
from takeovers and spammers,
he's now become one of the spammers and raiders.
Now, the tactic to take over a channel
is somewhat interesting.
First, a bunch of people join from his clan
and then they start spamming like crazy,
posting pictures, text, whatever.
And it just becomes a constant stream
of scrolling information.
You really can't read anything when you're in that room.
People with slow internet connections might get lagged out.
People with notifications on might just get bells ringing constantly.
And yeah, the admins could kick people, but the raiders knew who the admins were and would
target them specifically with crazy stanzas and all kinds of private messages, XML that's
all buggy, and things to just make their app freeze.
Once the admins just start falling out of the chat room
and they can get enough people in that chat room,
they can take it over.
So this is the kind of stuff Azrael starts doing
and was having fun with it.
So while all this is going on in Kik,
Doctor is still over there in some other chat rooms
fulfilling people's
desires to see certain porn.
I'm not quite sure I understand this whole trading, like the whole reason why anyone would want to trade. But I don't think I need to really get into it, because it just seems like, oh, you want to masturbate? Here, I'll give you some stuff. And it's just weird to help people masturbate, but okay.
Yeah, it is weird.
And when you put it like that, it is very, very weird. I found it kind of like oddly fun in a way,
just because it was like more in the challenge
because I was very good at trading, very good at getting a lot of stuff.
So very quickly, I had more than 2,000, 3,000 items in my library.
Now, keep in mind, Kik is only a mobile app.
So everything these guys are doing is 100% on their phone.
So when someone requested some porn, he'd have to go looking through thousands of things on his phone to get a good one and send it.
Yeah. And the part that made me good at trading was also that not only did I have that many,
and I got more, I got over 8,000, but I was able to pretty much memorize them. So I had watched most of what I got
in a sort of objective manner.
So I knew pretty much what was in it.
So if people asked something,
I knew, okay, do I have that or do I not have that?
Do I have something that falls close to those criteria?
And approximately when did I get it
so that I can find it in relation to others?
And then I could scroll down and find it.
So the challenge became people ask for something.
And I had like this little calling sign that I posted as the Doctor's: so if you have an itch that needs scratching, I've got something like that.
Well, to a degree, I think it's a big part of it. Why people are into it, into trading, is not just the whole kink thing, but power, really.
i think it's the feeling of power that you get from being able to give people what they want always and that people recognize that.
Like when I started out, people didn't know me at all.
But very quickly, a lot of people started noticing that I knew my business.
And a lot of people started saying,
if anyone has it, it's Doc.
Getting recognized like that
is a powerful feeling.
And to be honest,
it's a drug in a lot of ways.
It's a weirdly good feeling.
How much time are you spending on this?
Way too much.
Way, way too much time on it.
This was during the first lockdown around where I lived.
So I had nothing but time.
Like it was one of the first things I did when I woke up was see,
do I have any requests? Anyone who's personally messaged me?
And then as I would go about my morning routine,
get a cup of coffee and stuff,
I would start looking through what they wanted,
start finding some stuff.
And I would go all day pretty much.
Like I might be playing, playing games on the side,
maybe even chatting to some people online,
but I would pretty much be on my phone the entire day.
Okay. So everyone has their own addiction. I'm not going to judge. That's just what Doc was into.
But still, you might be asking,
these are supposed to be true stories from the dark side of the internet.
Raiding chat rooms and looking at porn is not that dark.
Well, thanks for hanging out with me this long,
setting up the story,
because everything is about to get really dark from here.
This is the last chance to turn back.
After the break, it gets dark.
This episode is sponsored by SpyCloud. You might be surprised by just how much stolen identity data criminals have at their disposal, from credentials to cookies to PII.
Knowing what's putting you and your organization at risk
and what to remediate is critical for protecting you
and your users from account takeover,
session hijacking, and ransomware.
SpyCloud exists to disrupt cybercrime,
with a mission to end criminals' ability to profit from stolen data.
With SpyCloud, a leader in identity threat protection,
you're never in the dark about your company's exposure from third-party breaches,
successful phishes, or infostealer infections.
Get your free Darknet exposure report at spycloud.com slash darknetdiaries.
The website is spycloud.com slash darknetdiaries.
Doc had been on a massive porn trading binge,
but eventually got a request that he didn't know how to fulfill.
People sometimes started asking for what they generally just called CP.
And at first I didn't know, and as soon as you asked what's CP, nobody would say, and they would just close the chat and block you.
But after a while, I figured out CP stood for child pornography.
And sometimes people also just referred to it as underage, which was easier to understand at first, at least.
But CP was kind of like the code word
that people used, the shorthand.
And if you didn't know what it was,
you weren't supposed to know, so to speak.
And those items had a lot more value.
Those were the ones that some people really wanted
when they wanted to test what kind of collection you had
and to see like, oh, you maybe have these things and these things
and then they start talking about age slowly,
like what are the age of the actresses?
And, oh, do you have a little bit younger?
How about a little bit younger?
And a little bit younger.
And it became really creepy very, very quickly.
And I couldn't provide, and I wouldn't provide,
because I had some that looked very young,
but I knew that those were porn actresses
that were like 22 plus, something like that, that I had looked up to make sure.
But then I started getting some items that I honestly didn't know how young they were.
And people started valuing those items more in certain groups.
And age was all of a sudden a huge value factor,
so that the lower the age would get, the higher the value item,
no matter if it was picture or video.
Like, for comparison, a picture of an underage girl, a lot of people would value as the same value as five to ten normal porn videos.
Now, while he's trading porn to people all day, every day,
he's still collecting tons of porn on his phone.
And maybe I was trading with someone and then they start sending some.
And then I have to say, oh, no CP, dude.
But I would still get these things and see them.
He was shocked that people had this type of content.
At first, some of the stuff I saw made me horrified.
Like, who would even keep this around?
And why would people send me this when I didn't even ask for it?
Then after he saw this a few more times,
the shock wore off.
And after he saw it a few more times after that,
he'd save that photo or video to his phone,
still not giving it to others when they asked,
but keeping it because he was a collector.
And over time, I got jaded. So I really didn't see the age as much anymore.
And at some point, I realized that, hey, I have access to a lot of these. A lot of people are
asking for it. Well, sure. Like the power got to my head
and I was like, okay, sure, I can distribute
some of it. But then I set a limit.
I would not go below this age.
What was the age?
The age was
14, I think.
Nothing below 14 years old.
Yeah.
And again, as soon as I accepted that, it was just a downward spiral from there, really.
After a while, some of the stuff I just stopped caring.
And I just saw the value of the item itself not what was actually in it
And it sounds horrible, I know. And it felt horrible when I realized it later. But at the time, I didn't realize what I was doing and how bad it was.
Do you have any idea how old the people were that you were trading these with?
There was a wide range. Like, the youngest people I talked to were around 10 or 11. I never gave them any porn, but a lot of people were trying to. People who were child molesters, they were trying to get them to make porn.
And that's usually why I got in touch with them.
It was not to get porn from them,
but because I was an admin in the same room,
so they would come to me and say,
hey, this person is asking for nudes of me.
And I would ban them, the people who are asking for nudes.
Oh, right.
This app is targeted towards teens.
According to Kik, one in three American teenagers use Kik,
which means there's probably a lot of preteens on there too.
That's a really bad mix.
When you have preteen children in the same chat room as people looking for child porn.
This is ugly.
This is really ugly.
And it can't end well.
And the oldest I talked to, I think, were in their mid-60s.
What the hell is a 60-year-old even doing in this chat app?
Oh yeah, I know why.
Because porn, and child porn, is prolific on Kik.
I was in a lot of rooms.
When I was the most active, I was active in like 30 or so rooms.
30 porn rooms?
If not more.
See what I mean?
I hate to say Kik just allows porn,
without knowing exactly what Kik is doing to stop this,
but it's incredibly easy to find porn in the app.
I mean, Doctor says he was in 30 rooms himself,
so you can see that this isn't just a few bad people,
or just a small issue.
There are a lot of people on Kik solely there for the porn.
And not only that, but many of these rooms are actively posting child porn.
How vast or how big is the child porn exchange system going on on Kik?
Never ending.
Like, if you use the right text to search for, when you're searching for the private groups, and not even just the private groups, the public groups, you can find so many groups. It's a very active scene, unfortunately.
So the moderators of Kik, I mean, obviously there's moderators in each room, but isn't Kik at all moderating any of this stuff?
Honestly, I don't know.
I think they try to, to some degree. They have some form of artificial intelligence
running through some of the rooms,
looking for some things,
trying to recognize some patterns
and see if these and these things are posted,
then they will close down the rooms,
maybe ban the users
and make sure the content is no longer viewable.
How do you know that they're running that?
Because some people I've traded with later on, if I look at the chat history with them,
I can go back and see, oh, these things are no longer available. And
sometimes the rooms get shut down.
Okay, I had to research this and this is what I found.
Kik has a few methods to combat child pornography. In 2015, Kik partnered with Microsoft to test
Microsoft's photo DNA service. Apparently what Microsoft has is a database of known
child porn image hashes. Now, hashes aren't the images themselves. It's just a string of text, like a fingerprint of the file. And if you have the hash,
you can't generate the image from that. But if you have the image, you can quickly generate a hash of
that image. So Kik was somehow using this photo DNA technology to scan their images to see if
there was a known child porn image on
their site. But I'm not sure exactly how they did that or what they did when they found a match,
or even if this photo DNA service is still in use anymore. The last time I heard about it was in
2015. In Doc's case, who started using Kik in 2020, he didn't see any evidence that would suggest that this is an always-on filter. There were child
porn images posted all the time to chat rooms and private messages, and those rooms would stay up and
online forever. And if you scroll back through the history in those chat rooms, you could see child
porn that was posted long ago. From what I can tell, it only starts looking in the rooms and looking at
individual people if they are reported for something. So if a user reports another user
or chat room, then some kind of anti-child porn scan triggers and may remove content.
And I guess from Kik's point of view, there are probably millions of images posted every minute in safe and clean chat rooms.
So maybe it's just too expensive or too hard to scan every image to see if it's a known child porn image.
So my guess is they're just not very thorough with trying to combat this.
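To make the fingerprint idea concrete, here's a minimal sketch in Python. Note the hedge: PhotoDNA itself is a proprietary perceptual hash designed to survive resizing and re-encoding, while this sketch uses an ordinary cryptographic hash (SHA-256), which only catches exact copies. So this illustrates the one-way fingerprint concept the narration describes, not PhotoDNA's actual algorithm, and the function names (`fingerprint`, `register_known`, `is_known`) are made up for the example.

```python
import hashlib

# A hash is a short fingerprint of a file: easy to compute from the bytes,
# but you can't reconstruct the bytes from the hash.
def fingerprint(image_bytes: bytes) -> str:
    return hashlib.sha256(image_bytes).hexdigest()

# A database of known-bad fingerprints stores only hashes, never the images.
known_hashes: set[str] = set()

def register_known(image_bytes: bytes) -> None:
    known_hashes.add(fingerprint(image_bytes))

def is_known(image_bytes: bytes) -> bool:
    # "Scanning" an upload just means hashing it and checking for a match.
    return fingerprint(image_bytes) in known_hashes

register_known(b"example image bytes")
print(is_known(b"example image bytes"))   # True: an exact copy matches
print(is_known(b"example image bytes!"))  # False: any change alters the hash
```

This is why a hash database can be shared with platforms without anyone redistributing the illegal material itself: the hashes identify known files but can't be turned back into them.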
But I don't want to just take one guy's opinion about what he saw on Kik.
I want to do my own research on this and see if there's other evidence I can find about child porn on Kik.
Of course, I'm not going to go on the app
and look for it myself.
Uh-uh, I'm staying far away from that for sure.
But if I search the internet for Kik and child porn,
news stories just start jumping out at me.
A Calabash man is accused of using an app
to receive and send sexually explicit photos
and videos of minors. According
to arrest warrants, 19-year-old Benjamin Lindsay used Kik, that's a messenger app,
to distribute the images. The unknown children are between the ages of 7 and 12. The court
documents say Joshua Richard Harrison confessed to posting explicit photos of children through
the app Kik. Milwaukee County Judge Brett Blomme's Instagram page
is filled with personal and professional highlights on his way to the judiciary.
But after a search in Cottage Grove of one of his two homes and his arrest,
authorities say his computer activity on the app Kik links him to child porn.
With uploaded images and videos consistent with child pornography through the Kik app on 27 occasions last fall.
Judge Blomme has been on Milwaukee County Circuit Court bench since August.
His assignment, children's court.
As a bond condition, he's now banned by a court from having unsupervised contact with minors. The judge leaves this county's criminal justice system
knowing he will return not to dispense justice,
but to potentially have to accept its consequences.
The news articles are continuous.
There was a guy in Tucson, Arizona,
who was arrested for posting child porn to a chat room in Kik.
There was a guy in New Jersey who was arrested
for trying to have sex with an underage girl on Kik.
In 2020, a New York man was sentenced to 11 years in prison for possessing 1,500 images and
150 videos of pornographic material involving minors that was distributed on Kik. In 2020,
an Ohio man was guilty for distributing child porn over Kik and got 20 years in prison. And
there are so many more cases. In fact, I'm looking at a BBC article now, which is simply titled,
Kik chat app involved in 1,100 child abuse cases.
So just as a cursory glance, Kik has a very big problem with child abuse.
And yes, this is child abuse.
In fact, the common term I've seen used to describe this stuff is CSAM, spelled C-S-A-M, which stands for Child Sexual Abuse Material.
And it's simply defined as any sexually explicit image or video of anyone under 18 years old.
This can be a very traumatic experience for kids.
And the trauma comes right back any time they see the image again.
There are organizations out there fighting hard to stop the spread of this.
Now, before we get into those organizations,
I want to know what Kik is doing about this.
So I reached out.
I started where I normally reach out to anyone, which is Twitter.
There's a Kik account on Twitter.
It has over 300,000 followers.
But looking at that account, they haven't posted a single tweet since October
2019, over a year and a half ago, which is the same month that Media Lab bought them.
However, in their bio, there's a link for help and I need help for sure. So I click the link,
which is help.kik.com. The first time I clicked it, I got a 404 page not found,
but then I clicked it again and it sent me to a Zendesk portal that doesn't exist.
The site just says, oops, that help center no longer exists.
Okay, so Kik has no social media anymore, and their help portal is just abandoned.
They just seem to have completely canceled their Zendesk service
and didn't bother to update the URL in their Twitter bio.
All right, so next, let's see if I can get help in the app. And while using the app, I did find a
way to message the Kik team directly, which is a perfect way to get in-app help. So I message them,
saying, I need help. And the Kik team replies, ahoy there. I say, there seems to be a child porn
problem here. What are you doing to stop this? And the reply was,
if you believe the message you received
contains illegal content,
please contact your local law enforcement agency.
You can also report this to our support team.
And then they give me a link for support.
But the link they give me is help.kik.com,
which is the same link in their Twitter bio,
the link that's been dead for years.
So I tell them that
link doesn't work. And they reply to me, follow us on Twitter, twitter.com slash kik. And I'm like,
that Twitter account has been inactive for years. And then they reply, howdy. At that point I realized
I was talking with a bot and was getting nowhere and I was getting more frustrated. I really want someone from Kik
to talk with me. So I looked for another way to contact them. I go to kik.com and click the
contact button. There are two email addresses there. One is kiksupport at medialab.la. The other
is kiksafety at medialab.la. The .la threw me off because the company's name is Media Lab AI, but okay, whatever. I'll
try these emails. I email Kik Support asking to speak with anyone there about this. I get a
generic response. Thanks for emailing us. We're experiencing unusually high email volume and we'll
get back to you whenever we can. They never got back to me and it's now been three months. I sent
an email to Kik Safety. No reply. Not even a confirmation that they got my
email. And yeah, that's also been three months of waiting with nothing from them either.
I go back to their website. Who's running Kik, I wonder. There's a link at the bottom of the page
called Safety Center. I click it. It has lots of information on being safe in the app. But then,
at the bottom of the page are people's faces.
Meet our safety advisory board. And there's a picture of Anne, Brooke, Justin, and Hemu.
These are their advisors. And Brooke is specifically advising them on child exploitation
problems. Perfect. So I message her on Twitter. No response. I message her on LinkedIn. No response.
I give up on her and start going down the line.
Anne is another advisor.
I send her a tweet.
She does reply, but she tells me she hasn't worked on the Kik advisory board in years.
I also see Justin's on the board.
I tweet at him, but he replies saying, nope, I haven't been on the board for years either.
Okay, so this entire advisory board, which is posted publicly on their website, is no longer valid.
I'm starting to think they don't have any advisors at this point and have not updated their website in years.
It feels to me like Media Lab has abandoned Kik.
Their website is defunct. Their Twitter is defunct. There's no phone number for them because there's probably no one to answer the phone.
Nobody appears to be home at Kik.
But according to the Google Play Store, the app is still getting updates.
About once a week, it'll have some kind of update.
So someone is clearly back there doing something.
But I'm telling you, after trying for months and months and months to get a hold of someone,
anyone, I was completely unsuccessful.
As far as I can tell, nobody is there. So once again, Media Lab is who owns Kik. They took it
over when Kik was facing those SEC legal troubles, and they knew it was losing money like crazy,
but they're trying to make it profitable with ads. My theory is that it's not profitable,
or Media Lab is just really trying hard to cut corners as much as they can by just not staffing effectively and not taking child pornography problems seriously.
And they only do the bare minimum just to maintain the app.
To get more help, I called up my friend, Caitlin.
Hello.
Hello.
Hey, how are you?
Great. Thanks so much for taking my call.
Oh, no problem.
This is Caitlin Bowden. She's badass. In fact, she's so badass, she started something called the Badass Army, which stands for Battling Against Demeaning and Abusive Selfie Sharing. Basically, if someone takes your nude photos and posts them publicly without your consent, that's a problem. It's devastating. And Caitlin has been helping victims of revenge porn for a while now by doing things like helping people get their images removed from the internet.
Yes.
Have you ever had to try to get images removed off of Kik?
... worked with, because there are so many different ways that these images are getting shared.
And Kik is one of the platforms that people feel more anonymous on.
So therefore, they're more likely to do this sort of thing.
And, you know, there was in the beginning, you know, it was almost impossible to get
anything removed from Kik.
And then for like a few months, right in the middle,
I want to say it was, you know, toward the end of 2018 or so,
they were really great about responding,
but it didn't last really for very long at all.
And suddenly everything just was going unanswered again.
So when you say responding, you say, you know,
this is a non-consensual
posting of a nude photograph. Can you please remove it? And they would remove it?
Yes, they would. Well, they would shut down the, you know, chat room that was,
the picture was being shared in, or they would shut down the user's account that was sharing the image. Or if law enforcement was contacting them, they would, you know, respond with the
information that they were asked. And, you know, with Badass, we normally would use a DMCA.
So there was, you know, the copyright violation, and they just would shut
down what needed to be shut down. It would remove the image.
Okay. So at some point they stopped removing images for you.
They stopped responding at all. There wasn't even an easy way to get a hold of them. Suddenly the email that we had been sending our DMCAs to was no longer active. For a while, there was just no response
at all. And then eventually things were getting bounced back. And then, you know, we found other
emails for the company that had bought them and still just nothing. It was like we were
just throwing these emails out there and nobody was reading them.
So that was a while ago. I mean, here we are in 2021.
Do you think these images are still up?
Well, actually, you know, when you had brought up the subject of Kik with me, I was just curious. There was an account that I had used to infiltrate
these picture sharing groups. I haven't used it in a while. I haven't checked it in a while.
You know, but it got my curiosity going. And there was a court case about a year back where,
you know, the creep that had been sharing the images was found guilty and sentenced. And part of my job with Badass was helping the victim clean up the mess now that it was no longer needed as evidence. So out of curiosity, I went to that old account and I know that I had sent
out DMCAs to get these images removed. And then I just kind of, well, I was busy and I never really
double checked and triple checked to make sure they were removed. And much to my dismay, they're still up.
And these images are, you know, found to have been illegal. It's not like this is, you know, just a copyright violation
or anything like that in their eyes.
These are images of a teenager that are still up and available.
Well, specifically under 18.
Yes, specifically under 18.
This is really frustrating me.
Kik has completely stopped removing revenge porn photos too?
Or not even respecting DMCA takedown requests?
What the hell, Kik?
Well, the issue is that if Kik knows that these images are out there,
they then are obligated to clean them up.
And if they don't clean them up and they know about it, then they're in big trouble.
So what they're doing right now is they're turning a blind eye to it.
They're ostriching.
They're sticking their head in the sand and pretending this isn't a problem.
Yeah, good point there.
If Kik doesn't
know this is going on in their chat app, that's abhorrent. And if they do know and they're not
doing anything about it, that's also abhorrent. There's no explaining the situation at this point.
The issue is that Kik doesn't moderate. Okay, so there's nobody sitting there watching any of this
stuff happen. Unless they are being made aware of the issue, they don't know what's happening.
And they're in big trouble for not moderating at all because they're allowing these illegal images to still be on their platform. But the issue is if they know about it and they still don't do
anything, then they're in much bigger trouble because then they're knowingly doing it.
What's so frustrating about all this to me is that Kik doesn't seem to be held
responsible for any of the problems that they're allowing to happen.
That is actually kind of a safe haven.
And I'm going to say something that's going to be extremely unpopular because,
and even I don't like saying it, and that is that, you know, Section 230.
It makes it so that the platform itself is not responsible for what the users do on it.
If a user is committing a crime with the platform, the platform is not liable.
And until a method is found to, you know, incentivize platforms taking the initiative to delete these images and to make sure that their platforms are moderated, then there's no reason that they feel they need to do it.
I'm getting really frustrated now.
Like my palms are actually sweating.
What does a person do when they get this frustrated?
That's it.
I'm calling the authorities.
Hi, Jack.
How are you?
Hi. Thanks for taking my call.
So let's start out. Who are you and what do you do?
My name is J.P. Rigo.
I serve as a special agent for the Ohio Attorney General Dave Yost's Bureau of Criminal Investigation.
I've been an agent since 2012.
Okay. So I have a crime to report.
Oh, wait, you may have to talk to the police department for that, not me, but go ahead.
Okay, well, there has been quite a lot of child sexual abuse material on Kik recently, and I don't think Kik is moderating this.
We've run into that. It probably takes up a significant portion of not just, I think,
the time of the agents that I serve with, but fairly common that we are, in a sense, tackling these.
But as you know, it's the World Wide Web.
I mean, it's huge.
It's well beyond what any agency themselves can take on.
JP went on to explain to me that whenever they receive reports of people trading child porn,
they take it very seriously and investigate the person and make arrests whenever they find
evidence. The reason why I called him is because he's dealt with this exact case a bunch of times,
like he's dealt with child porn issues in the past and arrested people. So it's true. They do
take this seriously. And as we heard earlier, there's no shortage of news stories about people
being arrested for trading child porn on Kik, which is good.
Those people should be stopped.
But I feel like the common part of this story is that Kik is seemingly permitting this type of activity.
And is there some kind of legal trouble that they can get into for allowing this?
That's a good question.
I don't know. It's probably a little bit too big of a, I want to say, something for me to put my hands around. I would need to really pull in some minds about the work of making this a safer place, not just for kids, but for all of its users.
But I don't know that I could even speculate how that would happen.
But when I hear you say, what can we do? Just keep reporting. I mean,
I know NCMEC has been a fantastic organization. In a sense, many times you see these reports of
the arrests and whatnot, but many times it starts from them because, you know, in their history too,
they were, and they still are about the business of trying to protect and serve our communities in the sense of, you know, no, they're not law enforcement,
but they have that drive to see something resolved.
OK, so his suggestion is to report this to NCMEC, which is spelled N-C-M-E-C. This is the National
Center for Missing and Exploited Children, and they run something
called the Cyber Tip Line. The Cyber Tip Line was created by Congress to process reports of child
sexual exploitation, and then take these reports and help enrich those databases of known child
porn hashes, as well as reporting it to law enforcement. This is probably where that photo
DNA service gets its hashes from. The CyberTipLine
puts out yearly reports on what it sees. And in 2020, it received 21 million reports of child
sexual abuse material. Wow, that's a lot. That poor person who has to go through all those reports
must have no hope for humanity. But I'm digging into this report to learn more.
There's two groups of people who report to the CyberTipLine. The first is just regular people
like you and me. If you see something, you can say something and tell the CyberTipLine.
But in 2020, there were only 300,000 reports by the public. So where'd those other 20 million
cases come from? The companies themselves who saw the explicit material on their platform self-reported.
So for instance, in 2020, Facebook, the company, reported 20 million times that they saw child porn on their site.
Google reported that they saw it 500,000 times.
Imgur.com also reported 31,000 instances of child porn to the CyberTip line.
So how many times did Kik report
child porn to the CyberTip line in 2020? Well, if I search for Kik in the report, it's not present.
But if I search for their parent company, Media Lab, it is there and it says they reported 14,000
times to the CyberTip line in 2020. But it's not clear if that was for Yik Yak, Whisper, or Kik,
since they own all these apps. I asked NCMEC to clarify how many were for Kik specifically,
but they refused to provide extra detail. They're super busy anyway. So it seems like Media Lab is
doing something, but it just seems like the bare minimum here. Like, just enough to stay out of
trouble or to be able to
say in court, we have filters in place and are actively reporting things to the cyber tip line.
But they can do so much more. They're letting so much go through without any consequences to the
users trading it. But are they in violation of any laws for permitting this activity?
That part, I don't know. I'm not a lawyer. But I do wonder about COPPA laws. This is
the Children's Online Privacy Protection Act, and it was put in place to safeguard the data of children
who are under 13. Now, in the terms of service on Kik, it explicitly says that anyone who is under
13 is not allowed to use the service. But is that good enough? I mean, we clearly know there are kids under 13 on Kik,
but does Kik know that too? So in 2019, TikTok was under scrutiny for violating COPPA laws.
They were illegally collecting personal information from children under 13,
and they agreed to settle a lawsuit and paid $5.7 million as a result of this allegation.
Surely, if TikTok has been found to violate
COPPA laws, then Kik must certainly be violating them too, right? So how do you get the FTC to
create a suit against someone? I don't know. I suppose you'd have to start with evidence,
screenshots, pictures, videos, chat messages. But there's no way I'm going in there and collecting
that. You kidding me? I know I
don't have what it takes. So I think what we have to rely on is some watchdog group or government
entity seeing the prolific problem that Kik is allowing and trying to bring them to court to
prove what, though? That they're negligent at fighting child porn on their app, or even
obligated to fight it at all? Section 230 says the app
itself isn't responsible for something that the user does which might be illegal. It might be
really hard to prove anything here. So if a group brings this to court, good luck. But that wouldn't
be the first time Kik would face legal problems. You already heard about their SEC lawsuit. Well,
they lost that case, which resulted in Kik having to pay a $5 million fine. Or I guess the Kin Foundation had to pay that. But even before that
lawsuit, the parent company to BlackBerry issued a lawsuit against Kik, saying they were infringing
upon some BlackBerry copyright. Kik settled that lawsuit and paid an unknown amount. So you can see
that they haven't been the cleanest company legal-wise. But that's just with
Kik Interactive. What about Media Lab AI, the company that owns Kik now? Well, looking at Pacer
records, I do see a handful of lawsuits against Media Lab. One was a teenager who claims she was
sexually harassed on Kik, and she was suing simply because the app didn't warn parents clearly
enough that pedophiles were active on this app.
And that case got dismissed. There was another lawsuit. It looks like Worldstar Hip Hop is saying that Media Lab published a copyrighted video. That one's still going on. But there's
one more that's pretty interesting. Media Lab owns Whisper, which is a chat app. But I guess
you can share secrets with others anonymously. Apparently, in March 2020, a security researcher found that the Whisper database was sitting right there on the Internet without a password.
This left a lot of people exposed.
Age and locations of users were leaked.
And if you looked at this database, there were 1.3 million users on Whisper who were 15 and under.
And in total, there were 900 million user records in this database breach. So this resulted in a class action lawsuit where people were saying
they were suffering damages from getting their data exposed like this. Should have been more
secure. Media Lab did reach a settlement agreement and paid the victims for this, but we don't know
how much was in that settlement. The victims were seeking $5 million, but it just doesn't say what was agreed on.
Oh, and if you read the app reviews for Whisper, it also looks like that app is not doing well
either. Like, it seems like there's lots of scammers on the app now, and maybe even prostitutes,
and users are reporting there's just not good moderation taking place there.
So yeah, it doesn't seem like Media Lab is the cleanest company legal-wise either,
and apparently not the most secure because of their Whisper database breach.
But maybe there's the Parler option.
Remember Parler? That right-wing social media app?
Well, after the insurrection on the US Capitol in January of this year,
Google and Apple told Parler they must moderate their content or
they'll be removed from the app stores. That was the line they drew. They had to moderate what was
going on in the app. Parler didn't moderate their content, and so they got kicked out. Perhaps Google
or Apple should do the same with Kik. Say, hey, you've got a real bad child porn problem going on
here, and you're not working hard enough to fix it. Either moderate or get kicked out. And by the way,
this wouldn't be the first time Kik would be kicked out of an app store.
They were first kicked out of the BlackBerry app store after their lawsuit with BlackBerry.
BlackBerry really didn't like them after that and they just removed them from the BlackBerry app
store. Kik was also removed from the Windows Mobile app store. This one looks to be more
voluntary though. Kik was just done updating their Windows version and they're just like,
we're just done with this. And so that's two app stores they're gone from, but they're still
present and available in two of the biggest, Apple and Google. And I think Google and Apple here have
a solid ground to throw Kik out of the app stores. I mean, it's pretty easy to go into Kik and see
for yourself all the child porn that exists in there. You might have to trade some child porn
in order to get access to all the rooms, but once you do, you'll be able to see it for yourself.
But is that going too far, ganging up on Kik and kicking them out? Yeah, it's hard to know for sure
what's right and wrong these days. But let's do a few thought experiments. Suppose this app only
had one use, which was to spread child porn. If 100% of the users were just there for that,
would anyone have a problem kicking them out of the app store then?
I don't think so.
So how many bad users does it take for you to think that they should be kicked out?
It's hard to say because it's likely that every chat app out there has a child porn problem.
It's just that this one probably has a bigger problem than most. Here's another thought experiment. What about darknet marketplaces like Silk Road? The
guy who started Silk Road, Ross Ulbricht, is in prison for life. Why though? Under Section 230,
he shouldn't have been in trouble for something that other users on the site were doing, right?
If two users came on Silk Road to buy and sell drugs, why is that Ross's problem? Well, probably because the whole purpose of Silk Road was to be
an illegal drug marketplace. But let's imagine a scenario where he ran Silk Road like Kik,
and instead of a drug marketplace, he just simply called it a marketplace.
Kik explicitly says no pornography is allowed. What if Silk Road said no illegal
behavior is allowed? And then what if Silk Road just turned a blind eye to all the illegal activity
that went on there and only banned people if someone complained that what they did was illegal?
What then? Because in a court, Ross could have clearly pointed to the rules saying no illegal activities allowed here and shown how he's banned users for breaking laws.
So yeah, what then?
Would Section 230 protect him?
Would that be enough to keep Ross Ulbricht out of prison, or even enough to keep the Silk
Road site up today?
I think so, because there's this site called armslist.com.
It's like Silk Road, but only for weapons.
And when you're buying weapons on the site, you can search for private sales,
where it's just you meeting someone in a parking lot somewhere and buying a weapon with cash.
And in Wisconsin in 2012, some crazy guy went on the site,
found a semi-automatic handgun for sale, specifically found one for sale by a private seller,
and met this guy in a McDonald's parking lot and bought the gun from him, and then went to where his wife
worked and shot her and two of her co-workers to death.
The wife's daughter sued ArmsList.com, saying that they purposely set the site up to allow
illegal gun sales, because you can filter for just private sellers who don't do background
checks and accept cash only.
The case went to the Wisconsin Supreme Court.
And guess what?
Section 230 gave Arms List total immunity and said they're not responsible for what users do on the site.
So online weapons marketplaces are legal even if they facilitate illegal purchases? And this just frustrates me that there's no legal responsibility
that app makers and website owners have to abide by when it comes to illegal activity going on in
their apps. Because right now, today, they seem to have immunity from this. One more thing to think
about. When a bar that serves alcohol has too many reports of disorderly conduct, they can get their
liquor license revoked.
Or if a bar doesn't prevent illegal activities going on inside it, they can also have their license revoked. But there isn't any clear rule like that with social media apps, apparently.
The only way to get your app shut down is if you explicitly allow illegal content,
or if one of the app stores decides you aren't following their rules. So it sounds like we're passing the moral and ethical responsibilities
to Google and Apple app stores here.
But that's putting a lot of burden on those app stores.
And there's no public oversight to whatever they decide.
And it becomes a little creepy when app stores overextend their power and control
and just shut down whatever they feel like.
Is it even possible
to have a world where the internet is open and ethical too? So let's look a little closer at
the Google Play App Store. Because remember, this app, Kik, only exists in the Apple App Store and
the Google Play App Store. There's no website version or desktop versions for this app.
There are clear guidelines that all apps must follow to be listed in the
Google Play app store. Here's what they say. Before submitting your app, ask yourself if
it's appropriate for Google Play and compliant with local laws. Our restricted content policies
cover a broad set of topics. Child endangerment is never acceptable on Google Play. If an account
is found to violate these policies, we'll take action,
including reporting the account to the appropriate authorities.
Okay, so if the app is endangering children, then it's not acceptable in the Google Play Store.
But that's not the intent or purpose of Kik. Any chat app can endanger a child.
So I guess it comes down to what the users are doing in the app?
User-generated content hosted in your app must meet certain requirements,
including the implementation and use of
content moderation and reporting systems.
Aha, there it is.
User-generated content must be moderated
or you can be kicked out.
So yeah, that's where I'm putting my finger,
right there, on that violation.
But I think it still would be pretty hard
to prove you aren't moderating.
Because what does that even look like or mean? Blocking curse words and chatroom names could
be considered moderating. And they do that right now. So what do they really need to do or change
here? And how would they be able to prove that to Google if Google were mandating they do this? It's really tricky stuff.
Also, don't just take my word for this.
Try this.
Google the word Kik, K-I-K.
Then click the News tab on Google so you see all news articles about Kik.
I guarantee you,
the first three,
maybe even 10 pages of results
are all about child porn on Kik.
If all the news articles about your company are
talking about child porn on your app, then that should be clear evidence that your company has a
very bad child porn problem. Other major publications have highlighted this problem too.
There's a Forbes article which is titled, The $1 Billion App Kik Can't Kick Its Child
Exploitation Problem. And that article was written in 2017, way before Media Lab even bought them.
So this isn't a new problem for Kik.
There's also a New York Times article, which is titled,
The Wildly Popular App Kik Offers Teenagers, and Predators, Anonymity.
And that article is about how two college guys coerced a 13-year-old on the app
to meet them in real life, and then they killed her.
If you Google other
chat apps like WhatsApp, WeChat, Signal, you simply don't see articles like this or articles
about child porn almost at all. So again, this is more proof of how rampant the child porn problem
has become on Kik. I want someone to investigate this further, to really know what's going on.
Someone who has the teeth, who can actually make
something change from all this. All I can do is underline and highlight that this is a problem,
but nothing is going to actually happen from me talking about it on this episode.
It'll be someone else who actually has power to do something that will actually make a change.
And that brings us back to Azrael.
Azrael is the hero you didn't know you needed.
Remember last we heard he was just getting into using modified Kik clients to take over chat rooms?
Yeah, well, he was doing good with that Kik clan, and then he joined a new clan called ZenSec.
Now, because he was the new guy,
they wanted him to prove himself. What they
wanted me to do was they wanted
me to either
bait somebody
into proving themselves,
proving to us that they're
a pedo, or
they wanted me to
grab a pedo's IP
and basically
get all their ISP information and all that stuff
Okay, whoa, whoa, whoa. This is the first time we've heard the word pedo here in this interview. So
was this the first time you were experiencing that there's pedophilia on...? It was the first time that I saw that pedophilia was a large-scale problem.
Yeah.
And ZenSec focused solely on the pedophilia problem.
Okay.
So when they said that to you, you must have been like, the what problem?
How am I supposed to find these guys?
Kind of, yeah.
I was like, all right, so how do I get a pedo to do this?
And at the time, I had a stanza that I plug my link into,
and it basically shows up on the receiving end as an invisible picture.
And then they're like, why isn't this loading? They click it, I have their IP, get all their ISP
information. So that's what I used. Basically, what happened was one of our other members came in and dropped an app of a known pedophile.
He dropped a what? An app? An at?
An at. So your profile.
Oh, so it's just a username.
Yep. Dropped their username.
And I was like, you know what? Fuck it. Not doing anything. Let's get this
dude. So I dropped the stanza and then I said something to make it so he would open the chat.
Okay. So a classic phishing scheme. He said something that he thought a pedophile would
be interested in to get them to read his message. But that stanza of code he put in the private chat would collect more information on this person if they read the message. It basically
tells the Kik client to reach out to a website for something. And when the user goes to that website,
Azrael can see what IP just went there. And the IP can tell them what city and location this user
is in. And while you can use a VPN with Kik, I've been told that Kik just doesn't work very well over VPNs.
Exactly. And let me tell you, dude, pedophiles will click anything.
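The mechanism described here can be sketched in a few lines of Python: a tiny web server that hands back an invisible one-pixel image and records the IP address of whoever fetches it. This is a generic illustration of how any image-based IP logger works, under my own assumptions, not Kik's actual stanza format, and names like `PixelHandler` are made up for the example.

```python
# Hypothetical sketch of an image-based IP logger: serve a 1x1
# transparent GIF and note the requester's address. Not Kik's API.
import http.server

# Smallest standard transparent GIF, served as the "invisible picture"
PIXEL_GIF = (
    b"GIF89a\x01\x00\x01\x00\x80\x00\x00\x00\x00\x00\xff\xff\xff"
    b"!\xf9\x04\x01\x00\x00\x00\x00"
    b",\x00\x00\x00\x00\x01\x00\x01\x00\x00"
    b"\x02\x02D\x01\x00;"
)

logged_ips = []  # every client that loads the image ends up here

class PixelHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        # Record the IP of whoever requested the image
        logged_ips.append(self.client_address[0])
        self.send_response(200)
        self.send_header("Content-Type", "image/gif")
        self.send_header("Content-Length", str(len(PIXEL_GIF)))
        self.end_headers()
        self.wfile.write(PIXEL_GIF)

    def log_message(self, *args):
        pass  # silence the default console logging
```

To run it you would pass the handler to `http.server.HTTPServer(("", 8080), PixelHandler).serve_forever()` and send out a link to the image; any IP that appears in `logged_ips` belongs to a client that loaded it.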
So he collects this extra information on the suspected pedophile and then reports all of what he finds to the admins of this new clan that he's trying to get into.
And the admin's like, oh, you didn't even tell us you were doing that. That's great. You're one of us. So he was now in a Kik clan that specifically targets pedophiles and child porn
traders. When I say pedophile, especially on Kik, I mean people that are sharing straight-up mega links to terabytes of CP.
And it's just, it literally makes me sick to my stomach to think that this is out there.
You know what I mean?
People are exploiting children.
And people are paying for it and sharing it on a child's app.
I've been the guest of honor in this clan. I got to go in and check it out. When the clan is ready to do a raid, they know exactly what channel to hit. Then the people in the clan will get their souped-up versions of Kik and go raid different chats in an attempt to take over the
chat room and close it down. They all pile into a channel and then start spamming it. So there's
two different types of spam codes. There's a DoS attack code, which is literally just as much as you can pack into a single chat bubble.
And then there is crash codes.
Crash codes rely on actual zero day exploits within the Kik system, which will force close
your app if you look at the code.
Now, they could use some tricks to try to take over the chat and close it down, or they
could just try scaring people.
We rely honestly heavily on the fear factor.
If somebody comes in and just spams your room and you don't know what's happening and their account looks scary and the bot they're using looks scary, you're just going to assume that person is scary.
And if you are forced to exit the chat,
how did they do that?
How am I going to join back?
What happens if I join back?
You know what I mean?
So you've gone in and spammed lots of pedo rooms.
How many do you think your crew has spammed at this point or raided?
We have hundreds.
I personally have only raided 10 or so myself.
I prefer phishing at this point, taking accounts that way.
Because if I log into an account, I am taking that account out of play.
If he owns rooms, I can purge them.
If he's victimizing a child, I've cut contact.
You know what I mean?
But we have hundreds of rooms that people who are better at spamming and attacking have taken.
I want to remind you that all this is done on mobile devices
since Kik is just a mobile app.
How many phones do you have?
One.
You're doing all this on one phone?
Yes.
With multiple Kik apps installed?
I have, let me count my Kik apps for you.
One, two, three, four, five, six, seven.
I have seven Kik accounts on this one phone.
So this must be, each is a different APK,
because I don't think you can install the same APK twice.
Yes.
Yes.
So I have seven different APKs.
So you have to multitask on your phone, like, let's go to all these different Kik apps. And, oh man, that must be an evening. It's a lot to do all at once. And every single time I raid, my heart is racing. It's such a rush, because you're doing so much all at once, and if you fuck up one little thing, your raid stops. Now Doc, the
guy trading this stuff on Kik, has seen these raids in his chat rooms. Yeah, multiple times. And
of course he finds it annoying. Usually it's not that effective, but I have seen sometimes where one person comes in, starts talking, being nice, and then they realize there's child pornography in this room. And they start sending very, very hateful messages,
almost as if a bot takes over,
because they send it to a lot of people very, very quickly,
and then they start spamming messages,
often just some long, copy-paste message they send over and over and over again.
So they get banned by the admin.
But then later, they join again.
They shouldn't be able to join because they're banned, but they do.
And then they start spamming again.
And then another person comes in and starts spamming.
And then another person comes in and starts spamming.
And then there's usually three or four people at least that keep spamming.
And no matter what you do as an administrator, you cannot get them out.
They keep being able to join for some reason.
And then people start leaving because it's annoying.
Because even if you block this person yourself,
you will still see that a message is being hidden from you.
So you will still see this person has been blocked,
this person has been blocked,
filling up your screen over and over again.
So the rooms kind of die from that.
Has your account ever been crashed?
Does your app crash sometimes or hang
or you just can't type anything anymore?
My app didn't crash,
but I know that a lot of people's apps did.
I think I was just generally using a good phone.
People on iPhones usually had their apps crash.
What a weird battle this is.
In my clan right now, we have literally thousands of chats,
hundreds of accounts that need to be taken down. And there's only 20 of us.
We can only do so much. And I got to the point where I literally have nightmares about
a child being victimized because I didn't take out that chat or that account. I thought it was ridiculous to talk about
chat raiding clans at first. And I guess I still think it's ridiculous. Like, what does this even
matter, right? But now it looks like Azrael is a vigilante of some kind. He uses his raiding skills
for good. He cares about combating child porn so much that he feels like it's his duty to do something.
He can't just stop when he knows it's constantly going on and no one is stepping up to fix this
problem. Now, Azrael's clan has, of course, reported numerous people and chat rooms to Kik.
They say, for the most part, Kik does not take action. If they do take action, it's typically
months later and then they might see a ban or a closed channel. So they're doing their part by telling Kik about this problem.
Recently, I tried to, so I'll grab somebody's ISP information, run that through some tools
to get as much information as I possibly can. And then I will drop that to the Federal Bureau of Investigation's
Cybercrime Division, which is absolutely scary to me, because cybercrimes.
I'm firmly gray hat, so I could get myself in trouble.
Yeah.
I mean, what he's doing is phishing people and doxing them.
And he's using a hacked Kik client that goes against the terms of use on Kik, too.
I bet if Kik knew about the clan or Azrael, they would immediately ban them from the app
because they're breaking the terms of use, which is just ridiculous.
Yes.
The power they have can be used for something awful, but they're using these powers to fight something much more awful on Kik.
At a point, I kind of realized what I was doing and how bad it was.
And I, well, really, I freaked out and I didn't sleep for like two whole days.
And in my sleep deprived state, I sent an email to you,
which probably looked kind of panicked in a way.
And it was.
And I needed a voice of reason,
someone I trusted who knew sort of the inside of how different IT security
things work and who knew the potential of me having the context I did, but also the
danger of having the items.
And I got told by you that I should delete them.
Yeah, I said, why don't you just delete everything?
Uninstall the app and stop.
Yeah.
And in my power state, in the state of having that much power,
it was tough.
It was very, very hard to do because it felt wrong.
I had worked very, very hard to get more than 8,000 items,
items that I had cataloged in my head.
So I knew them by heart.
I knew so many things that I had all these things
and people looked up to me.
And I was being told to delete them,
to throw away all this power.
And it took me three more days and I talked to a friend of mine
whom I really, really trust about it as well.
And I ended up deleting it.
I deleted everything I had.
You think someone should just go shut this thing down like they did with Parler?
I want Kik to shut down, but I also want the police to actually
utilize what is right in front of them.
Kik is such an easy place to spot and catch predators,
but they aren't being caught in there.
Things are running rampant,
and they could use this tool at their disposal,
but they're not.
So while I want it closed,
I also want it to be exploited by the police, so to speak.
So you might be wondering,
maybe this is some sort of honeypot set up by the police. Well, that's actually happened a few times. There's a Forbes article
about a guy who was arrested for trading child porn on Kik. The police commandeered his phone
and got access to his Kik account, but they didn't delete it or shut it down. They made a deal with
this child porn trader to keep trading on Kik so they could collect information on other Kik pedophiles.
The guy made the deal and stayed on Kik trading child porn while the FBI watched over his shoulder.
It's weird and creepy as hell that this happened because this Forbes article says it's not clear
if the FBI actually caught anyone else from this operation. What's more is that Forbes asked Kik
about this operation,
and Kik had no idea there was an undercover operation going on in their chat app.
So this tells me Kik wasn't working with the FBI on this one.
So all this is to say that, yeah, while the police may be using the app
to do some sort of sting operations,
I don't think there's any major coordination between Kik and the FBI
to conduct lots of sting operations or anything.
And if this is some sort of honeypot, it's permitting quite a lot of vile stuff, which makes me wonder if it's doing more harm than good.
I absolutely do believe that Kik should be held to the same standards as every other social media platform, including Parler.
If they're refusing to moderate
and refusing to remove illegal content,
then they don't have a place among the app store
if that's the way that they want to do business.
I have really mixed emotions about that.
This is my hacker origin story. I came from Kik. Everything
I know came from Kik. And to have it shut down would hurt me. But after everything I've seen and
everything I've experienced, I'm scared that there's really no other way. And I also strongly dislike the idea of the powers that be just
flipping a switch. You know what I mean?
Well, no, I don't. Why? Why would that be a bad thing? I think it would shut down this whole system, and that'd be great. It would, but then that's a scary line to cross.
With Parler, I never got on Parler,
so I can't exactly say they didn't deserve it.
But I don't think anybody deserves it.
You know what I mean?
Nobody should be able to just flip a switch
and just end things
but
in terms of Kik,
I think that might be the only way.
Thank you, Azrael and Doc, for sharing your very personal stories with us.
Also, thank you to Caitlin Bowden and JP Rigo for lending your voice
and being part of this too. If you ever encounter child porn yourself, please report it to the
Cyber Tip Line, which you can find at cybertipline.org. If you're a listener all caught up and
can't wait for more episodes, then you must find this show valuable. So please consider donating
to the show on Patreon. This will tell me loud and clear that you love it and want more of it,
and it'll give me the means to keep it going.
So please head over to patreon.com slash darknetdiaries and show your support.
Thanks.
The show is made by me, the leader of the operators, Jack Rhysider.
Research and fact-checking by the disciple, Sean Summers.
Editing help this episode by the pack leader, Damienne.
And our theme music is by the Rust Devil, Breakmaster Cylinder.
And even though the NSA is one of the few government organizations that actually listens to you,
this is Darknet Diaries.
Oh, and one last thing before you go.
If you want to get involved with helping combat this problem, check out the ILF.
Here's my friend Chris to tell you about it.
Hello, my name is Chris Hadnagy, and I'm the founder and CEO of the Innocent Lives Foundation.
The ILF is made up of security professionals whose mission is to identify anonymous child predators and help bring them to justice. We have volunteers who are masters in everything from open source intelligence to exploit writing,
who donate their time to identify child predators and hand cases over to law enforcement.
With our non-vigilante stance, we do everything in our power to create airtight cases for law enforcement.
You can join this fight.
When you donate to the ILF, you directly
fund this powerful mission. To learn more about the ILF and to donate, please visit our website at
innocentlivesfoundation.org. Thank you.