The Journal. - Meta Is Struggling to Boot Pedophiles Off Facebook and Instagram

Episode Date: December 7, 2023

Meta has spent months trying to fix child-safety problems on Instagram and Facebook. But as WSJ's Jeff Horwitz explains, the social media giant is still struggling to prevent its own systems from enabling and promoting a vast network of pedophile accounts.

Further Reading:
- Meta Is Struggling to Boot Pedophiles Off Facebook and Instagram
- Instagram’s Algorithm Delivers Toxic Video Mix to Adults Who Follow Children
- Instagram Connects Vast Pedophile Network

Further Listening:
- He Thought Instagram Was Safe. Then His Daughter Got an Account.
- The Facebook Files, Part 1: The Whitelist
- The Facebook Files, Part 2: 'We Make Body Image Issues Worse'

Transcript
Starting point is 00:00:00 Our colleagues Jeff Horwitz and Katherine Blunt have spent months investigating something really troubling, the spread of content that sexualizes children. It happens all over social media, but they took a closer look at how it happens on Facebook and Instagram, two platforms owned by Meta. We talked to Jeff about it. We have been looking at how Meta recommends content and how it builds pernicious communities,
Starting point is 00:00:31 in particular, a community focused on the sexualization of children. Those people aren't a natural community, but Meta's products kind of build them into that via recommendations of accounts you should follow or groups you should join. Do you have a sense of the number of people who have been involved in these types of groups? I can tell you what Meta employees have told us, which is that it's in the millions. In June, Jeff and Katherine published a story showing that Instagram's algorithms were connecting and promoting a network of accounts.
Starting point is 00:01:13 Accounts that were openly devoted to underage sex content. Meta took immediate action to address the problems. It set up a child safety task force, pulled down hashtags related to pedophilia, and it reviewed its content moderation practices. But as Jeff and Katherine found, the problems are far from being solved. Welcome to The Journal, our show about money, business, and power. I'm Jessica Mendoza. It's Thursday, December 7th.
Starting point is 00:01:51 Coming up on the show, how Meta's platforms recommend content related to sexualizing children. At Air Miles, we help you collect more moments. So instead of scrolling through photos of friends on social media, you can spend more time dinnering with them. Mmm, how's that spicy enchilada? Oh, very flavorful. Yodeling with them.
Starting point is 00:02:21 Ooh, must be mating season. And hiking with them. Eww. After the June story published,
Starting point is 00:02:47 our colleagues launched a new investigation into Meta's platforms. They wanted to look more closely at how Facebook and Instagram recommended content. One way to figure that out was to set up test accounts. Here's Jeff again. We purchased some new devices just so we had a completely clean account, right? We wanted to make clear what we were seeing, you know, what would happen with a fresh account. What did these test accounts look like? The accounts stated they were tied to us and they were set up as adults.
Starting point is 00:03:18 The team set a whole bunch of parameters. Among them, they wouldn't look for illegal images or follow any accounts that seemed to be promoting or selling child exploitation content. And along the way, they would flag problematic accounts to Meta. At the center of their examination were Meta's algorithms. The main goal of those algorithms is to get content in front of as many interested people as possible. And they're really good at predicting what people want to see. There's something called accounts you should follow. It's a recommendation system Instagram has. Right, which is something that if you've used Instagram, you've seen it.
Starting point is 00:03:53 This is a very standard growth tool. They're just trying to connect you with more people. You know, you look at an account that's into old jewelry, they'll connect you with more accounts that are related to old jewelry. But the dark side of the recommendation system is that it'll also connect people searching for inappropriate content. If you're into kids in the wrong way, that's a very clear signal to the algorithm. And they will tailor the recommendation systems accordingly. You look into an account related to underage child prostitution, they're going to connect you with as many accounts as they can find related to underage child prostitution.
Starting point is 00:04:25 Jeez. Right? And this is all based on network analysis, right? Namely, that there's a whole bunch of people who want this content. And, you know, it's not hard for a computer to determine that people who are into one potential underage sex account are going to be into all the other ones too. Jeff and the team put Meta's recommendation system to the test.
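The "network analysis" Jeff describes maps onto what recommender-system engineers typically call collaborative filtering over the follow graph: two accounts are treated as related when many of the same users follow both. As a rough illustration only, not Meta's actual system, here is a minimal sketch of that idea; all account names and data below are hypothetical.

```python
from collections import defaultdict

# Hypothetical follow graph: user -> set of accounts that user follows.
follows = {
    "user_a": {"acct_1", "acct_2"},
    "user_b": {"acct_1", "acct_2", "acct_3"},
    "user_c": {"acct_2", "acct_3"},
}

def co_follow_counts(follows):
    """Count, for each pair of accounts, how many users follow both."""
    pair_counts = defaultdict(int)
    for followed in follows.values():
        accounts = sorted(followed)
        for i, a in enumerate(accounts):
            for b in accounts[i + 1:]:
                pair_counts[(a, b)] += 1
    return pair_counts

def recommend(account, follows, top_n=5):
    """Rank other accounts by how many followers they share with `account`."""
    scores = defaultdict(int)
    for (a, b), n in co_follow_counts(follows).items():
        if a == account:
            scores[b] += n
        elif b == account:
            scores[a] += n
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

print(recommend("acct_1", follows))  # -> ['acct_2', 'acct_3']
```

The mechanism is content-blind: it amplifies any dense cluster of shared followers, whether the shared interest is old jewelry or, as the reporting found, children.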
Starting point is 00:04:51 They used their new accounts to follow gymnasts and cheerleaders because they'd noticed that a lot of these users' followers appeared to be adult men. What we could do was follow gymnasts, child gymnasts and cheerleaders and the people who follow them. Right. So like this is just like, OK, let's follow a couple dozen 11-year-olds that have parent-monitored accounts and then see what happens. And then a second stage was, OK, we follow the top 10 or 15 people following those accounts. How does that change things?
Starting point is 00:05:28 Like, you'd think that a recommendation system, if you follow child gymnasts and cheerleaders, you should just see more child gymnasts and cheerleaders, right? You know, like, makes sense on a sort of intuitive level. But that wasn't the way it turned out. It turned out that just following preteen and early teen gymnasts and cheerleader influencers was enough for Instagram to start recommending adult sex content. And that's not a great start, right? And very clearly a line was being drawn between, are you interested in preteen gymnasts and cheerleaders?
Starting point is 00:06:10 If so, like... Here's some other stuff you might be interested in, yeah. Right? Like, once you added the followers, then I think at that point the algorithm was like, oh, yeah, I see what's going on here. Like, welcome to the community, right? If that makes sense. And it starts serving up stuff that is a little bit more aggressive. I mean, that's when we started to see promos for accounts that seemed to be openly suggesting that they were selling child sex content. Invited you to DM them, invited you to follow a link to other platforms that are kind of known for this sort of thing. Meta said that the test accounts produced a manufactured experience that doesn't represent what billions of users see.
Starting point is 00:06:58 How far into following these other accounts did it become clear to you that this is where it was going to go? Oh, that was fast. I mean, within a few days, you know, you started getting adult content recommended to an account that just followed kids. And likewise, as soon as you added the followers, the entities that clearly Instagram viewed as being part of the community interested in this stuff, I mean, it didn't waste any time, right? Right. As soon as the signal was present, it acted on it and it started recommending content that was, you know, going to personalize the platform and boost engagement. And it wasn't just the Journal's reporting that showed problematic results.
Starting point is 00:07:44 Test accounts from the Canadian Centre for Child Protection had similar findings. Why was all this happening? That's after the break. Introducing TD Insurance for Business, with customized coverage options for your business. Contact a licensed TD Insurance advisor to learn more. With Uber Reserve, good things come to those who plan ahead. Family vacay? Reserve your ride as soon as you book your flights. To all the planners, now you can reserve your Uber ride up to 90 days in advance. See Uber app for details. Experts say they have seen illegal, live-streamed videos of child sex abuse on Meta's platforms.
Starting point is 00:09:01 But Jeff says there's another category of troubling content. A lot of the stuff, it turns out, doesn't strictly meet the definitions of child sexual abuse. So there's this category, which is stuff that is like, it's really creepy. There's no way that any reasonable human being could look at it and be like, oh, yeah, that's an appropriate thing to be posting, sharing, looking at. But at the same time, it doesn't meet the definitions of child exploitation in a legal sense. What's an example of that? So let's say you're the parent of a kid who's a gymnast and you're filming her workout routine. And in the course of a five-minute video, there's a 10-second bit in which the kid's doing the splits while stretching.
Starting point is 00:09:49 That, you know, nobody's going to think that's particularly crazy. Now let's say you own an account. You don't have a child, but you are compiling those 10-second clips of little girls doing the splits. At that point, like, we're still totally legal, but we're also in hell. And I think a lot of the stuff that Meta has done is geared toward like finding known child sex abuse content, stuff that law enforcement and nonprofits have already flagged. And that's actually like kind of black and white. What the company, I think, has a harder time with, and to be fair, this is hard for anybody,
Starting point is 00:10:31 is distinguishing between that and child sexualization, right, which is a lot more amorphous. So that means that a lot of inappropriate content can be on Meta's platforms. And it can make its way into the algorithms that serve up targeted suggestions. On Facebook, the Groups You Should Join feature suggested topics such as kidnapping and quote-unquote dating children as young as 11. You know, I think one that just sort of still sticks with me is a 200,000-user group. It was Spanish language,
Starting point is 00:11:01 but the name translates to Dark Family Secrets. This is a pro-incest group, or at least it was until we reported it to Meta. And people were literally arranging to swap live streams of purported sex content involving their own children. And you have this crowd of 200,000, which is twice the size of the biggest football stadium in the U.S. When Jeff flagged some of these groups via user reports, Meta often found the groups to be acceptable. Like a Facebook group named Incest. Facebook's automated response system said, quote, We've taken a look and found that the group doesn't go against our community standards, unquote.
Starting point is 00:11:44 It was only after Jeff brought specific groups to the attention of public relations at Meta that the company removed them. A Meta spokesperson told the journal that the company made nearly 200,000 groups on Facebook more difficult to find and disabled tens of thousands of other accounts. But he added that the work hadn't progressed as quickly as the company would have liked. You know, the company, I think, is always kind of working to do better, right? It's not like Meta wants these people on its platforms. But the question is, what's the cost of making their product less helpful?
Starting point is 00:12:19 And what has Meta said about why the company hasn't been able to address these problems? Meta has said that it is doing what it can and that, I think, there's an issue with, you know, they view the recommendation systems as being really valuable. It helps connect people with communities and hobbies and interests of theirs that they really value. And in most instances, those are totally okay. Meta could tamp down the effectiveness of its algorithms, but this recommendation system is one of its most valuable assets. The algorithms allow the company to sell highly targeted ads. That's a feature Meta wants to protect. On the other hand, if those targeted ads get placed
Starting point is 00:13:02 near inappropriate content, that's a big business risk. Brands have expectations about what content is safe for their advertisements to appear next to. And in this instance, we were finding that ads for major brands were consistently appearing next to content that was being recommended to our test accounts that was like pretty awful. Like it was a mix of adult pornography plus like images of little kids plus kind of inducements to go visit other forums off platform that might be selling illegal content. And so a couple of the brands, Match Group and Bumble Inc., those are probably the two biggest dating app companies, pulled their ads. One of sort of the big questions that comes out of this work for me is just sort of like, okay, is this a cost that maybe the platform and society in general are willing to bear, right?
Starting point is 00:14:13 Which is that, yes, it is an unfortunate thing that communities devoted to child sexualization exist. And it's also unfortunate that Meta's systems cater to them pretty effectively. And I think the question that sort of, it's a question for Meta, it's a question for users, it's a question for regulators, et cetera, is like, okay, what's the cost?
Starting point is 00:14:48 on any of these things coming down the pipe, right? They've done the stuff they were going to do for their child task force. The same recommendation issues that we were seeing at first are still ongoing. And I think that's one of the interesting things that the months of back and forth with the company over this stuff have led to, which is like, all right, well, this is kind of, this is what the platform currently does and seems like what it will continue to do barring some unexpected development. That's all for today, Thursday, December 7th.
Starting point is 00:15:34 The Journal is a co-production of Spotify and The Wall Street Journal. Additional reporting in this episode by Katherine Blunt. Thanks for listening. See you tomorrow.
