Today, Explained - The Facebook whistleblower

Episode Date: October 5, 2021

Facebook kicked off the week with an outage and followed that up today with a whistleblower testifying before Congress. The Wall Street Journal’s Jeff Horwitz explains how the company may have misled the public about the dangers of its social networks. Today’s show was produced by Will Reid, engineered by Efim Shapiro, fact-checked by Laura Bullard and edited and hosted by Sean Rameswaram. Transcript at vox.com/todayexplained Support Today, Explained by making a financial contribution to Vox! bit.ly/givepodcasts Learn more about your ad choices. Visit podcastchoices.com/adchoices

Transcript
Starting point is 00:00:25 You might have noticed your Facebook went out on Monday and your Messenger and your Instagram and your WhatsApp, everything at the company went pretty haywire for five-ish hours, which meant almost 3 billion people around the world weren't able to rely on their social media. Mark Zuckerberg lost something like $6 billion in personal wealth yesterday. And that might not even be the worst thing that happens to Facebook this week. Good afternoon, Chairman Blumenthal, ranking member Blackburn, and members of the subcommittee. Thank you for the opportunity to appear before you.
Starting point is 00:01:15 Because today, a Facebook whistleblower named Frances Haugen testified on Capitol Hill. I joined Facebook because I think Facebook has the potential to bring out the best in us. But I'm here today because I believe Facebook's products harm children, stoke division, and weaken our democracy. The company's leadership knows how to make Facebook and Instagram safer, but won't make the necessary changes because they have put their astronomical profits before people. Haugen recently leaked thousands of pages of research from inside the company to the Wall Street Journal. Look, there's been a lot of people wondering, is this product good for us on any number of different levels, right? From mental health to safety to political discourse. Specifically, she leaked them to this reporter, Jeff Horwitz.
Starting point is 00:02:05 I think the thing that this body of work shows is how Facebook understands itself and how Facebook understands its own platform has gone off the rails in ways that it hasn't disclosed to the public. We asked Jeff to walk us through the thousands of pages of internal Facebook files because there's just a ton of stuff. We started with Instagram and what Facebook's researchers discovered when they started asking users about how it made them feel. And the operational assumption inside Facebook is that this stuff really isn't good for teenagers who are in a vulnerable place, in particular girls. What did the research say? So the research says that for most users, people who are in a good place, Instagram's fine. There's
Starting point is 00:02:53 some negative social comparison for most people. And that's just, you look at other people and you see that they are more attractive than you, have better jokes than you, go on better vacations than you. But that's okay. It's part of life. We all live with it. But for a minority of users, and not necessarily even a small minority of users, Instagram can really make some problems worse, and in particular, body image was something the research focused on. And so they found that for people who were in a vulnerable place, this could be bad. The headline kind of quote from the article probably was, we make body image issues worse for one in three teenage girls.
Starting point is 00:03:31 One in three. And now that's for people who already have body image issues, but it's a pretty high number if that makes sense. And did they do any research on where bad body image issues amongst teenage girls leads? They have found that for some users, and again, this is among people who had thoughts of hurting themselves, they traced that directly back to the platform. So in the US, I think 6% of users who had thought about killing themselves in the last month traced that idea directly to Instagram. And do people who have this really negative relationship with Instagram use it less once they establish that, hmm, this is bad for me?
Starting point is 00:04:11 Yeah. So this is a problem that the platform has, which is that the people who seem to be most vulnerable and most affected by these negative effects are also the ones who have the least self-control in terms of their usage of the product. It makes some sense to me intuitively, right? For me, I'm a reporter, so it's Twitter. But it's not uncommon just to be in a bit of a funk and just keep scrolling. And what Facebook found is that particularly content in certain categories like beauty and fitness and fashion could end up basically triggering these negative spirals. Which is to say what? People end up using it more? Yeah. They basically don't take a break at the point when
Starting point is 00:04:53 really it would be good for them to take a break. And they keep coming back to the platform, even if they are aware that it's not good for them. So from dark to even darker, tell me what you discovered about human trafficking. Yeah. It turns out that Facebook's spending on safety issues is just heavily concentrated on English and European languages. Outside of that, it's a very steep drop-off, and expenditures are just very little. And so one of the things we found was that Facebook was, for a long time, pretty aware that it had a human trafficking problem on its platform. This was mostly people selling themselves or being sold into indentured servitude, usually in Gulf states.
Starting point is 00:05:38 So there would be buyers and sellers groups for jobs. And, you know, sometimes those jobs would be real and sometimes they would involve sex trafficking, for example, you know, coming from the Philippines or from Africa. And they basically would be giving away their passports and their rights and they could be, you know, resold without their consent. It's a pretty rough system. Facebook knew this was happening and hadn't really done anything about it. They had a very small team focused on this sort of thing. And nothing really happened until late 2019
Starting point is 00:06:13 when the BBC wrote a story about human trafficking on Facebook. In Saudi Arabia, we found hundreds of women being sold on Haraj, another popular commodity app. And on Facebook-owned Instagram, we found hundreds more. And Apple took notice of it and basically told Facebook that if they didn't get the problem under control immediately, Apple might remove Instagram and Facebook from the app store. And, you know, knowledge that people were being sold on their platform hadn't been enough to get
Starting point is 00:06:46 the company to act, but, you know, the threat of getting removed from the app store damn sure was. So Facebook just sort of convulsively acted at that point. They took down a ton of content related to human trafficking: groups, pages, posts, all that. But they didn't really fix it. They kind of took care of the immediate problem. And then things went back to normal. And, you know, one of my colleagues actually ended up speaking to a woman who was trafficked just within the last year. So I think there should be some sort of verification on every job advert on Facebook. And maybe the human trafficking thing sounds shocking if you're like here in North America somewhere, but we've heard stories about how Facebook is being used abroad
Starting point is 00:07:35 for like even like genocide in Myanmar, right? Facebook incited violence against Rohingyas. We are not saying this. This was the assessment of United Nations investigators two years ago. It turns out that this is actually kind of a pretty standard weakness. The company has just not invested in many of the safety tools, right? Algorithmic screening tools that look for bad or inciting content or things of that nature, they literally do not exist in many languages. I mean, so for example, in Arabic, they literally don't have people who can moderate
Starting point is 00:08:11 content in most dialects of Arabic. And, you know, you can tell that, you know, from let's say the, you know, Israeli-Palestinian violence earlier this year. Pro-Palestinian activists are accusing Facebook of censorship and targeting the social media giant with one star reviews in the Apple and Google app stores. They were just making errors right and left. I mean, their own employees, we could see this in some of the internal documents, their own employees had to step in and just say, like, guys, you're screwing over some of the largest news outlets in the Middle East right now. And you're calling the name of one of the holiest sites in Islam a terrorist organization. You got to fix this immediately. And so it's kind of this crazy thing where things are always on fire, and they haven't really ever invested in what we even think of as the sort of baseline level of safety efforts that we expect from Facebook in developed markets. You're talking about a lack of resources in one of the wealthiest companies in the world.
Starting point is 00:09:09 Why aren't they investing in core safety teams in some of these countries if they're offering the platforms there? It's, I think, a question of where pressure comes from. Something that Frances noted to me is that every additional market Facebook enters and every sort of new language it services is almost definitionally going to be smaller and poorer than the last country or last market. And so it just simply is a question of priorities. And Facebook has traditionally responded to bad press and government attention from powerful governments. And there just simply isn't the force required to
Starting point is 00:09:52 even, first of all, detect what's going on in the platform and to complain about it in those other markets. So it's in their financial interests to offer the platforms in these poor countries, but it's not in their financial interests to moderate or regulate the platforms in those countries. Look, Facebook's not making meaningful revenue in Myanmar, but they really do not like the idea that there could be sort of other social media players serving the same markets. And they kind of have tried to be everywhere and do everything all at once. And so I think it's like less straight money than it is just like omnipresence is something that the company really, really is invested in.
Starting point is 00:10:31 Let's talk about the algorithm. The algorithm seems to sort of be the root of everything Facebook does. And Frances is out there saying it makes people angrier. The result has been more division, more harm, more lies, more threats, and more combat. In some cases, this dangerous online talk has led to actual violence that harms and even kills people. What's going on? So Facebook has made a lot of tweaks to its algorithms over the years. It's like literally testing out dozens of them at any given time. But a really big one came in 2018. It was called Meaningful Social Interaction. When we ask people what they want
Starting point is 00:11:12 out of the platform, the number one answer we get across different countries and in different ways we ask is they want to connect with friends and family. Specifically, they want to keep in touch with people who live far away. The idea was that Facebook was going to prioritize content that came from friends and family, either reshares or original posts, and that it was going to really kind of try to avoid passive scrolling. And the way it was going to do that
Starting point is 00:11:40 was by prioritizing content that made people engage, right? So: a like, a reshare, an emoji reaction, or a comment. Which is why this year we're really focused on what we're talking about is meaningful social interactions. We're trying to make sure that the time spent on the platform is time people say is well spent. And so what Facebook's researchers realized is that this algorithm change they did in 2018, it turned out to really favor things that were angry and things that were incendiary and things that broke Facebook's rules.
Starting point is 00:12:17 And so basically Facebook ended up turning up the heat on political discourse worldwide. That dynamic led to a complaint to Facebook by major political parties across Europe. This 2019 internal report obtained by Haugen says that the parties feel strongly that the change to the algorithm has forced them to skew negative in their communications on Facebook,
Starting point is 00:12:42 So Facebook's sort of trying to balance this idea that, you know, we want to be like a shiny, happy place with, we want to keep you on these platforms as much as humanly possible. And the thing that actually keeps you there, it turns out, is having emotional reactions, even though a lot of them end up being negative. Yeah. And I mean, I think the company came to realize that they were creating, I mean, they did come to realize, we can see this in the documents, they were creating perverse incentives for people to create angrier content. And they didn't change it, as you know, they kind of made some tweaks around the edges, but they weren't really willing to give up on the benefits for usage that heavy engagement-based ranking
Starting point is 00:13:34 allowed for. It sounds like a lot of what you discovered, Jeff, and I don't mean this as any sort of, like, disparaging analysis of your work here, it sounds like a lot of what you're discovering is, like, what people maybe suspected was true of how Facebook was running its business is now just being confirmed by these leaks and by your reporting. Is that fair? Yeah. And I think it's also, look, I don't think, we knew that human trafficking was a problem on Facebook. We knew that teen mental health was something that a lot of people were worried about. We knew that Facebook did tend to rile people up. I think what really matters about this, first of all, is that Facebook is the only one who's
Starting point is 00:14:18 ever able to be sure of these things or even get anywhere close to sure, right? Otherwise, it's just us debating social media in a bar. That's one of the big points here is that the company itself is the only one who can even fully consider, much less answer these questions. And right now, there isn't a way in. And I think that's one of the things that Frances is really concerned about and that motivated her to talk to me originally and to eventually do what she did in terms of collecting documents. She's hopeful that if everybody is aware of what Facebook is actually doing, that maybe
Starting point is 00:14:58 they will demand sources of information about the company that don't require someone like her to take a lot of risk and go public.
Starting point is 00:16:57 Jeff, your reporting on the Facebook files anonymized the whistleblower. But since she's made her identity known with a 60 Minutes interview on Sunday and congressional testimony today, what's her story? She's 37, grew up in Iowa, daughter of a doctor and a college professor who eventually became an Episcopal priest. And she got her start at Google, I think, came up doing product management and algorithm management. I have worked as a product manager at large tech companies since 2006, including Google, Pinterest, Yelp, and Facebook. My job has largely focused on algorithmic products like Google+ Search and recommendation systems like the one that powers the Facebook newsfeed. She ended up getting really, really sick, which kind of derailed her career. She was at Google then and basically had to resign.
Starting point is 00:17:53 And she hired a family friend to help her with her recovery because she could barely walk at that point. This young man was a really close friend and was a really meaningful connection to her at a time when she was largely homebound. And then he got radicalized on the internet. Now, it wasn't Facebook. It was, like, Reddit and 4chan. But basically someone she knew, who was one of her only points of social contact during a really rough time, ended up going down kind of a white nationalist conspiracy rabbit hole. And she tried to intervene. She failed. And the guy basically left the Bay Area where she lived, and kind of the friendship just disappeared for a number of years. He's actually since recovered and rejected the really crazy beliefs, but that really left a mark. And so she had been ignoring Facebook job recruitment offers for years. But in 2018,
Starting point is 00:18:58 she decided that she was going to try to go join Facebook and do work that would actually help address this stuff. And I've seen her cover letter to the company. It's very clear she wants the job because a friend's been radicalized and she thinks it's important to try to help people avoid that. So she really came into the job with a pretty personal motivation.
Starting point is 00:19:20 And I think what happened is once she got inside, she kind of came convinced that Facebook either couldn't or wouldn't solve the problems she'd worked at a lot of companies and I think one of the things that really surprised her was just kind of how bare bones a lot of Facebook's integrity operation was. So I mean, she was given a team of brand new engineers blue line people, and really try to kind of spread misinformation among particular communities that's tailored to them. And she, by her own acknowledgement, did not manage to do the job in the time Facebook had given her and basically was told, look, we're Facebook. We do impossible things with minimal resources. Get used to it. And that was supposed to be motivational. look, we're Facebook. We do impossible things with minimal resources. Get used to it.
Starting point is 00:20:30 And that was supposed to be motivational. But in her mind, she just looked around and she came to believe that the company simply wasn't investing what it needed to invest in safety work. And even beyond the investing issue was the question of: were they actually following their own advice when their experts did come to conclusions about where problems were and how they could be fixed? Do you know when she makes the decision to sort of give up trying to fix these things internally and make what she knows public? I think her turning point came on December 2nd of last year, when, after this kind of intense and heroic effort by this kind of understaffed civic integrity team to try to prevent total disaster from happening in the US election, and they did have a lot of successes in addition to some major defeats, Facebook decided
Starting point is 00:21:19 like a few weeks after the election, it was going to disband the entire thing. And it was framed as kind of a thing that was in everyone's best interest and everyone's going to get jobs and get spread throughout the organization. But I think a lot of people on Civic, including her, kind of took this as a sign that Facebook truly just considered the Civic team to be a thorn in its side rather than an entity that could help guide it to a better place. I'd gotten in touch with her weeks before then. I only heard from her that evening for the first time. Does Facebook dispute any of what she has shared with you or any of what she's saying now in public? Not the legitimacy of it. I think Facebook has disputed whether the characterization is correct and they've taken
Starting point is 00:22:00 issue with whether the Wall Street Journal has overhyped things. But I mean, the documents are the documents. And I think that's one of the things that is really powerful about this stuff is that when Facebook wants to argue about teen mental health, they're not arguing with the Wall Street Journal. They're arguing with their own researchers. And is Facebook currently arguing with its own researchers? Oh, yeah. There's been a lot of that. Facebook actually released a couple of the slide decks related to teen mental health that we wrote about, and they annotated them with just some pretty harsh criticism of their own people. And whether the research was justified or the researchers understood the objectives they were going for or whether things could be considered causal. And like, this is UX research. It's not supposed to be publishable in like science, you know,
Starting point is 00:22:50 via a pre-review process. So, I mean, you know, from my perspective, it looked like people doing what was basically solid, you know, rough and ready UX work. But, you know, you have Facebook kind of both publicly and privately kind of suggesting that its researchers caused it a problem. Is anything Facebook's doing here actually illegal or is it simply ill-advised, bad for society? Her read is that some of these things are an investor issue, that Facebook hasn't been straightforward with the world in general. And yes, that includes its investors. That's kind of the basis for filing a claim with the SEC and seeking whistleblower protection status there. But her read is more that whether it's illegal or not, maybe our laws
Starting point is 00:23:35 aren't ready for it right now. But that's just kind of a sign that we got to get to work rather than that there's nothing wrong with it whatsoever. I mean, it does make some sense, right? Like most of our laws from the internet predate social networks. So you wouldn't even expect that they could have gotten it right. Facebook has demonstrated they cannot act independently. Facebook over and over again has shown it chooses profit over safety. It is subsidizing, it is paying for its profits with our safety. I'm hoping that this will have had a big enough impact on the world that they get the fortitude and the motivation to actually go put those regulations into place. Is Congress equipped to deal with Facebook or is Facebook sort of unknowable to them and changing too quickly for them? I think Francis, like a lot of people
Starting point is 00:24:24 inside Facebook and former employees, is a little distraught with the state of the public debate outside the company. People are talking about, is there political bias in the sense of, is Mark Zuckerberg squelching conservatives, or should Section 230 be repealed, or should Facebook be broken up? And these are kind of, for her, the wrong questions. They don't really get to the things that she's seen that she thinks are problematic, which is the issue of engagement-based ranking and the issue of information access from outside the company. So I think one of her goals and her reasons for going public, I think, is that she did want to see if she could have some influence and perhaps have members of Congress and people in the press as well asking somewhat different questions than
Starting point is 00:25:14 they have been historically in terms of not should we ban Facebook or break it up or cause it to get sued into oblivion by repealing Section 230. But should we simplify it and force more information to be produced about it? But an important distinction here is that she doesn't think Facebook can regulate itself and that someone needs to step in. Is that right? Yeah. I mean, this is someone who joined the company with the goal of being part of the solution and helping it fix itself. And she gave up on that idea. And she gave up on it candidly, I think, pretty quickly and pretty thoroughly in the end. She does believe that not just outside oversight, but a smarter type of outside oversight is going to be necessary.
Starting point is 00:26:13 We now know the truth about Facebook's destructive impact. I came forward at great personal risk because I believe we still have time to act. But we must act now. I'm asking you, our elected representatives, to act. Thank you. dot com or you can listen. The Wall Street Journal has a daily news podcast. It's called The Journal. Today's episode of this daily news podcast was produced by Will Reed. I'm Sean Ramos for him. It's Today Explained. Thank you.
