Radiolab - Facebook's Supreme Court

Episode Date: February 12, 2021

Since its inception, the perennial thorn in Facebook’s side has been content moderation. That is, deciding what you and I are allowed to post on the site and what we’re not. Missteps by Facebook in this area have fueled everything from a genocide in Myanmar to viral disinformation surrounding politics and the coronavirus. However, just this past year, conceding their failings, Facebook shifted its approach. They erected an independent body of twenty jurors that will make the final call on many of Facebook’s thorniest decisions. This body has been called: Facebook’s Supreme Court. So today, in collaboration with the New Yorker magazine and the New Yorker Radio Hour, we explore how this body came to be, what power it really has and how the consequences of its decisions will be nothing short of life or death. This episode was reported and produced by Simon Adler. To hear more about the court's origin, their rulings so far, and their upcoming docket, check out David Remnick and reporter Kate Klonick’s conversation in the New Yorker Radio Hour podcast feed. Support Radiolab by becoming a member today at Radiolab.org/donate.

Transcript
Starting point is 00:00:00 Wait, you're listening. Okay. You're listening to Radiolab. Radiolab. From WNYC. See? See? Yeah.
Starting point is 00:00:17 Three, two, one. Hey, I'm Jad. This is Radiolab. Today we have got a special collaboration with the New Yorker magazine and the New Yorker Radio Hour, very excited about that. So for the last several years, we here at Radiolab, and by we,
Starting point is 00:00:35 I mean, mainly Simon Adler. He, we, have been watching and reporting on Facebook, specifically how Facebook decides and then enforces what people can and cannot post on their site. As many of you know, the way that they do it is they've got this rule book. One single set of rules for the many countries of the globe to define what is postable and what isn't. And then they have a giant army of 15,000 souls who have to moderate all the crap that we
Starting point is 00:01:07 put on Facebook. Uh, anyhow, in doing so, Facebook has managed to piss off, well, just about everybody. I mean, despite all of the time, effort, and money that they have thrown at this problem, by taking posts down they have been accused of censoring voices across the political spectrum and infringing on users' right to free expression. And then by leaving material up,
Starting point is 00:01:44 they've been accused of helping to incite a genocide in Myanmar, and arguably swing the 2016 US presidential election. And I start here with this wrap-up because since we last reported on all of that, Facebook has actually made a pretty big shift in how they are going to approach policing, refereeing the world's speech. It's a shift that's going to have a massive impact on their decisions about what is and isn't allowed on the site, including the question, which we'll talk about in a second, of whether former President Trump should be banned indefinitely from Facebook.
Starting point is 00:02:22 But more deeply, this is a shift that has Facebook really looking less like a company and, oddly, a little bit more like a government. An unelected government for the entire planet. So with all of that, let me now hand off to Simon. Hi. Hello, Kate, how are you? Simon. Are you rolling on your end? There we go. Now I am rolling.
Starting point is 00:02:48 Great. I will record myself on my phone. Yeah, so a couple months back I called up academic Kate Klonick to talk about this shift and this research project she's been working on documenting it. I want to be done with this project so f***ing badly. Just like, yeah. This has been your life. Yeah, it has, like, a little bit too much, though. I'm ready to, like, you know, I'm ready to kind of do something different. Kate is a professor of law at St. John's University. She's studied Facebook off and on for years.
Starting point is 00:03:19 And she was at it again, because back in 2018, Mark Zuckerberg, the company CEO, was considering this strange proposal. Yes, like this crazy project to solve this crisis about content moderation. I think you know that I've been kind of inside for the last couple, like a little over a year. Kate actually sat down with Mark to talk about all this. She did it
Starting point is 00:03:45 over the computer, so you'll hear some clacking of keys. But anyway, as he told her... I've said many times that I just think that it's unsustainable over time for one person, or even kind of one company's operations, to be making so many decisions balancing free expression and safety at this scale. Like, I recognize that this is a huge responsibility. And I'm not going to be here forever.
Starting point is 00:04:14 Yeah, like, I plan to be running the company for a while, but one day I'm not going to be running the company. And I think at that point it would be good to have built up a separate set of independent structures that ensure that the values around free expression and the balancing of these equities can exist. Oh, interesting. Like, trust me, but I don't necessarily trust the next guy. Right. And so, like a benevolent dictator, he wants to devolve power away from Facebook and himself. And what he'd landed on as a model for how to do this?
Starting point is 00:04:48 Was a Supreme Court for Facebook. And... Sorry, what exactly, like, what? Yeah, so the proposal was pretty simple. It was creating a group of people from all over the world that would basically be this oversight on Facebook and its speech policies. Essentially think of it as like the Supreme Court of the United States,
Starting point is 00:05:11 but instead of overruling lower courts' decisions, this Supreme Court of Facebook would be able to overrule Facebook's decisions. It's a hard pitch to make, isn't it? Oh my God, 100%. You can imagine how that went over. They're like, wait, what? You want us to do what?
Starting point is 00:05:28 That's how I imagine that going. Yeah, but Mark wanted this to happen, and so it happened. It's part of, like, a larger sense, I think, that he sees Facebook becoming more and more like a government. Government isn't even the best term, but like a system of government. Like a long-term legacy that he knows will not make terrible decisions. This seems to be them catching up and being like, yeah, if you've got three billion users, you're bigger than any company at that point, any country. Your rules can be as impactful as any government's laws, and so you really need to start thinking of yourself
Starting point is 00:06:22 in a new way. Yeah, I think that's right. Has any company ever done anything like this before? I mean, honestly, there's nothing that even kind of comes close, and I don't want to be grandiose about this, but there is a sense in which it feels like you're watching... I felt like I was watching an experiment that, even if it completely and utterly failed, would be remembered and be a lesson for however the world ends up sorting out this problem of online speech.
Starting point is 00:06:54 And so once Facebook decided to build this court, they suddenly needed to figure out, like, what cases would go to the court? Who would be on it? How would they make these decisions? And it became clear that it's, you know, not appropriate to have a single person answer these questions on behalf of society, or a single institution.
Starting point is 00:07:17 This is Brent Harris who led Facebook's effort to build this board, this court. And as one of his first decisions, he said, we need to go out and actually listen to a wide array of people about what the problems are and the challenges are that they are finding and ask them what do they want this to be, what can we create. And so they held dozens of listening sessions all over the world talking to lay people. But the cornerstone of this process was really six global workshops where they invited experts to come and weigh in.
Starting point is 00:07:51 Kate was one of 40 or so people that attended the US workshop. It was held in the basement of the NoMad hotel in downtown Manhattan. And when she walked in, it was like walking into a technologist's wedding. You come in, every table is decorated with succulents and bottles of Voss water and an iPad.
Starting point is 00:08:12 The iPad is not for you to keep. And in fact, one of the Facebook people jokes to me, like, yeah, we used a couple-generations-old iPads to make sure no one walked away with any of them. That's spectacular. So you have an iPad, and ultimately... This moderator came out and tried to get the room's attention.
Starting point is 00:08:34 And of course, everyone's half-listening, most people on their phones and whatever else. In part because a lot of people in that room were just very skeptical of what Facebook was doing here. I mean, Kate herself remained somewhat skeptical of this court. This is just something Facebook can scapegoat its really crappy decisions to.
Starting point is 00:08:53 That was my main skeptical point in all of this. That Facebook is essentially erecting what will be just a body to absorb blame. But anyhow, the moderator explained what they were up to, that they'd brought these experts here to, in essence, design this institution. So what do you think this should be? What does it look like?
Starting point is 00:09:15 And some of it was, like, an answer to questions, some of those things people brought up: case selection questions, board selection, who picks the board? And I would say a third of it was people standing up and holding forth on topics that had nothing to do with why we were there that day. Less of a question and more of a comment. Uh, holy cow.
Starting point is 00:09:37 So many of those. Eventually, though, they got to the heart of the matter. Like, how should a global board think about these cases that are right on the edge? What we wanted to do was really put people in the shoes that Facebook is in right now in taking these decisions. So they told them, like, hey, you are going to play mock court. As a group, you're going to have to decide whether a piece of content should stay up on Facebook
Starting point is 00:10:04 or come down. And so everyone was asked to open their iPad. So you were asked to, like... We're going to go over the first simulation. And you'll love this, Simon. The first simulation that they did was the Kill All Men simulation. Really? Yes. Wow. Oh, that's great. Oh, this is the thing you, the one that you focused on in the last story. I remember there was, like, a song in there. Yeah, am I right? You're totally right.
Starting point is 00:10:30 We spent 10, 15 minutes dissecting this piece of content. And, you know what? You should play this and just be like, here's what they focused on. Okay, yeah. I think we only need to do about three minutes of it, but here it is.
Starting point is 00:10:43 We're gonna keep it moving right along. The next comedian coming to the stage, please give it up for Marcia Belsky! We did this back in 2018. It's about comedian Marcia Belsky and a photo she posted. Yeah, I guess, I guess... I feel like when I first moved to the city, I was such a carefree brat. You know, I was young and I had these older friends,
Starting point is 00:11:05 which I thought was, like, very cool. And then you just realize that they're alcoholics. You know? This is her up on stage. She's got dark curly hair, was raised in Oklahoma. How did you decide to become a comedian?
Starting point is 00:11:19 You know, it was kind of the only thing that ever clicked with me. And especially political comedy. You know, I used to watch the Daily Show every day. And inspired by this political comedy, she started this running bit that I think can be called sort of
Starting point is 00:11:34 absurdist feminist comedy. Now a lot of people think that I'm, like, an angry feminist, which is weird. This guy called me a militant feminist the other day, and I'm like, okay, just because I am training a militia of women in the woods. At first, it was just this running bit online, on Facebook and Twitter. She was tweeting and posting jokes. You know, like, we have all the Buffalo Wild Wings surrounded, you know, things like that.
Starting point is 00:12:04 She eventually took this bit on stage, even wrote some songs. Anyhow, so about a year into this running bit, Marcia was bored at work one day and logs onto Facebook. But instead of seeing her normal newsfeed, there was this message that pops up. It says you posted something that discriminated along the lines of race, gender,
Starting point is 00:12:38 or ethnic group. And so we removed that post. And so I'm like, what could I possibly have posted? I really, I thought it was, like, a glitch. But then she clicked continue, and there, highlighted, was the violating post. It was a photo of hers. What is the picture? Can you describe it? The photo is me as what can only be described as a cherub.
Starting point is 00:13:00 Cute little seven-year-old with big curly hair, and she's wearing this blue floral dress, her teeth are all messed up. And into the photo, Marcia had edited in a speech bubble. That just says, kill all men. And so it's funny, you know, because I hate, it's funny, you know, trust me, whatever. Facebook had taken it down because it violated their hate speech policy. I was dumbfounded. And so, back to present day, this is the scenario they put in front of these tech elites in the basement of the NoMad hotel, to see really how they would react. Is that hate speech? What does that mean? And should that be up on Facebook or not?
Starting point is 00:13:45 Leave it up or take it down. And so people started to discuss. People were like, well, this wasn't funny. And someone else was like, does it matter whether it's funny or not? Back and forth and back and forth. And even, like, should men be protected? Like, men are more protected than other groups. Eventually, though, the room pretty much came to an agreement. Kill all men is clearly humor or social commentary. That should be up on Facebook, and it's inappropriate for Facebook to take that down. Yeah, I get that. I mean, I remember when we first did this feeling like, this is a harmless joke, right? And Facebook should be a place where harmless jokes can get made. Because in this case, the joke only works because men are the power structure.
Starting point is 00:14:31 If they weren't, it wouldn't be funny. Yeah, it's punching up. There you go. It's punching up, right? But here's where things get interesting. Because as we said, they did six of these expert global workshops. Berlin, Singapore, New Delhi, Mexico City, Nairobi.
Starting point is 00:14:50 And at each of them, they ran through this kill-all-men scenario. We ran that case across the world. And something that's very, very striking is we got really different viewpoints about whether that should be up on Facebook or not. Like, not just at the New York workshop, but in Berlin, another Western liberal democracy, and even Singapore, folks supported leaving it up. And you'd think that folks who'd experienced more authoritarian governments and restrictions
Starting point is 00:15:26 on their speech would also be for leaving it up. But it didn't go that way. The sound's really bad. Go for it. But I understand that, of course, kill all men, that's the most feminist radical joke that you can make. This is Berhan Taye. She works for an NGO called Access Now. We defend and extend the rights of users at risk around the world. And when she was shown this photo
Starting point is 00:15:55 at the global workshop in Nairobi, which had attendees from all across the African continent, her thought was... It was very funny. And you know, many of us feminists might have said that once, and maybe twice, in our life, right? Where you're just like, you know... could we? Yeah, you know? And, like, I understand that to be a joke. So I'm like, yeah, of course there should be space for humor, and I know why satire is so important. But I'm sensing a but. What is it?
Starting point is 00:16:22 So, you know, it's, how do I put it? For me right now, you know, it's funny, but, you know, humor is a luxury. And we're not, none of us are laughing right now. So yes, we've seen content like that. It's unfortunately quite prevalent.
Starting point is 00:16:41 And, you know, we've lived through it. So it's not something that we joke about, right? What is she, what events in the world is she thinking of when she says that? Well, some very recent history. And so we're going to take a little bit of a detour here to understand why Berhan would want that kill-all-men joke taken down. And along the way, we're going to see close up really the life-and-death decisions this global court will have to make. We'll get to that right after a quick break. This is Lauren Fury from Western Springs, Illinois.
Starting point is 00:17:36 Radiolab is supported in part by the Alfred P. Sloan Foundation, enhancing public understanding of science and technology in the modern world. More information about Sloan at www.sloan.org. Science reporting on Radiolab is supported in part by Science Sandbox, a Simons Foundation initiative dedicated to engaging everyone with the process of science. Jad, Radiolab, here with Simon Adler. Yes, yes, yes, yes. Okay, before we went to break, we met digital rights activist Berhan Taye, who was opposed to leaving a joke like Kill All Men on Facebook.
Starting point is 00:18:18 That is correct. So why is that? What was she thinking? Yeah, well, I mean, it comes down to what's been going on in her home country. You look at the absurdity in Ethiopia right now. Ethiopia. There's a lot of animosity between different groups, a lot of tension. And looking at just the past four or five years there,
Starting point is 00:18:41 you see how these questions of who's punching up and who's punching down can get flipped on their head with the click of a mouse. So, to set things up, Ethiopia sits right on the Horn of Africa. It's the second most populous country on the continent. And for a long time, it was considered one of the world's leading jailers of journalists. Politically, the country used to be very authoritarian, very repressive. This is online activist turned academic Endalk Chala, an assistant professor at Hamline University. And yes, I can say that me and some of my colleagues were, like, the first people
Starting point is 00:19:18 blogging to the Ethiopian public. He was actually forced into exile because of this activism. And the way he tells it... 2015. The worst unrest in a decade. The demonstrations started as small-scale student protests. Student protests break out and they start spreading across the country. Thousands took to Ethiopia's streets over the weekend. And watching this unfold from the United States, Dr. Chala noticed that at the center of these protests was this guy, Jawar Muhammad. Yes.
Starting point is 00:19:52 Jawar himself is a very tech-savvy guy. He is articulate in English. His is a dissenting voice that is loud. There is going to be sufficient pressure on the government to break its will. And, you know, he had 1.43 million followers on Facebook. Making him as powerful as just about any news organization in Ethiopia. Now, a couple quick things about Jawar. Number one, he is from the Oromo ethnicity, the largest ethnic group in the country. And
Starting point is 00:20:27 we'll get more into that in a moment. But first, the other notable thing about Jawar is that at the time that these protests were getting underway, he was actually living in Minnesota. He was in exile there, thousands of miles away from the action. At least 75 people killed. But as these protests intensified, including clashes with police, and as they succeeded...
Starting point is 00:21:16 He's a hero! He's a hero! He was lionized as, well, a hero. One who'd helped usher in a new Prime Minister. Ethiopia has a new leader. Abiy Ahmed.
Starting point is 00:21:35 Abiy Ahmed won, ushering in a new era in Ethiopia. Since coming to power, Prime Minister Abiy Ahmed has engaged in listening to what people of the country have to say. Now, for the first time in our entire, maybe, 3,000 years of history... Again, Berhan Taye. We actually thought we could be a cohesive, united country. The government freed thousands of political prisoners and journalists, invited those
Starting point is 00:22:07 in exile to come back home, even ended a decades-long conflict with neighboring Eritrea. I mean, these changes were so profound that Ethiopia's new prime minister, Abiy Ahmed... Thanks in no small part to Jawar Muhammad. Went on to win... The Nobel Committee has decided to award the Nobel Peace Prize to Ethiopian Prime Minister Abiy Ahmed Ali. That's right, the Nobel Peace Prize.
Starting point is 00:22:57 So what you've got here is really the promise of Facebook realized, right? Like, man from thousands of miles away leverages Facebook's power to bring down an authoritarian government and elevate a peace-loving leader. I mean, this is David and Goliath level sh**. And as part of all of these reforms... I will be traveling back to the country. We have now established our office in Exager. Jawar Muhammad returned to Ethiopia and was welcomed with open arms. However, while Abiy Ahmed's reform ambitions have increased his popularity,
Starting point is 00:23:29 analysts fear that ethnic rivalries in Ethiopia will undermine his reforms. The very forces that brought this change about began pulling in the opposite direction. I'm sure you're going to get a lot of reaction for this, because everything is contested in Ethiopia. Every historical fact, everything. You see, people are confused, there is information disorder in the United States. This is just, like, child's play when you compare it with Ethiopia. But yes, the first violence that happened was in 2018. It was gruesome pictures circulating on Facebook, along with different anti-ethnic-minority sentiment. But what were the ethnic tensions, and what was being said?
Starting point is 00:24:18 Yeah, so, how complicated to get, or how in the weeds to get... Get complicated. Well, okay. So as I mentioned, Jawar is part of the Oromo ethnicity, the largest ethnicity in the country. And while the Oromo are the largest, they've also long felt politically
Starting point is 00:24:39 and culturally marginalized. And this feeling of marginalization, this resentment, this was really at the heart of the revolutionary protests that Jawar had helped lead. But Jawar, I'm just curious, are you Oromo first or Ethiopian first? I am Oromo first. I mean, many of his posts pointed directly at it. He would say the Oromo were oppressed, how the Oromo were marginalized, and that is absolutely okay with me,
Starting point is 00:25:09 because there is some historical truth to it. But he is a guy who heated up the temperature, ramped up some emotions. As I said, we are forced to fight back. To coalesce together, to come together and fight back. But now, even with the old government out of power and a new Oromo prime minister in power, Jawar Muhammad did not let up.
Starting point is 00:25:43 He kept stoking this resentment. To be honest with you, I think there is a risk of, not civil war, but catastrophic communal violence across the country. I think people have to be very careful from now on. And with this inversion of power, statements he was making during the protests sounded very different in 2018. Like even just the line: This is our land, this is our homeland.
Starting point is 00:26:10 Went from being about Ethiopians getting a corrupt government out of power to Oromos getting minorities out of their territory. And quickly the language began to escalate. He would ramp it up with, like, protect your land. Put people on notice: they are aliens, they are going to loot you. Until, eventually... October 2019. The riots began on the 23rd of October 2019 and lasted for several days. A mob took to the streets, burnt cars and killed several people they thought were their opponents.
Starting point is 00:26:50 86 people died across the country. What caused this horrific outbreak of violence? A Facebook post by opposition leader Jawar Muhammad. One evening, from his home in Addis Ababa, Jawar Muhammad posted an unsupported claim, insinuating that he was going to be killed by minorities. In his post, he called on his supporters for help. In response, some of his followers called for war. And while Jawar denies that he was intentionally inciting violence, hate flooded onto Facebook.
Starting point is 00:27:22 Content calling for the killing of all minority groups. Again, Berhan Taye. Content actually telling people, like, if your neighbor is from a different ethnic group, go and kill them. Literally, that was what we were seeing. And then everyone started to take things into their own hands. And, you know, it is minorities... Everything that could go wrong went wrong. Minorities were brutally murdered. Like, brutal, brutal, brutal violence. Minority communities being brutally targeted by the Oromo, the country's largest ethnic group.
Starting point is 00:28:14 When they tried to cut my granddaughter's breast, I took out mine and I begged them to cut mine instead. Then they stopped. But they took her father instead. And since then, the government just has not been able to get back to any sort of peace.
Starting point is 00:28:49 Five people were shot dead by police on Monday, the latest fatal shooting of civilians. And so every couple weeks, there's just another outbreak of this sort of violence. Facebook brought this change, this political change. And that is bullshit for me. I'm sorry for my speech. But that is what happened. And so, back in Nairobi, in an air-conditioned conference room where this Supreme Court of Facebook training session was underway, as Berhan was sitting there, staring down at this iPad with a photo on it that says kill all men, she's like, yeah, this has to come down.
Starting point is 00:29:34 You know, it's not a place to, you know, even give space to having a conversation about content governance and moderation when it's about humor. And Berhan was not alone in this. Many people felt that as an incitement of violence, that could result in actual harm. Again, Facebook's Brent Harris. And that is something that should not be on Facebook. And so, I think around 4 p.m., to be honest with you, I left.
Starting point is 00:30:00 She walked out of the session. Because I was just like, no, this doesn't... This does not address the issues that we're talking about today. Damn. What do we do? Because it really is a we. What do we do if the very thing that people in New York, in an ironic way, say must stay up is the very thing that makes her walk out, because it's just utterly privileged and completely ignorant of the real-life consequences of hate speech?
Starting point is 00:30:32 F***. That's... wow. And keep in mind, these are just mock trials. Training sessions, really. Like, they ran into this as they were just trying to figure out how to answer these sorts of questions. And now, we will get to some of their actual rulings and this Supreme Court itself. Yes.
Starting point is 00:30:54 But first, like, I think that the tension we're seeing here goes deeper than this one example. I mean, at the core of Facebook is this very American understanding of freedom of expression. And you hear this even in the way Facebook executives just talk about the company. More people being able to share their experiences, that's how we make progress together. You know, how many times has Mark Zuckerberg said some version of this? Most progress in our lives actually comes from individuals having more of a voice.
Starting point is 00:31:29 But when you talk to people from different parts of the world, there's not universal agreement on this. I will definitely tell you that. I found myself, oh my goodness, I was not as liberal as I thought. Again, Professor Endalk Chala. Facebook came and overwhelmed us with information. We didn't have a well-established fact-checking system.
Starting point is 00:31:54 We didn't have journalism institutions. We Ethiopians have only imported Facebook. We haven't imported the rest of the institutions and democratic foundations. The economic security around which such untrammeled freedom of expression is beneficial. So, well, 10 years ago, 8 years ago, I saw that freedom of expression and technology would help liberate us and get us out of the authoritarian system. Now, I have seen people get angry, and they will take matters into their own hands.
Starting point is 00:32:29 That's what happened. So it's about, like, a choice between coexistence or saying whatever you want to say. It comes down to that for me. And as I have seen the violence that that speech has caused, I think I would prefer coexistence. And to put that opinion in perspective here: 80% of Facebook users are not American. 80? Yeah. Really? Yeah. And content moderation... This is a very difficult task. One that's being done by people that have no freaking idea about our way of life
Starting point is 00:33:06 You know, and unfortunately it's us that are being affected over and over again with these things. I mean, is there anyone openly advocating for just abolishing Facebook? Yes, but I don't think anybody's taking that particularly seriously. But I mean, come on. Like, at a certain point, a private company becomes so potentially toxic to the very basic functioning of a decent democracy. I don't know, man. I don't know. Unless you can somehow break Facebook into a Balkanized set of internets, where each one has its own separate rules.
Starting point is 00:33:45 But I doubt that's even possible. Well, engineering-wise, it is possible. Facebook, in a few rare instances, already does employ some version of this. I spoke to Monika Bickert, who is Facebook's head of global policy, and she explained that there are certain slurs that are outlawed in specific regions but allowed everywhere else. And similarly, they do have to abide by local laws. But she did go on to say that, quote, if you want a borderless community, you have to
Starting point is 00:34:22 have global policies, and that she doesn't expect that to change. No, no, that's crazy. You're gonna have to be so astute and so aware of regional context and regional history. I just don't think that's possible. So actually, now that I'm saying it out loud, I think they should be outlawed. I don't know why I suddenly talked myself
Starting point is 00:34:45 into a very extreme position, but it suddenly seems like what other solution is there? Well, the solution Facebook has landed on is this Supreme Court. After those global workshops, they took all that feedback and created this independent structure. It's going to have 40 members. It currently has 19.
Starting point is 00:35:08 The members represent every continent other than Antarctica, and they're from just a wide array of backgrounds. Some are lawyers, others are free speech scholars, activists, journalists, even a former prime minister of Denmark. And among the first decisions they're going to have to make is whether or not former President Trump will be banned from the platform indefinitely. Facebook has currently banned him, but it will be up to the board to rule on whether that ban should remain or be lifted. And I mean, this decision won't just impact Trump.
Starting point is 00:35:50 It could very well have implications for how Facebook will deal with political figures not just in the United States, but in places like Ethiopia. Hello, hello. Hey Simon. Very nice to meet you virtually here. How are you doing? I'm good. How are you, sir? All right. And, well, making the right decisions for the entire planet seems in many ways impossible.
Starting point is 00:36:15 When I sat down and talked to several members of this court, of this board, I have to say they did make me a little bit hopeful. Thanks so much for being willing to do this. I hope we can have a little bit of fun here today. I hope so. I was, yes, I think we should make as much controversy as possible. Oh, wow. Okay.
Starting point is 00:36:34 Well, this is Maina Kiai. He's a member of the board, a former special rapporteur to the United Nations. And he's basically spent his entire life fighting for human rights. And what struck me about him right off the bat is just how un-Facebook-y he is. I haven't used Facebook or Twitter myself. Really? I'm old school.
Starting point is 00:36:57 I try to keep my private life private. Why the hell were you chosen to be on the oversight board of a product that you don't even use? Why? Yeah. Because all kinds of people have been chosen for it. I mean, that's the beauty of it, isn't it? We have all kinds of people on the board, all kinds. And he sees the solution here in the incremental progress we've made in the past. You know, look, I see this work as human rights work.
Starting point is 00:37:22 I have gone through in my life two different things around hate speech using radio, the first one in Rwanda, then in Kenya as well. The media can be abused, and then how do you rein them in? How do you mitigate them, and how do you mitigate them in a way that doesn't abuse human rights? So the tools and the problems are basically the same. The difference is that mainstream media before social
Starting point is 00:37:45 media has been regulated over time, decades and years, and that then informed and guided how the information is put out. He said, just look at the five-second delay that live television runs on now. I'm sure when it started with live television and live radio, it was quite on the go. So I think those are the questions we have to now deal with on Facebook. But I think, I think I have confidence that there is enough experience in the world that's dealt with these phenomena. And this feeling resonates with most of the people I spoke to at Facebook. I mean, I spent about 15 years working on climate before I came to Facebook. And I think the issues here are deeply analogous.
Starting point is 00:38:26 Again, Brent Harris. They are human-generated. There are major regulatory actions that are needed. There's a serious responsibility for industry to step up and think about the responsibility that they hold. And the solutions that will come forward as we start to figure out how to address these types of challenges will inherently be incremental.
Starting point is 00:38:48 And at times I worry we will kill off incremental good progress that starts to address these issues because it doesn't solve everything. You know, is the Paris Agreement enough? No. Is it a lot better than what we had before? Yes. Is the Montreal Protocol enough? No. Is it a substantial step forward against this challenge? Yes. And building this board is only one step in a wide array of many other steps that need to be taken on. It sounds to me, though, what you're saying is this is the first piece in this global governance body Facebook is imagining. Well, if it really works and people end up believing in it and thinking it's a step forward, then further steps can be taken. Nothing is ever perfect, nor is it going to be. There will be issues.
Starting point is 00:39:43 They'll criticize the specific people who are on it, but criticize the process. And I mean, when Kate Klonic, who turned us on to this story to begin with, when she interviewed Mark Zuckerberg, he said as much. It's not like the oversight board is the end. It is one institution that needs to get built as part of the eventual community governments that are the governance that I think we will end up having 10 years from now or however long it takes to build all this out. I just felt like a good concrete step that we could go take.
Starting point is 00:40:15 And what they're thinking of in terms of next steps? One would be something like regional circuits, or a level of adjudication that is more regional or more localized, that sits below this board as a means of taking these decisions. You mean like seven continental courts, or, I don't know, 52 subregional courts that feed up to the one Supreme Court? Yeah, that's right. And so what we're watching spring up here is not just a solution to what is truly one of the problems of our moment, but also this wholly new way to organize ourselves and sort of adjudicate our behavior.
Starting point is 00:41:07 Look, look, what we're trying to do is an experiment. I cannot tell you it will work, but I can tell you we'll try to make it work as much as possible. And will we make mistakes? I am an absolute lit. I have got no doubt in my mind that being the humans we are, not yet evolved into saints and angels, we will make mistakes. That's part of the process.
Starting point is 00:41:38 The oversight board started officially hearing cases in October. They've already ruled on matters ranging from whether nude photos advocating breast cancer awareness should stand to whether a post about churches in Azerbaijan constitutes hate speech. And real quick, before we go, an update, actually. Since we first reported this story, the oversight board has come to a decision about President Trump. They chose to uphold Facebook's ban, meaning, well, you won't be seeing posts from him in your timeline anytime too soon. This story was produced and reported by Simon Adler with original music throughout by Simon.
Starting point is 00:42:29 Is this original music by Simon that we're hearing right now? Simon? It is indeed. Alright. As we said at the top this episode was made in collaboration with the New York Radio Hour and New Yorker Magazine to hear more about the intricacies of how this court came into being, the rulings they've already made, and what's coming up on their docket.
Starting point is 00:42:49 check out David Remnick and reporter Kate Klonick's conversation in the New Yorker Radio Hour's podcast feed. And on that note, a huge thank you to Kate Klonick, whose tireless coverage of Facebook and their oversight board made this story possible. We'd also like to give special thanks to Julie Owono, Tim Wu, Noah Feldman, Andrew Marantz, Monika Bickert, John Taylor, Jeff Gellman, and all the volunteers who spoke with us
Starting point is 00:43:16 from the network against hate speech. Beautiful, Chad. That's great. All right. Hi, this is Claire Sabri, calling from Lafayette, California. Radiolab was created by Jad Abumrad and is edited by Soren Wheeler. Lulu Miller and Latif Nasser are our co-hosts. Dylan Keefe is our director of sound design. Suzie Lechtenberg is our executive producer. Our staff includes Simon Adler, Jeremy Bloom, Becca Bressler, Rachael Cusick, David Gebel, Matt Kielty, Annie McEwen, Sarah Qari, Arianne Wack, Pat Walters, and Molly Webster, with help from Shemelle Leigh, Sarah Sandbach, and Johnny Moons. Our fact checkers are Diane Kelly and Emily Krieger.
