Today, Explained - Mark Zuckerberg Explains Himself

Episode Date: April 3, 2018

Facebook CEO Mark Zuckerberg seldom gives interviews, but in the wake of the massive Cambridge Analytica privacy breach, he made an exception to speak with Vox's Ezra Klein. Mark tells Ezra why he's hopeful about Facebook's future before privacy advocate Marc Rotenberg tells Sean Rameswaram why he's not. Learn more about your ad choices. Visit podcastchoices.com/adchoices

Transcript
Starting point is 00:00:00 Bridget McCarthy, editor of Today Explained. Sean Rameswaram. I see you're holding three toothbrushes. Why? I am. Well, I have my brand new Quip toothbrush, which has become my favorite toothbrush. And then I have the toothbrush that it replaced, which is my old electric toothbrush. And then I have my... Analog. Analog toothbrush.
Starting point is 00:00:22 Okay. And you're going to tell us about all three. I am. Great. You are. Getquip.com slash explained. Facebook is in some deep doo-doo. As this whole Cambridge Analytica scandal continues to unfold, the company has lost almost
Starting point is 00:00:47 $100 billion of its value. It's being investigated by the Federal Trade Commission. Will Ferrell just shut down his account. Mark Zuckerberg doesn't give a lot of interviews, but he's giving one to Congress in the coming weeks. And he sat down to explain himself to friend of the show Ezra Klein on The Ezra Klein Show. When Facebook gets it
Starting point is 00:01:11 wrong, the consequences are on the scale of when a government gets it wrong. Elections can lose legitimacy in a country or ethnic violence can break out. Has Facebook just become too big and too vast and too consequential? Has it made you question any of that? Well, I think we're continually thinking through this. So certainly, you know, in 2016, we were certainly behind on preventing things like misinformation, Russian interference. And that's just a huge focus for us now going forward. You know, right now in the company, I think we have about 14,000 people working on security and community operations and review. In 2018, I think it's going to be a big
Starting point is 00:01:51 year for us. There's not only the really important midterms here in the U.S., but there are presidential elections in Brazil and Mexico and a number of other countries. And, you know, after the 2016 U.S. elections, a number of months later, there were the French elections. We spent a bunch of time developing new AI tools to find the kind of fake accounts spreading misinformation. And we took down, I think it was more than 30,000 accounts. And I think the reports out of France were that people felt like that was a much cleaner election on social media. And then fast forward to last year, 2017, in the special election in Alabama, we deployed a number of new tools to
Starting point is 00:02:32 find fake accounts who were trying to spread false news. And yeah, we were able to take them down successfully. Let me ask you about your tools to punish it, though. The upside of being able to move a national election using Facebook is very high because, you know, look, if you get caught, if you're Russia and you are executing a massive bot operation, a sophisticated one to try to move the US election, the consequences of that can be really severe. The sanctions could be tremendous and you could even imagine something like that escalating up into armed conflict at a certain level. If you do this on Facebook, you know, maybe you get caught and they shut down your bots.
Starting point is 00:03:15 But do you have capacity to do not just detection but sanction? So, yes, there are a number of things that we do. You know, it might make sense to go through three big categories of fake news. There's a group of people who are like spammers. These are the type of people who, in pre-social media days, would have been sending you Viagra emails. The basic playbook that you want to run on that is just make it non-economical for them to do that. A number of them ran Facebook ads on their webpages.
Starting point is 00:03:40 So we immediately said, okay, anyone who's even remotely sketchy, no way are you going to be able to use our tools to monetize. And that made it so that some of the efforts slowed down. The second category are state actors. So that's basically the Russian interference effort. And that is basically a security problem. You know, you never fully solve it, but you strengthen your defenses. You get rid of the fake accounts and the tools that they have for using this.
Starting point is 00:04:03 And that one I feel like we're making good progress on too. The third one, which is the most nuanced, are real media outlets who are probably saying what they think is true, but just have varying levels of accuracy or trustworthiness in what they're saying. That is actually the most challenging portion of the issue to deal with because there are quite large free speech issues. You know, folks are saying stuff that may be wrong, but like they mean it. Do you really want to shut them down for doing that? Recently this year, we've rolled in a number of changes to News Feed that try to boost in the ranking broadly trusted news sources. We've surveyed people and asked them, you know,
Starting point is 00:04:45 whether they trust different news sources. So take, you know, the Wall Street Journal or New York Times. You know, even if not everyone reads them, the people who don't read them typically still think that they're good, trustworthy journalism. Whereas if you get down to blogs that may be on more of the fringe, they'll have their strong supporters, but people who don't necessarily read them often don't trust them as much. One of the things that has been coming up a lot in the conversation is whether the business model of roughly monetizing user attention is what is letting in a lot of these problems. Tim Cook, the CEO of Apple, gave an interview the other day, and he was asked, what would
Starting point is 00:05:22 you do if you were in Mark Zuckerberg's shoes? He said, I wouldn't be in this situation. The truth is we could make a ton of money if we monetized our customer, if our customer was our product. We've elected not to do that. You know, I find that argument that if you're not paying, that somehow we can't care about you to be extremely glib. If you want to build a service that helps connect everyone in the world, then there are a lot of people who can't afford to pay. Having an advertising-supported model is the only rational model
Starting point is 00:05:56 that can support building the service to reach people. You know, I thought Jeff Bezos had an excellent saying on this. There are companies that work hard to charge you more, and there are companies that work hard to charge you less. At Facebook, we are squarely in the camp of the companies that work hard to charge you less. I think it's important that we don't all get Stockholm syndrome and let the companies that work hard to charge you more convince you that they actually care more about you. I've been thinking a lot in preparing for this interview about the 2017 manifesto you wrote, where you said that you wanted Facebook to help humankind take its next step. And you said, quote, that progress now requires humanity coming together, not just as cities or nations, but also as a global community. Then you said that Facebook could be the social infrastructure for that. In retrospect, I think a key question here has become whether creating infrastructure where all
Starting point is 00:06:51 the tensions of countries and ethnicities and regions and ideologies can more easily collide into each other will actually help us become that global community or will further tear us apart. Has your thinking on that changed at all over the past year, year and a half? Sure. The world coming closer together is not a given. Over the last few years, the political reality has been a big rise of isolationism that I think threatens some of the global cooperation that will be required to solve some of the bigger issues like maintaining peace, addressing climate change. I think a lot of these problems require people coming together
Starting point is 00:07:29 and having a global understanding. So now the question is, how do you do that? Just helping people connect by itself isn't always positive. You know, right now, a lot of people aren't as focused on connecting the world or bringing countries closer together as maybe they were a few years back. And I still view that as an important part of our vision for where the world should go to help the world move in that direction. One of the scary stories I've read about Facebook over the past year is that it had become a real source of anti-Rohingya propaganda in Myanmar and thus become accidentally part of an ethnic cleansing. Is Facebook too big to manage
Starting point is 00:08:07 its global scale in some of these other countries, the ones we don't always talk about in this conversation effectively? So one of the things that I think we need to get better at as we grow is becoming a more global company. The Myanmar issues have gotten a lot of focus inside the company. You know, we detected people were trying to spread sensational messages to each side of the conflict, telling the Muslims there's about to be an uprising of the Buddhists. So, you know, make sure that you are armed and go to this place. And then the same thing on the other side. I think it is clear that people were trying to use our tools in order to incite real harm. Now, in that case, you know, our systems detect that that's going on.
Starting point is 00:08:51 We stop those messages from going through, and hopefully we're able to prevent any kind of real-world harm there. But, I mean, this is certainly something that we're paying a lot of attention to, and we want to make sure that all of the tools that we're bringing to bear on eliminating hate speech and incitement of violence, that we're doing that in places like Myanmar as well as in places like the U.S. that do get a disproportionate amount of the attention. I think if you go back a couple years in technology rhetoric, a lot of the slogans people had that we all read optimistically have come to take on darker connotations too. The idea that anything is possible, the anything has become wider. So when you think about the 20-year time frame, what will you be looking for to see if Facebook succeeded, if it made the world a better place? Well, I don't think it's going to take 20 years. We're really idealistic, right? And when we
Starting point is 00:09:44 started, I think we thought about how good it would be if people could connect. And frankly, I just think we didn't spend enough time thinking through some of the downside uses of the tools. Now people are appropriately focused on some of the risks and downsides as well. And I think we were too slow in investing enough in that. In terms of resolving a lot of these issues, because we didn't invest enough, I think we will dig out of this hole, but it will take a few years. You know, I think human nature is generally positive. I'm an optimist in that way.
Starting point is 00:10:16 There is no doubt our responsibility is to amplify the good parts of what people can do when they connect and to mitigate and prevent the bad things. And over the long term, I think that that's the big question. I'm optimistic that we're going to address a lot of those challenges. And that when you look back five years from now, 10 years from now, I do think that people will look at the net effects of being able to connect online, that that's just a massively positive thing in the world. Not everyone is as optimistic and trusting about Facebook's future. Over in Europe, Facebook's being hit with one regulation after another. That's after the break.
Starting point is 00:11:08 Okay, Bridget, you have three toothbrushes here. Let's start with the analog. How is it? It's okay. It does the job. You know, it gets the job done. No frills, no bells and whistles. You know, when I'm just kind of tired,
Starting point is 00:11:26 and I just need to get the job done without any pleasure. I just go for the blue toothbrush. What about this puppy right here? This looks like your sort of prehistoric dinosaur electric toothbrush. Let's hear how it sounds. Oh, my gosh. Yeah, you know, now this guy was... Reminiscent of a dental drill. And it was fine.
Starting point is 00:11:44 I never questioned the noise or anything about it. I used this happily for a year until I met someone new. And then it was over for this guy, I'm sorry to say. Okay, more about that someone new after more podcast. But before that, getquip.com slash explained is the place where you can change your toothbrushing life. This is Today Explained. I'm Sean Rameswaram. Mark Zuckerberg says Facebook's gonna fix Facebook. Mark Rotenberg says he's heard that before.
Starting point is 00:12:30 I think this is where we were about, you know, 10 or 12 years ago in the early days of the Facebook advertising model, where people quickly understood what the problem was. And that was simply that Facebook could not regulate itself. This Mark is president of the Electronic Privacy Information Center. His organization lobbies for more privacy regulation. And he says that most of Mark Zuckerberg's promises about privacy sound like horseradish. I think it's a little too late at this point for Mark Zuckerberg to decide what happens to the personal data of Facebook users. And the reason is not so surprising. It's not so much a moral judgment.
Starting point is 00:13:15 I'm not saying that the company is necessarily bad. I'm simply saying that the obvious point of the company and the way it creates commercial value is by extracting every bit of information it can from its users. Mark Rotenberg says this has been Facebook's game plan since the jump. Because part of what Facebook is about is creating an illusion of a smaller community within the larger world of the internet, that you could draw boundaries and that you could share information with your friends and family members or a closer group than just the internet at large. People would provide this information, photos from vacations or, you know, funny comments about something they had just seen on Netflix, with the understanding that it was to a pre-selected group.
Starting point is 00:14:06 But what Facebook was doing simultaneously was taking that information and transferring those key points that had commercial value to advertisers. And so it was almost like being on the wrong side of a one-way mirror, being observed as you were communicating with your friends and your activities being shared with people that were to you strangers. Do you even know what the general understanding is with most users of social media as to how their data might be sold to other people? I think most users have very little understanding of how their data is collected, who has access to it, or how it's being used. Every incentive is there to dig deeper and deeper into the private lives
Starting point is 00:14:53 of Facebook users. There's nothing that would stop that process unless you put in place a regulatory system and some laws and some independent oversight. And this is why I simply stated Mark Zuckerberg can't solve this problem. You have to have an independent agency. You need laws and you need some constraints put on what Facebook's able to do with user data. So Facebook isn't just an American concern anymore. There are 2 billion global users. So who's doing the best job the world over of imposing laws on Facebook and other social media platforms? Well, I would say at this point probably the Europeans. Europe has done a fairly good job over the last 20 years updating their privacy laws with a new framework called the General Data Protection Regulation. It's a big deal in the privacy world. And the person who gives up the data actually gets
Starting point is 00:15:50 the rights. I mean, they get to know about how the information is being used. They get to limit its use. And if there's an improper use, there's some way to get some remedy. So it's very asymmetric in that way. But I think that makes a lot of sense. It's almost intuitive. It's like you give your money to the bank and the bank loses the money. The bank doesn't come back and say to you, well, you should have known that we had bad security. It's like I gave you my money. I'm thinking that it's your responsibility to manage that part of the problem.
Starting point is 00:16:20 And that's essentially what the Europeans are going to be doing going forward in the collection and use of data. What does that look like? What do these privacy protections just look like practically? Do they use Facebook differently? Does it look differently? Yes. Well, actually, there are certain things that Facebook does in the United States which are not permissible in Europe. For example, facial recognition is something that's treated very differently in Europe because the starting point, if you're concerned about privacy, is that you should decide when to disclose your identity to others.
Starting point is 00:16:51 You shouldn't have a company sifting through your photos and revealing your identity when you've chosen not to. That's a big problem with facial recognition in Facebook today. So you're talking about when I post a photo and Facebook says, is this Mark in your photo? Exactly. That kind of technology is regulated more in Europe than it is here. Precisely. Which is zero here.
Starting point is 00:17:13 Pretty much zero. Okay. But also look at the data of WhatsApp users. That was another very interesting case because you had a messaging service that was competing with Facebook and it had good privacy policies. A lot of people signed up for
Starting point is 00:17:25 WhatsApp and not Facebook. You know, Facebook eventually acquired the company. And then a very interesting question came up. Well, what happens to that data of the WhatsApp users? Because they all signed up with a better privacy policy. Are they going to have their privacy settings turned down after the Facebook acquisition? We actually went to the FTC in the U.S. and said, you have to protect the privacy interests of WhatsApp users. And they expressed a little bit of concern but didn't do very much. It's a huge battle right now in Europe. The Europeans have said to Facebook,
Starting point is 00:17:57 you can't get access to the WhatsApp user data because it would violate their privacy rights. Are there other countries or regions that are following suit? Yes. I mean, the latest count is that there are probably about 110 countries that have laws somewhat similar to the European approach. Now, there's a very different story, of course, in the US because we haven't done very much to update our privacy laws. We've experienced high levels of identity theft, data breach, financial fraud, growing concerns about cyber attack and foreign governments targeting personal data held by U.S. firms. I actually think now it's in our national interest to do more to protect privacy.
Starting point is 00:18:39 Yeah, isn't there some ransomware currently holding a bunch of data hostage in Atlanta? Yeah, it's pretty bad. I mean, you had a city pretty much shut down. I mean, obviously it wasn't just about personal data, but personal data plays a big role in all of this. And yet my guess is most Americans have Facebook running in the background of their phones at all times with access to all sorts of information. Of course they do. But at the same time, I think it's a little bit unfair to put that back on the user. It's a bit of victim blaming, you know, I mean, if you go into your house and turn the faucet and some brown water comes out, you don't say to yourself, gee, I'm so dumb. I mean, how am I allowing this
Starting point is 00:19:16 dirty water into my home? I mean, obviously someone is responsible for the management of that infrastructure. And I think those people should be held to account. The other half of my thought is that just like any other issue, the United States won't see real action on this until it's demanded by voters, by people. And I don't feel like Cambridge Analytica, now a few weeks out, has been the scandal that will lead to that kind of action. Do you? Well, I don't know. I mean, this time feels different.
Starting point is 00:19:52 Because a lot of people are still unhappy about the 2016 election. A lot of people are concerned about the general sense that data breaches are real, financial fraud is real, identity theft is real. And it's almost a perfect storm for Facebook. They're getting criticisms from a lot of different directions, which is also why you see Congress planning to hold several hearings, you know, this month. I mean, Mark Zuckerberg will be asked to testify before congressional committees. And I don't think they're planning to give him any awards this month. I think it's going to go in a different direction.
Starting point is 00:20:30 Mark Rotenberg is a professor at Georgetown Law. Shoutouts to Ezra Klein and Jillian Weinberger at Vox for hooking up that Mark Zuckerberg interview. You can hear the whole thing on The Ezra Klein Show. And one more thing for our show. We are working on a story about the gender wage gap, and we want to hear your stories. So find a quiet place and record them into your phones. Tell us your name, your job, and what the wage gap was. Email your voice memo to todayexplained@vox.com. That's our email address. Again, send your stories about unequal pay to todayexplained@vox.com. And thank you. Okay, before we go, Bridget, tell us about your Quip toothbrush.
Starting point is 00:21:32 Let's hear it first. Yeah, okay, so I don't know if you remember the loud buzz. I sure do. Unforgettable. I'm still scarred. Yeah, so listen quietly. Listen carefully. This is the Quip toothbrush.
Starting point is 00:21:42 I can't even hear it. Is it on? Oh, my gosh, it is. Yeah, so it's just this very gentle hum. Does it still do the job with the tape? It absolutely does. Yeah, it actually does it better. Okay, getquip.com slash explained. Get a new toothbrush.
