The Daily - Thursday, March 22, 2018

Episode Date: March 22, 2018

Five days after details about Cambridge Analytica’s mining of data were made public, Mark Zuckerberg, the chief executive of Facebook, broke his silence on his company’s role in the data breach. Minutes after posting a statement on Facebook, he spoke with The New York Times. Guest: Kevin Roose, a business columnist for The Times. For more information on today’s episode, visit nytimes.com/thedaily.

Transcript
Starting point is 00:00:00 From The New York Times, I'm Michael Barbaro. This is The Daily. Today, five days after the story of Cambridge Analytica broke, Mark Zuckerberg finally breaks his silence about Facebook's role in the data breach and talks to The New York Times. It's Thursday, March 22nd. It's no exaggeration to say that Facebook is in crisis, and that is a big statement.
Starting point is 00:00:44 There's a crisis at Facebook that is intensifying after the company failed to protect user data. The hashtag delete Facebook began trending after news of the data leak became public. People were upset. Their profiles were sold to a company that helped the Trump campaign. Some of their anger targeted at Facebook's Mark Zuckerberg. He has been silent on the data leak. If you look at Zuckerberg, he's MIA. When was the last time we heard Zuckerberg talk about this? It's a complete and utter failure of leadership. CEO Mark Zuckerberg is still missing in action.
Starting point is 00:01:09 We're all asking where in the world is Mark Zuckerberg in this moment of crisis? He's got to make a clear statement and soon. Surely it's time for him to drop this pie-in-the-sky, fix-the-whole-world dreaming and fix his company. On Wednesday afternoon, minutes after breaking his silence by posting a public statement on Facebook, Mark Zuckerberg got on the phone with my colleagues, Kevin Roose and Sheera Frenkel. Kevin, how did this interview come to be?
Starting point is 00:01:40 We had been asking Facebook for days about how they were going to respond. And we were just hearing nothing back. I mean, they were saying, you know, something's coming, something's coming. This is very unusual for something to go, I think it was five days, without any kind of public comment from either Mark Zuckerberg or Sheryl Sandberg, the COO of Facebook. And I had sort of given up on hearing directly from the company. And then this afternoon, I was sitting on my couch, and I got a call that says, can you talk to Mark Zuckerberg in five minutes?
Starting point is 00:02:20 In five minutes. Yeah, this was not the most heads up I've ever had for an interview. But my colleague, Shira Frankel, and I jumped on the phone and called into a conference line, and there was Mark Zuckerberg. And it's a pretty rare thing. I mean, this is not a guy who gives a ton of interviews and is usually very tightly controlled in what he says to the public. So how do you start the conversation with him? What's your first question? Well, our first question is, can we have a few minutes to read this post that you just put up on your Facebook page? So we literally sat there for a few minutes on the phone just with dead air
Starting point is 00:02:58 while Shira and I furiously read his Facebook post. Okay, so having read what Zuckerberg wrote on Facebook, what did you most want to ask him on the phone? So we asked him first about the reaction and why it had taken so long for him to make any kind of public statement after this news came out about Cambridge Analytica. And he said, I really wanted to make sure
Starting point is 00:03:22 we had a full understanding of everything that happened. We wanted to be able to fix the system, or at least try to fix the system. We wanted to have a plan in place to make it harder for developers to access this kind of data. And we wanted to develop a tool, we wanted to give people a way to know whether their data was shared with Cambridge Analytica. And he said there's a question of, it's not exactly that easy because not only did this app collect the data of the people
Starting point is 00:03:52 who installed it, but it collected the data of all of their Facebook friends, too. So in order to determine whose data was exposed, you have to rebuild people's Facebook networks as they looked in 2013 or 2014 when this was going on. And people could have deleted friends, people could have stopped being friends with somebody, so it might not be actually all that feasible. Exactly.
Starting point is 00:04:15 And he said, you know, we're going to be conservative on this. We're going to try to tell anyone whose data was affected, even if we don't know for certain that it was. And what are you making of this plan he's laying out to kind of fix this data breach? I think that the most important thing that they're doing is limiting the amount of data and the number of developers who can get access to data.
Starting point is 00:04:37 So for a while on Facebook, they were really, really permissive about what they would allow people to get who were building apps on top of Facebook. So if you built an app, a quiz, which Harry Potter character are you, and it plugged into Facebook and you got people to install it, you could get a tremendous amount of information
Starting point is 00:04:58 about those Facebook users through that app. And they did actually tighten this up back in 2015. They made it a little bit more challenging for app developers to get that kind of data. But before then, it was really a sort of free-for-all. That resulted in a lot of developers having access to a lot of people's data. And the hard thing for Facebook now
Starting point is 00:05:22 is that they don't know where all that data is. They don't have it anymore. It's with these app developers. What Zuckerberg said was that they are going to do an investigation, not only of the apps that are currently asking for a lot of people's data on Facebook, but the apps that previous to this 2015 change that asked for a lot of people's Facebook data and investigate how that data was used and collected
Starting point is 00:05:49 and try to figure out if there are any more Cambridge Analyticas out there. That sounds like a huge amount of work to investigate thousands of apps that collected this kind of data. Yeah, it is. He said it would be thousands of apps and that they were going to have to hire additional people probably to do this.
Starting point is 00:06:08 But I think he knows that there is this sort of amount of data out there that they can't really control. I mean, they can ask developers to delete it after the fact, but developers are under no obligation to delete the data that came out of Facebook that people shared with them willingly. And it feels like one of the lessons of Cambridge Analytica from our colleague Matt Rosenberg is that even when a company says that they have deleted this kind of data, there's not necessarily a guarantee that they have. Right. I mean, Cambridge Analytica did certify,
Starting point is 00:06:42 they sent a form back to Facebook that said we've deleted this data. That reportedly was not true. And so I think they're trying to work out a new sort of system for making sure that people's data is not being shared against their will. You actually asked Zuckerberg about Cambridge Analytica and whether they were honest about what they did with this data.
Starting point is 00:07:05 And he seemed to acknowledge to you that the company had lied to Facebook, right? Yeah, he said it seems that that was false. They haven't done a full investigation yet, so they don't actually know, but they have read reports, including the ones that ran in the New York Times last week, that journalists have seen this data, and that, he said, was a strong enough signal for them to take action.
Starting point is 00:07:30 So after talking about Cambridge Analytica for a while, what do you guys ask Zuckerberg about? So we asked him about all of the other threats to Facebook. So Russia and foreign interference in the midterms later this year, and what they were doing to prevent the platform from being exploited again. And he said, you know, we're working on this. We have built some tools. We've seen that they work pretty well. And then he broke some news. He said for the first time that in December of last year, when the special election in Alabama happened, they
Starting point is 00:08:06 discovered that some Macedonian accounts were trying to spread false news in the lead up to that election, and that they were able to eliminate those. And in addition to talking about the Macedonian interference, he said something really interesting about sort of false news as a category. interference. He said something really interesting about sort of false news as a category. And he made a comparison to spam. He said a lot of this is done by the same people who would have sent you emails about Viagra in the 90s. And now they're trying to come up with ways to push false and sensational news into your Facebook feed to try to get you to click on it so they can sell you ads. And I think that's partly true. I think that part of what motivates people to put sensational news on Facebook is this kind of spammy profit motive.
Starting point is 00:08:53 But there's this other part of it that I don't think he addressed, which is that a lot of people are doing this for motivated political reasons, not necessarily because they're trying to make money. Right. Viagra never swayed an election, as best we can tell. Not that we can tell, right. So is he underplaying the kind of nefarious ambition of fake news versus the economic motives of spam? Yeah, I think there's a real political motive here that is really hard to defend against, because if people aren't motivated by profits,
Starting point is 00:09:26 then taking away the tools that allow them to make money isn't going to stop them from doing what they're doing. There's a moment in this interview, and I read the transcript of it, that I loved. And it felt very telling, where Zuckerberg basically says to you, look, when I made Facebook in my dorm room at Harvard in 2004, I never saw any of this coming. Yeah. So I asked him, because I've been curious about how he feels, not just what he's doing, but I mean, to be the person who built this enormous platform that is now changing the fabric of society in ways that are good and sometimes really bad. But in general, I sort of said,
Starting point is 00:10:08 well, do you feel any guilt about how Facebook is being used all over the world? And he paused for a really long time. And then finally, when he spoke up, he said, you know, we're doing something here that has never been done before. And there are challenges that I don't think anyone had anticipated. And then he said, you know, if you'd asked me in 2004, when I was in my dorm room at Harvard, if someday I'd be working on preventing
Starting point is 00:10:40 governments from interfering in each other's elections on Facebook, there's no way that would have been predictable. There's no way that I would have said, yes, that's what I'll be doing. And I thought that was really interesting and introspective and true. I mean, it was the first time that I'd really seen him step back and say, this was not always the point. This was not always what we bargained for. Now that we're here, we're doing our best to address this, but this was not something that anyone saw coming. Yeah, I love the actual question you asked him. You said, do you feel like the basic economic model of Facebook,
Starting point is 00:11:15 in which users provide data that Facebook uses to help advertisers and developers to better target potential customers and users, advertisers, and developers to better target potential customers and users. Do you feel like that works, given what we now know about the risks? Yeah, and that was sort of my way of saying, like, hey, how's this all going to work? Because that feels like the essential question. That's certainly my essential question, is not only is this working, but can this work? Is Facebook set up to work in this kind of environment?
Starting point is 00:11:49 And he said, basically, look, we're trying to build something for everyone in the world. And if you want to build something for everyone in the world, it's got to be free or really, really cheap. Because not everyone can afford to pay for a social network. So if you have a business model that is supported by ads, you can do that. If you don't, then maybe you can't. And he said, I don't think the ad model is going to go away. But he said that they had thought over time about giving people some way to pay for Facebook if they can afford it, and maybe not have an ad-based model. But he said, basically,
Starting point is 00:12:25 the ad-based model that we have now is not going to go away because it's the only way we can get our service to everyone in the world. In a sense, what he's saying is the economic model of Facebook is almost by definition a little vulnerable if you're just a regular person on there doing what you do, liking things, not necessarily thinking about what's going to happen to that data. So what's your response to that? In light of Cambridge Analytica, in light of Russian meddling, is Facebook essentially in an impossible situation? Yeah, I think they have one of the hardest tasks in the world, which is that they have built a thing that has far, far outstripped their original intention for it.
Starting point is 00:13:05 It's grown bigger than anyone thought or predicted that it would grow. And now they have to rein it back in without changing the things that made it one of the most profitable companies in the world. They're really being asked to do the impossible here, which is to build a secure, privacy-respecting platform that is locked down and unexploitable and also make billions and billions of dollars for their shareholders.
Starting point is 00:13:30 That basically does not seem feasible. Yeah, I can't make that math work. Maybe someone can, but for me, that feels like an impossible situation. So do you think one of the reasons Facebook has been slow to respond to all of this is that they sort of were in denial about the reality that they were going to have to find a way to regulate themselves if they didn't want someone else to come in and regulate them. They were seemingly daunted by what that meant.
Starting point is 00:13:58 I think regulation is a scary prospect for any large tech company. is a scary prospect for any large tech company. I think it's especially troubling for Facebook, which has been trying to avoid being regulated for a very long time. I think there's a chance, though, that this becomes a real business issue for them. Because in a sense, their product is their users. Their business model is not to charge people for access to a
Starting point is 00:14:26 website, it's to charge advertisers for access to their users. And that, I think, puts them in a very different place than a company like Apple or Amazon that actually has this sort of direct relationship with their customers. Okay, so all this work that Facebook is now doing since the Times reported on Cambridge Analytica, investigating thousands of apps, contacting tens of millions of people, it kind of speaks to just how challenging it is for a massive social network company like Facebook
Starting point is 00:14:56 to try to self-regulate. Yeah, I think that's true. And there's a question of, do they want to self-regulate? Which I think they would say that the answer is yes. I mean, they would rather regulate themselves than have someone else from the outside regulate them by far. But the more pertinent question, I think, is can they regulate themselves? Are they too big to be able to keep track of all of the things that are being built on and around and inside Facebook and make sure that those things are treating their users' data
Starting point is 00:15:27 with the proper respect and privacy. So did you leave this phone call feeling like you'd learned anything new about how he's thinking about all this? Not really. I mean, I think he understands the challenge and the magnitude of what he's being asked to do. But at the same time, there is really not that much they can do
Starting point is 00:15:51 to prevent all of this data that they have unleashed into the world. You can't really put that back in the bottle. That is out there. That is available. There are now places on the dark web where reportedly people can buy and sell this kind of data. So they may be able to fix the problem going forward,
Starting point is 00:16:13 but it's the amount of data that has already been exposed and the amount of damage that's already been done that I think is going to be a much bigger problem for them. The thing that they're promising to investigate is whether some of this data has fallen into the wrong hands or whether it's not being used for its intended purpose. But this is a monumental task for them. I mean, this is not something that would be easy for any company. And to the larger metaphor, like, you know, in those cartoons when they're like building
Starting point is 00:16:41 the railroad tracks, like right in front of the train as the train's moving. It feels a little bit like that. Like they are in totally uncharted water here, and they're getting ready for the midterms and elections all over the world. And one problem after another keeps coming up. And so they're trying to solve all that, but they're also trying to solve all the problems from four years ago at the same time that they're preparing for what's coming next. And so, you know, it's a hard job and I wouldn't want to do his job. Thank you, Kevin.
Starting point is 00:17:12 Thank you. In his statement posted on Facebook on Wednesday, Zuckerberg stopped short of a full apology for the misuse of user data by Cambridge Analytica. Later in the day, after his interview with The Times, Zuckerberg spoke to CNN. So this was a major breach of trust, and I'm really sorry that this happened. We have a basic responsibility to protect people's data, and if we can't do that, then we don't deserve to have the opportunity to serve people.
Starting point is 00:17:49 We'll be right back. Here's what else you need to know today. President Trump plans to announce at least $50 billion worth of annual tariffs and penalties against China today, in retaliation for what the White House says is its theft of technology and trade secrets. Unlike the president's steel and aluminum tariffs, which divided his aides, business leaders, and Republican lawmakers,
Starting point is 00:18:22 the latest penalties are expected to be far more popular because of growing outrage over Chinese policies that have forced U.S. technology companies to hand over corporate secrets in order to operate in China. And I think you all are aware and our community is well aware that it has been a long, almost three weeks for the community of Austin as we have dealt with package bombs and other types of bombs that have been placed throughout our community. The hunt for one of the most elusive serial bombers in recent decades ended on Wednesday when 23-year-old Mark Anthony Conditt, knowing that he was about to be apprehended, drove into a ditch and blew himself up. As members of the Austin Police Department SWAT team approached the vehicle,
Starting point is 00:19:14 the suspect detonated a bomb inside the vehicle, knocking one of our SWAT officers back and one of our SWAT officers fired at the suspect as well. Police had zeroed in on Condon after discovering surveillance video showing a man in a disguise dropping off packages at an Austin area FedEx store and driving off in a 2002 Ford Ranger. He was seen wearing pink construction gloves that investigators determined could be bought at Home Depot and matched the suspect to a man captured on surveillance at a nearby Home Depot location. By the end of the day Wednesday,
Starting point is 00:19:55 police had also discovered a 25-minute confession left on Condit's phone. Condit's homemade explosions, beginning in early March, left two people dead and injured four others. That's it for The Daily. I'm Michael Barbaro.
Starting point is 00:20:17 See you tomorrow.
