The Daily - Wednesday, Mar. 21, 2018

Episode Date: March 21, 2018

A young Canadian data expert came up with a plan to harvest people’s personal data from Facebook, and to use that information to influence their voting. How did the brains behind Cambridge Analytica... become its whistle-blower? Guest: Matthew Rosenberg, a New York Times reporter in Washington. For more information on today’s episode, visit nytimes.com/thedaily.

Transcript
[00:00:00] From The New York Times, I'm Michael Barbaro. This is The Daily. Today: it was a young Canadian data expert who came up with the plan to harvest people's personal data off Facebook and use it to influence their voting behavior. Why the brains behind Cambridge Analytica is now the whistleblower. It's Wednesday, March 21st.
[00:00:38] Matt, when did you first get introduced to Christopher Wylie? So I was in London in August, and a reporter wanted to introduce us to this kid, Christopher Wylie, who had been one of the people who helped found Cambridge Analytica. Matt Rosenberg was one of the reporters who broke the story of Cambridge Analytica. We met in East London. I was staying at this hotel, this kind of hip hotel in East London. And in walks this guy. He's this skinny, sort of hipsterish, twenty-something East London kid, a vegan from Canada. He's wearing a beanie from the Obama 2012 inauguration. And he's coming there to tell us all about how he helped found this firm that was
[00:01:18] supposed to be the arsenal of weapons in Steve Bannon's culture war to reshape America into a right-wing nationalist country. And what was interesting about this guy's story? Why did you care what he had to say? So, you know, by the summer of 2017, we all knew Cambridge Analytica had worked for the Trump campaign. We also knew that there had been Facebook data somehow used by this company. But what Chris had to say was that it was tens of millions of profiles. It was a scale and a scope of data harvesting that we did not know. So Chris understood how Cambridge Analytica had used Facebook in a way that hadn't yet been disclosed or understood by the public.
[00:02:02] Yeah, because he was integral to it. So you're at this bar in London with Christopher Wylie. What's the story that he tells you? I kind of tell him to start at the beginning. He is a high school dropout from British Columbia in Canada. He left high school after suing the school district over inclusion issues. Chris is gay, very openly so. After dropping out, he drifted to Ottawa, where he
[00:02:26] started working for the Liberal Party in Canada. And there he met Ken Strasma, who was Obama's big data guy in 2008. And Ken kind of became his mentor and taught him about data science and using it for elections. After a while, though, he kind of wanted to go back to school. He gets into the London School of Economics and heads over to London. But while he's there, he has this seeming kind of passion, or compulsion, to do political data. And, kind of like being in the mob, he got sucked back into the data world and ends up with this company called SCL Elections. Now, SCL Elections is part of a bigger company called the SCL Group, which does a lot of defense and intelligence contracting. It's like a research communications company. And SCL Elections, the elections division, is just a small part of this company, but it's run by this guy, Alexander Nix, who is this very posh Brit. And Alexander Nix really wants to get a bigger chunk of the American political market. He wants to break into the U.S.
[00:03:25] He really wants to break into the U.S. And Nix is on a flight from London to New York and has this completely happenstance meeting with some people who work for Herman Cain. The Republican presidential candidate. Exactly. And through them, they get into the world of the kind of far right of the Republican Party. And that eventually leads Nix and Wylie
[00:03:45] and the small data team they've assembled to Steve Bannon and Robert Mercer. The kind of kings of American conservative politics. Yeah. Nix and Wylie get to know the Mercers and Steve Bannon, whose Breitbart empire is exploding and who's now setting up Breitbart UK and really wants to reshape American politics. And how are they going to do this? So they've got this idea that they're going to take a new field of study called psychographic research, and they're going to sell the Mercers and Bannon on this idea of creating a political data firm. And instead of going out and polling people on issues, like, what do you support on gun control? Or what do you think about taxes? You're going to go out and you're going to figure
[00:04:28] out the actual personalities, what emotionally drives each American voter, and you're going to tailor messages to them. And through that, you're going to be able to predict their voting behavior, you're going to be able to influence them. It's that whole idea, that Breitbart idea of politics being downstream of culture: that if you want to change the politics, you have to change the culture, and that psychographics is the key to doing so. And Matt, where did this idea of psychographic research come from? This had originated a few years earlier at Cambridge University. They had something called the Psychometrics Centre, which is dedicated to this idea of using big data to kind of map personalities. So Chris had been hanging around this center. And at that center, a graduate student had developed this technique to kind of
[00:05:09] predict personalities by looking at Facebook likes. And this grad student had done this study where he had gone through Facebook and harvested likes. And at this point, Facebook was wide open. Likes by default were public. So scraping them was very easy. And he found that with an average of 68 likes, he could predict amazing things, like your skin color with 95% accuracy, your sexual orientation with 85% accuracy. You know, these guys had developed this technique that they claimed could reveal a tremendous amount about people's personalities. You give them 70 likes and they would know you better than your friends. At 150, better than your parents. At 300, better than your own partner.
[00:05:46] So this research said that through 300 likes, you could be known by an outsider better than your own spouse. Pretty much. You know, after kind of harvesting all this Facebook data, they would unleash the algorithms that had been developed and kind of create all kinds of correlations. So they'd find, like, let's say you liked Hello Kitty,
[00:06:03] the Hello Kitty brand. They found, with what they claimed was a lot of accuracy, that you were probably a very open-minded person, but not particularly conscientious. If you like the Wu-Tang Clan, you're a heterosexual male, so guilty as charged. And, you know, they claimed people's personalities could be simply measured through Facebook likes. That's fascinating. Yeah. Chris sees this and he's like, well, we can do this, you know? At the same time, he's kind of hanging around this center learning about this.
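To make the technique described here concrete: the sketch below trains a simple classifier to predict one personality trait from a binary matrix of page likes, then reads the learned weights as the kind of like-to-trait correlations Matt mentions. Everything in it is hypothetical: the pages, the synthetic data, and the choice of scikit-learn's logistic regression are illustrative assumptions, not the researchers' actual model.

```python
# Illustrative sketch only: predicting a personality trait from page likes.
# The pages, data, and model choice are assumptions, not the real study.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
pages = ["Hello Kitty", "Wu-Tang Clan", "NASCAR", "TED Talks"]
n_users = 1_000

# Each row is one user; 1 means the user liked that page.
likes = rng.integers(0, 2, size=(n_users, len(pages)))

# Hypothetical label: is this user high in "openness"? In the real study,
# labels came from a standard personality questionnaire, not from Facebook.
openness = (likes[:, 0] + rng.normal(0, 0.5, n_users) > 0.5).astype(int)

model = LogisticRegression().fit(likes, openness)

# The learned weights play the role of the claimed correlations: a large
# positive coefficient means liking that page predicts the trait.
for page, coef in zip(pages, model.coef_[0]):
    print(f"{page:>12}: {coef:+.2f}")
```

On data like this, the Hello Kitty coefficient comes out strongly positive, which is the shape of the claim in the interview; the real study's data, models, and accuracy figures are as described above, not reproduced here.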
[00:06:32] SCL is working with the Mercers on a small pilot project to kind of test psychographics in the Virginia 2013 gubernatorial race. You had Ken Cuccinelli, the Republican, who they were backing, running against Terry McAuliffe, the Democrat, who eventually won. They weren't using any of the Facebook data for this.
[00:06:49] They were trying to use polling data and other kinds of data to measure personality. The results were ambiguous, but promising enough that the Mercers said, you know what, let's go ahead. We're going to try and create a new company here. That focused on the idea of psychographic research. Exactly. And then by June of 2014, the Mercers had put in about $15 million. You know, Steve Bannon was on the board, one of the co-founders. And these guys suddenly had a problem, because it was easy enough to get that kind of polling data and all kinds of other data points you needed to kind of measure personality in some small Virginia counties.
[00:07:23] You know, they'd only tested two counties in Virginia. Getting that kind of data on a national scale, they just didn't have the resources to do it, or the time. But Chris, because he'd been hanging around Cambridge, knew that there was a data set out there that could work to measure personality. And that was that Facebook data that the young PhD student had been playing around with. The Psychometrics Centre, though, wouldn't play ball. And give that data over.
[00:07:47] No. So they found a professor familiar with the research, also at Cambridge University, an assistant psychology professor named Alexander Kogan. He's a Russian-American. And he agrees to do it. He agrees to kind of recreate the data set. So here's how it worked. People would take a personality quiz.
[00:08:05] And I think it's really important to note: when listeners hear "Facebook quiz," they're thinking of some kind of quiz like "What kind of root vegetable are you?" or "Which Game of Thrones character are you?" Right. That's not what was going on here. They're told that this is for academic research, and that they'll get paid a small sum to do it. A very standard personality quiz. They would then log into Facebook and download an app. And for taking this quiz, they would get a small payment code. That's how they would get paid.
[00:08:29] Meanwhile, the app would scrape their data and all their friends' data, and that was, you know, basic bio data and what they had liked on Facebook. The researchers would use the answers that these people had provided on their personality quiz, plus their Facebook data, to kind of create the algorithm that tells you if you're going to like the Wu-Tang Clan or, you know, Hello Kitty or whatever. And then once you felt that algorithm could predict that kind of thing with enough accuracy, you would then unleash it on the tens of millions of friends' profiles you described, for which you have no answers. You don't have the personality quiz answers.
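What's being described is a two-stage pipeline: fit a model on the quiz-takers, for whom both likes and questionnaire answers exist, then score the scraped friends, for whom only likes exist. A hedged sketch of that shape, with synthetic, scaled-down data; only the workflow follows the account here:

```python
# Hedged sketch of the two-stage harvest-and-predict pipeline described
# above. All data is synthetic and scaled down; only the shape of the
# workflow follows the account in the interview.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n_pages = 500  # stand-in for the universe of Facebook pages

# Stage 1: the quiz-takers. For them we have both page likes and a trait
# label derived from the personality questionnaire they answered.
quiz_likes = rng.integers(0, 2, size=(5_000, n_pages), dtype=np.int8)
quiz_trait = rng.integers(0, 2, size=5_000)  # stand-in questionnaire scores

model = LogisticRegression(max_iter=1_000).fit(quiz_likes, quiz_trait)

# Stage 2: the scraped friends. Tens of millions of profiles in reality
# (20,000 here), with likes but *no* questionnaire answers. The fitted
# model fills in the missing trait scores anyway.
friend_likes = rng.integers(0, 2, size=(20_000, n_pages), dtype=np.int8)
friend_scores = model.predict_proba(friend_likes)[:, 1]

print(friend_scores[:5])  # inferred traits for people who never took the quiz
```

The asymmetry is the point: consent and ground truth are collected only at stage one, but the predictions at stage two cover everyone the app could reach.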
[00:09:01] Almost all this harvesting of data was done in June and July and a little bit of August 2014, and they came up with about 50 million profiles. Wow, 50 million. Yeah. So this system seems to exploit the fact that you can get permission from just a few people, and once you're in their world, you can then gather up all the information you need from their friends, which is the whole notion of Facebook anyway. Exactly. Now, I think in all fairness to Facebook, while this was going on, they were already tightening up their permissions. In April 2014, new apps were no longer allowed to scrape friends like this.
[00:09:37] But any apps that already existed, and Kogan's did exist, were grandfathered in for another year. So this was totally allowed by Facebook at the time. You know, they could go in and take all this data. The thing is, you weren't supposed to give it to third parties. You definitely were not supposed to say you were doing it for academic research and then hand it over to a political data firm that also has a commercial arm. That's a problem.
[00:09:59] It's also a problem, you know, for the researcher, who is an academic. You know, there's not an academic study on Earth that allows you to deceive people. What do Chris and Cambridge Analytica ultimately get out of this? What do they learn by doing this research? They're getting the data they need to build the tools to get a psychological picture of tens of millions of Americans. And so they're testing messages off this. They're saying, OK, we've got neurotic voters here. Let's show them a wall. Let's show them people sneaking over a wall. Like a wall with Mexico. Yeah. And Chris showed me some of these slides. Some of them are like, you see people coming in over a wall and it says, like, there's more than one kind of national security leak. Stuff like that. How can we appeal to people? How can we appeal to their fears? Most of it was fear-based, from what I saw. There wasn't a whole lot of hope-based messaging. So if you fit a particular kind of personality profile, Cambridge Analytica is going to start to build messages and you're going to get served a particular kind of ad based on
[00:10:55] that. Exactly. It's so interesting, this idea of using social media to tap into people's kind of primal desires and fears. It's the same tactics that we now know Russia used in the 2016 elections in its own effort to influence the results and to try to hurt Hillary Clinton and favor Donald Trump. Yeah. We'll be right back. So Chris starts telling you this whole story back in August. Why do you think he came forward? He's being called a whistleblower now,
[00:11:48] but this whole thing was, in many ways, his idea, it sounded like. So I think Chris was really wrestling with all this. He had joined up as a 23- and then 24-year-old kid who was really into this idea of playing with psychographic research to kind of get to people and alter their views. He was doing a PhD in kind of fashion and fashion marketing. I remember the name was Mentalware.
[00:12:10] It was the name of his PhD thesis. And he wanted to use the psychographic research to try and predict kind of consumer fashion choices. And he quipped to me, I remember sitting there, that, you know, he wanted to be in fashion, not fascism. And I think he really saw the Bannons and Mercers and their ideas of the world as leading to fascism in a way that he found deeply abhorrent. He left at the end of 2014. So I think by then he knew what he was getting into,
[00:12:35] what he'd gotten into. They had dealt with a lot of pretty fringe right-wing American candidates. One of his colleagues told me about a moment when a bunch of evangelical Christians showed up, and this colleague was also gay, and they wanted to know whether the psychographic research was good for identifying homosexuals so they could message them and try to convert them to heterosexuality. So I think these guys knew who they were dealing with here.
[00:12:58] What happens after Chris tells you this story? What are the big questions for you? And where did you go next with the story? So Chris said he had a whole bunch of documents he wanted to show us. We wanted to see those documents: the documents that showed the scale and scope of the data harvesting, that showed Cambridge Analytica's involvement in it, and also documents that showed what Facebook knew and what it didn't. And what he's telling me is there are documents, including letters from Facebook, seeking to make sure this data had been deleted. And I'm like, wait, so Facebook knows that this amount of data has been harvested and it's never said a word publicly? Because if they're asking for the data to be deleted, presumably they knew it had been harvested.
[00:13:44] And we now know Facebook shut down the app, and they started investigating. And throughout the first half of 2016, they were chasing down Kogan. They were chasing down Cambridge Analytica. They were chasing down Chris and others to try and get that data deleted. In 2017, when reports started coming out with some figures about the data that was out there, Facebook would still just say, you know, it's taken care of. It's fine. Nothing to look at here. And suddenly we're sitting with this guy who's got these documents that say, well, actually, there is something to look at here.
[00:14:08] There's a lot out there. So these documents are making clear that Facebook knew this information had been taken, and that it had been taken without permission. And from everything you can tell, they did not alert the people it was taken from. Yeah, no, they did not alert them. And one of the things about this data set: if this psychographic research actually works, it's not something you need to update every few years. You know, people's positions on gun control or taxes may change, but if you're neurotic at 24, you're neurotic at 44. Maybe you've moderated your personality, but personality is a constant. So the data is good for decades, presumably.
[00:14:44] Hmm. So this is, you're arguing, kind of a lifetime's worth of psychographic information about tens of millions of voters. Yes. So do you go to Facebook with this information? Of course. I mean, we went to them more than a week before the story was supposed to publish. And Facebook, knowing this story was going to come out, decided to preemptively put up their own statement and say, you know, we're going to come clean, this happened, and we're trying to get to the bottom of it, and really present themselves as proactive. When, in fact, they hadn't been that proactive. It's hard to see somebody as proactive when they're putting out statements about what they're doing to fix something less than 24 hours before they know a story that includes all these details is going to be published. Which seems more like saving face, potentially, than actually fixing it. Yes.
[00:15:41] Matt, what's happening now? What's been the result of all this reporting and Chris's decision to come forward and talk? All hell sort of broke loose for Facebook. You know, their stock has lost nearly $50 billion in value in the last two days. They've got lawmakers in both the U.S. and Britain calling for Mark Zuckerberg to come testify. They've got state attorneys general in New York and Massachusetts opening up investigations. And I think for them, it's becoming this real crisis of,
[00:16:10] you know, what kind of company are we? How do people see us? And what can we do to fix it? You know, is this their business model? I think people, you know, have thought of Facebook as a fun place. You know, you throw your kids' pictures up there. You're kind of like a country star or a hip-hop guy.
[00:16:26] And you tell people about your life. And what they're realizing now is that when you tell people about your life, they can try and manipulate you. They can look for ways to find your vulnerabilities and sell you something, whether it's, you know, a new jacket or a political idea. And I think that's something that scares a lot of people. You know, I don't think people are used to living in a world where there's massive surveillance, and people think of surveillance as cameras and people listening to your phone, but mostly it's putting your stuff on Facebook and letting a giant company know about it. And I think this is one of those stories that helps bring that home to people. It's not the only story.
[00:17:01] This is not a new idea, but it is one of those moments where people are thinking about that, and for a company like Facebook, it does represent a real crisis. At the same time, Cambridge Analytica is reeling. They had our stories, and those were immediately followed up by Channel 4 News in Britain, which, totally independent of us, had done its own investigation, sending one of its reporters posing as a prospective client from Sri Lanka who wanted to hire Cambridge to kind of help his political allies win elections in Sri Lanka. And they have Alexander Nix, the CEO, on a hidden camera, telling this fake client that, you know, well, we can go after your rivals. We can use people to pass bribes to them or send women to entrap them, and we can secretly film it all
[00:17:46] and then put it on the internet. That's a pretty remarkable thing for the CEO of a company that was hired by the winning candidate in the U.S. presidential election. Yeah, he was suspended today. So the CEO of Cambridge Analytica is no longer in charge.
[00:18:00] Yes. What do you think the next questions are in this story? What are you interested in now? So, you know, what Cambridge Analytica did, it turns out, wasn't very hard. It didn't take any kind of special permission, especially before 2014 and 2015. So how many other people were doing it? How many of them had more time, had more money, and were just better at it? And how much do they
[00:18:25] know about us? Yeah. Matt, thank you very much. It's my pleasure, man. Here's what else you need to know today. I had a call with President Putin and congratulated him on the victory, his electoral victory. During a news conference at the White House on Tuesday, President Trump described a warm telephone conversation with Vladimir Putin about his overwhelming re-election as Russia's president.
[00:19:07] Trump did not ask Putin about the fairness of the vote, reports that Russia poisoned a former spy in Britain, or Russia's meddling in the 2016 election. Did the president not raise the issue of Russian election meddling in that phone call? I don't believe it came up on this specific call, but it is something that we've spoken extensively about and continue to look at ways and steps forward to make sure it never happens again.
[00:19:35] Instead, the president focused on what the White House called shared interests, including North Korea, Ukraine, and the escalating arms race between the United States and Russia. We had a very good call, and I suspect that we'll probably be meeting in the not-too-distant future to discuss the arms race, which is getting out of control. And a second woman claiming to have had an affair with President Trump filed a lawsuit on Tuesday seeking to be released from a legal agreement requiring her silence. The woman, a Playboy model named Karen McDougal, says she was
[00:20:13] misled by the company that owns the National Enquirer, which paid her $150,000 in return for not discussing the relationship. Her lawsuit comes two weeks after Stephanie Clifford, an adult film actress, challenged a similar confidentiality agreement. Both women claim their contracts are invalid and that they should be free to describe their experiences with the president. That's it for The Daily. I'm Michael Barbaro. See you tomorrow.
