Today, Explained - Cambridge Analytica

Episode Date: March 21, 2018

Today Facebook CEO Mark Zuckerberg admitted the social media giant “made mistakes” in the Cambridge Analytica scandal and vowed to fix them. The UK-based company improperly acquired the data of some 50 million Facebook users, revealing how easily our info can be sold to third parties without our knowledge. Recode’s Kurt Wagner explains, then ProPublica’s Julia Angwin talks about the endgame: brainwashing the masses.

Transcript
Starting point is 00:00:00 Return to the Mac. Mack Weldon sells men's essentials. They believe in men's essentials. They're also into smart design, premium fabrics, and simple shopping. And let's be real here, who doesn't like simple shopping? Not only is it simple, but Mack Weldon wants to make things cheaper right now for you too. Like 20% cheaper. Go to mackweldon.com right now and get 20% off using the promo code EXPLAINED while you enjoy some of the simplest shopping you'll do all week. So I'm opening up the Facebook app on my phone. I'm clicking the little menu button in the lower right and I hit privacy shortcuts. Now I'm scrolling and I'm going to click on privacy checkup. So the first page I'm seeing is reminding me who sees my posts when I share them.
Starting point is 00:00:59 For me, I have it set to public. So that means that anyone basically on Facebook can come see my next post. I am okay with that given what I do for a living, but people might want to change that to friends only. If I hit next, it's going to show all of the personal data that I've uploaded. It looks like I have three emails in here, a phone number, a birth date, where I was born, my fiancé's name. This is kind of a lot of stuff. And then if I go to the next page, it says here are apps you've used Facebook to log into. I have about probably 20 on here. Yelp, Spotify, Venmo, Pinterest, Lyft, Uber, Bitmoji. You go hard on Bitmoji? Yeah, I go hard on Bitmoji, right? So I want to maintain that connection. Let's pretend I didn't. I could just click and sever that tie, basically, which would stop sharing that information.
Starting point is 00:01:48 Some people have like 300 apps in there, right? I mean, I saw some people tweeting just this morning that they were going through this process and, you know, it's like, oh, 1,200 apps that I've connected, which is shocking because I don't know who even downloads that many. But I guess if you do it over a number of years and you're a heavy app user and tester, it's certainly possible. Facebook knows a lot about you. It knows where you live, who you like. It knows you're obsessed with the Golden Girls. And all that information is just super, super valuable.
Starting point is 00:02:34 Facebook has announced what they're calling a comprehensive audit of Trump 2016 campaign data firm Cambridge Analytica after the shocking revelation that the organization harvested data from 50 million Facebook users, doing so, at least according to Facebook, improperly. So what happens when a shady outfit in the UK gets a hold of all your sweet Facebook data and gets caught? This is Today Explained. Okay, Kurt Wagner, senior editor covering social media for Recode. Cambridge Analytica sounds like some fire rapper from the UK, but I'm guessing this is not nearly as much fun as that.
Starting point is 00:03:14 No, no, not at all. Yeah, unfortunately. So Cambridge Analytica is a research firm, and people know about it primarily because the firm was hired by the Trump campaign during his run up to the 2016 presidential election. Steve Bannon was a board member, an advisor to Cambridge Analytica. I believe Robert Mercer, who's a big time Republican donor, was an investor in the company. And obviously, it has resurfaced in the news over the last couple days because the firm obtained the Facebook data of some 50 million Facebook users in a way that violated Facebook's terms of service. So now Facebook's come out and
Starting point is 00:03:58 suspended the research firm or the data firm from its platform and is now on this massive kind of PR campaign to try and explain why the data of 50 million Facebook users ended up in the hands of a researcher that was working with Donald Trump. In addition to, you know, obviously the data collection they were talking about with Facebook, there's been some pretty sketchy reports that have come out. So there was, you know, an undercover video that was collected that kind of showed that the firm goes above and get it on tape or on camera. Make sure that that's video recorded. These sorts of tactics are very effective. Instantly having video evidence of corruption. Right.
Starting point is 00:04:59 Getting it on the internet. So when you combine it with what we're learning about their role in this Facebook scandal, it certainly seems like they have some sketchy business practices. Okay, so I guess to understand what exactly this company was providing, we, to collect profile data from some 270,000 Facebook users so that they could build out these behavior profiles of potential voters in the election. And the point of all of this is to say, hey, as we're getting up to the 2016 election, how can we make sure that our ads and our messages are really being tailored to the right people? And so they hired out this research professor, Alexander Kogan, to do this. And he created an app called This Is Your Digital Life. He asked people to log in with their Facebook account. And so if you've ever downloaded an app on your phone and you didn't want to create your own username and password for that app, you might hit the log in with Facebook button that a lot of apps have. It's a very easy way to avoid having to go through all the steps of creating a new account for a new app.
Starting point is 00:06:14 And this isn't just like some weird British research app. This is like Tinder and Instagram. Totally. Uber, Lyft, Words with Friends. These are all the kinds of apps that people create, or I should say they use their Facebook profile to log in with. And so back in 2015 when all of this happened, the way that Facebook's terms of service and privacy policies worked were that if I signed up with this app with my Facebook account, I did not just grant the app developer my own personal information. I also granted them the personal information of all of my Facebook friends in my network. Wait, what? You sold out all your friends by saying this app could use your Facebook? You sold out all your friends. Yeah. Harsh. Basically, there was a way to opt out of this, but clearly a lot of people were unaware of that. How many accounts did this guy, Alexander Kogan, end up with? So 270,000 people voluntarily signed up using this Facebook login
Starting point is 00:07:11 that we were talking about. But through their networks, they were able to obtain data from some 50 million Facebook users. So if 270,000 people all have on average 200 friends, that's how you get to 50 million. It's not unbelievable that that number could get that high that quickly. So Kogan has all this data and how does it get to Cambridge Analytica? Right. Up until now, this whole process that I've just described is actually not against or it was not against Facebook's terms of service. This was actually, this happened as it was intended to happen, right? They used Facebook's API, people voluntarily use this Facebook login, the way that it was all written, they were voluntarily handing over all this data, all of that was actually
Starting point is 00:07:55 fine. What happened was that Kogan then sold that data to Cambridge Analytica, or I think the firm may have actually hired him to collect it. And that is where the violation happened. Because Facebook's rules state that if you're a third party developer collecting this data, you can't turn around and sell it to someone else. So that is why Facebook is now coming out and having a major problem with this saying, hey, you violated the rules. So it's kind of insane that it's totally above board to dupe people into selling out the personal information of all of their friends on their social media network. But the part where it gets problematic is when you make a little money off of trading that information. Correct. And I think Facebook realized that it was pretty crummy too, because they did change this rule a couple years ago.
Starting point is 00:08:48 I believe mid-2015, actually, that they said, well, actually, you can still use Facebook login. This still exists. A lot of people still use it. But you're only going to be handing over your own personal data. You're not going to be handing over all of your friend data. So, you know, this has changed. If you were to use the Facebook login thing with Uber today, for example, you wouldn't be giving Uber all of your friend data. So, you know, this has changed. If you were to use the Facebook login thing with Uber today, for example, you wouldn't be giving Uber all of your friend data the way that you did a few years ago. But obviously, it is now coming back to bite them that this was ever a
Starting point is 00:09:14 possibility to begin with. So how did we find out exactly that all of this was happening? Actually, Facebook found out that Kogan had given this data or sold this data to Cambridge Analytica back in 2015, and it got what it says is a written confirmation from both the company and the researcher that all of the data had been destroyed. Late last week on Friday, the New York Times and the Observer had gone to Facebook and said, hey, we are hearing that, A, this happened, and B, that the data hasn't actually been destroyed in the way that it was promised. The social media powerhouse has been reeling, seeing its worst day in four years, with a nearly 7% hit today in the company's stock, dragging the market down with it. Facebook lost something like $50 billion of its value so far this week. And Mark Zuckerberg finally broke his silence today to say, you know, my bad and offer some changes. So it sounds like we'll be hearing a lot more from him soon. Not so much for Cambridge Analytica. I believe that there are politicians both in the UK and the US who want to bring Facebook in for formal testimony. So this story is not anywhere close to being over.
Starting point is 00:10:33 And as a result, Cambridge Analytica is certainly going to be part of that narrative. At the same time, one of the things that a lot of people don't realize is that they thought Cambridge Analytica services weren't actually that good. A lot of the politicians who hired the firm for this whole, you know, purpose had then complained and said, well, we actually didn't feel like we got, you know, a very good product in exchange for our money. So it's possible that when you combine this now, these negative reviews about what they actually do with the drama of, you know, appearing as a company that stole Facebook user data. Man, it's a PR nightmare to have to come back from, that's for sure. Facebook's having a terrible, horrible, no good, very bad week. But the stakes for you and me are way higher because we're at risk of being straight up brainwashed, y'all. More in a minute.
Starting point is 00:11:42 What a great time to talk about moving all your stuff to the cloud. If you're into that sort of thing, check out the Google Cloud Platform Weekly Podcast, where Google developer advocates Melanie Warwick and Mark Mandel answer questions, get in the weeds, and talk to GCP teams, customers, and partners about best practices, from security to machine learning and more. Hear from technologists all across Google about trends and cool things happening with our technology. Learn more and subscribe to the podcast at g.co slash gcp podcast. This is Today Explained. I'm Sean Ramos from the idea of a political firm like Cambridge Analytica
Starting point is 00:12:30 buying your personal data and using it to influence you sounds scary. But if it's such a big deal, why aren't people like rioting in the streets? Why don't people care more? Well, the truth is we haven't been really given an option of caring. This is Julia Angwin. I'm a senior reporter at ProPublica. She's been trying to get people to care about this kind of thing for a long time, but she keeps running into the same wall. Like, what would the caring option be?
Starting point is 00:12:55 It would be not to use any technology, right? I mean, I do a lot of things to try to protect my data on the internet, but it makes my life kind of ridiculous. And most people, like my children, who watch me browsing the web with JavaScript turned off, things to try to protect my data on the internet, but it makes my life kind of ridiculous. And I, most people like my children who watch me browsing the web with JavaScript turned off, think to themselves, I could never live like that. That's no way to live. And it's not actually like most of the web doesn't load for me. And I make selective decisions about which parts of the page I want to allow to access my computer. That's not a normal thing. People shouldn't have to do that. I do it mostly to teach myself how hard it is and to remind myself that privacy is
Starting point is 00:13:32 not possible as an individual choice. Like we have to make some collective choices because putting the burden on the individual to lock up your data is really unrealistic. And even then it's tricky, right? Like I think about the Equifax hack, like that didn't have to do with me being reckless with my personal information. That had to do with this third party credit bureau not protecting my information at all.
Starting point is 00:13:55 And then it was like, you have to go do work now to protect your information. And I just felt like, who's to guarantee that that work that I'm gonna do is gonna protect anything. It just feels like sort of defeating. It is. I mean, you know, it's not just Equifax and Target and Facebook, right?
Starting point is 00:14:13 These are all private companies. But also, I don't know if you remember, but OMB, the federal agency, lost all the information about federal employees and all their background checks and really sensitive information that they had provided. And every one of them got some sort of email saying like, well, sorry, we lost all your sensitive files and we don't know where they are. So there's really no one who's been proven to be a good custodian of our data. And I think that's mostly because there's no incentive because they don't need to. The cost is borne by us, right? We have to set up some weird identity monitoring service, which, as you suggest, is probably
Starting point is 00:14:51 not very effective. But it's sort of like the only thing we can do. We fine people when they spill toxic chemicals. They have to pay a fine, and then they have to clean it up. And we don't have anything equivalent for data. Big headline. I don't know. Why should we care? Is there like concrete harm that can be done if a company like Cambridge Analytica can sell my Facebook profile information to
Starting point is 00:15:16 the Trump campaign or to any other political campaign? Like, what is the like in bright lights reason that people really need to pay attention to this? Yeah, I mean, that's a good question, right? A lot of people say to me all the time, Julia, why do you care about this stuff? Like, nobody's dying, like, blah, blah, blah, right? If you think about Cambridge Analytica, what they promised was that they were going to literally use all your vulnerabilities and insecurities to target you and essentially brainwash you with political messagings that preyed on your weaknesses. Wow. I mean, if they delivered that, that is pretty much harmful. Yeah. Okay. That's fair. So it's like they were trying to execute some sort of mind control. Yeah. I mean, that was their promise. Like, I don't know that they've achieved it,
Starting point is 00:16:01 but I do think it's worth thinking about the fact that there are many companies out there, not just Cambridge Analytica, whose only goal is to do that, right? That's what they want to offer. And that's something that we should all be worried about because if it is possible to do, and I think there's lots of evidence to suggest that humans are pretty easily influenced, particularly, you know, in a space like social media where, you know, trusted intermediaries can be convinced to deliver like a fake message, that we should be worried about it because there probably is someone who's going to pull it off if it wasn't them. Okay. So yeah, a company trying to tell me what to think sounds like something I should be very
Starting point is 00:16:39 worried about. Yeah. Yeah, I think so. I mean, that's what advertising is, but it was never very good at it, right? I mean but it was never very good at it, right? I mean, it was sort of good at it, but they only had 30 seconds and you were in between like TV shows and you often were in the bathroom, right? But now, you know, they actually have your friends delivering you messages and they have it looking like a real publication when it's not. I mean, the techniques are so much more sophisticated than they ever were before. It's not like Cambridge Analytica came up with some crack idea and everyone glommed on to it. This is a thing that already exists?
Starting point is 00:17:11 Oh, yeah. This absolutely exists. I mean, if you think about it, actually, Facebook itself probably was the first one out of the gate with this, with their study about emotional contagion. You know, they wanted to study whether if they made everybody's news feeds really sad, would those people get sad? And other people's news feeds were really happy. And when they had happy posts, would those people get happy? And like it largely worked, right? I mean, what was buried in that headline was that it largely worked.
Starting point is 00:17:38 And so these companies are studying how to influence people. And it might sound a little bit benign right now, but the truth is they are literally in the business of figuring out how to make people change their opinions and how to influence their thoughts. Once upon a time, we called this brainwashing. Remember those days? I'm scared right now, Julia. You should be scared. It is scary. I'm scared. I have tried, right, going off the grid and, you know, protecting my data and all that. That's no way to live, right? There's so many great things about technology. So the question is, can we mitigate the harms? And I think there probably are ways to do that. And it probably involves making some really hard decisions. Okay, let's do a quick PSA while we still have our free thought intact.
Starting point is 00:18:29 ProPublica's Julia Angwin on how to avoid a social media dystopia. Number one, let's go. Maybe these intermediaries like Facebook and Twitter should have the same responsibilities that newspaper editors did when they decided what was the top news of the day. You know, newspapers were always held liable for those decisions. responsibilities that newspaper editors did when they decided what was the top news of the day. You know, newspapers were always held liable for those decisions and held up to public scrutiny. And in fact, if they published anything untrue, they could be sued. Now, the tech platforms take no responsibility for what they publish, and they have no liability for it because Congress nicely gave them an exemption. You know, we maybe have to think about whether that exemption is warranted.
Starting point is 00:19:07 Have they earned it? Okay, so hold social media to the same standards as news media. Number two. You know, there's this idea that Facebook says, you know, look, we can take whatever ads we want. You know, even if they're coming in rubles, we don't really check what they're from. And they should know who's advertising on their platform, particularly in the context of politics, where there are lots of rules about who can buy ads and who can't. Okay. Okay. Don't do business with Russian trolls. Easy enough. Thing the third.
Starting point is 00:19:35 I do think journalists could play a role, too. Perhaps news institutions should band together and there should be a consortium of the ones that are considered real news and they should be labeled as such. And Facebook could promote them as like, here's the real news package. You can sign up for that or you can sign up for the non-real news package. Right. All right. So I guess that one's on us. Let's band together and figure out what's news and what's not. Yeah, exactly.
Starting point is 00:20:03 Julia Angwin writes about technology and the media at ProPublica, where she's a senior reporter. And since we've been talking about solutions, here's what Mark Zuckerberg said he wants to do to fix Facebook. They're going to audit any apps that look shady. They're automatically going to delete the apps that have access to your information
Starting point is 00:20:19 if you don't use them for three months straight. And they're going to add a little widget that shows you which apps can use your stuff to begin with. Today Explained avoids all the Facebook drama, but you can find us on Twitter at today underscore explained. One more note about Mack Weldon and their essentials. You got your underwear and you got your undershirts, but also your leisure wear, your hoodies, your jogging pants.
Starting point is 00:20:48 They've even got backpacks because backpacks are essential. Go to MackWeldon.com and get 20% off using the promo code EXPLAINED.
