Tomorrow - 109: Hell is Other People with Paris Martineau

Episode Date: March 23, 2018

Hello podcast listener, how's the internet treating you? Awful? Us too! But it's not all bad. This week Josh and Ryan discuss Instagram, Russian spies, and E! True Hollywood Stories before sitting down with Paris Martineau to dissect exactly how Cambridge Analytica weaponized your cat photos. We promise it ends on a hopeful high note. Sort of. Learn more about your ad choices. Visit megaphone.fm/adchoices

Transcript
Starting point is 00:00:00 Hey and welcome to Tomorrow, I'm your host Josh Topolsky. Today on the podcast we discuss Squirrel Hunting, Final, and My Abs. But first, a word from our sponsor. Is there anything that makes you feel more adult than buying art? I know personally it is when I feel most adult. Your Art Gallery makes buying art an easy, affordable experience that you can do from the comfort of your own home, or presumably any place that you feel comfortable. Better yet, they encourage all levels of artists to exhibit on their site and 90% of the fee goes directly to the artist. Plus, and I think if you care about the planet this will be interesting to you, they're eco-friendly.
Starting point is 00:01:07 So look, here's all you have to do. Go to yourartgallery.com and select your piece of art, and then use promo code tomorrow at checkout to get 25% off your order. Alright, so hey, we're back another week of tomorrow. Ryan. With tomorrow classic. How was your week? It was great.
Starting point is 00:01:29 It was, it was okay. Yesterday I watched three E true Hollywood stories. So good. Not too bad. I don't want to talk too much about your TV habits, but what were they? I mean, what were they? I watched the I Love Lucy, I Love Lucy,
Starting point is 00:01:42 Bewitched, and Gilligan's Island. All great. Kind of a blast from the past, kind of a... Oh, and I dream of Jeannie, but I watched fourth. That's really unwelive. That's not great. You know how you said over the weekend, right? Yeah.
Starting point is 00:01:56 Well, what I wanted really out of that was I wanted to know that your dreams could come true late in life, but if everything's a mess, it doesn't mean they're not coming true. Wow, okay. All right, so let me talk a little bit about my week. It's been a very busy week. There's been a lot of news. There has been, it's been snowing again in New York, which is a nightmare. It's snowing and you moved offices so truly. Yeah, oh no. All right.
Starting point is 00:02:17 And we're in a new office and we're actually in a room in the new office, which is not the podcast booth because the podcast booth has not been built yet. It's literally like the walls in the floor of the room, where we normally record our sitting inside of another larger room waiting to be built. And finally, I was like, when I was like, you just need to get like a task rabbit team in here because I just said this to her
Starting point is 00:02:39 because it's like, she's trying to do a million things. Yeah. As I said on the last episode, when is in charge of the outline. She's capable of doing 999999 things, but this one thing will give to a taskwriter. This is the one that is just where you draw the line. And yeah, so it's been a really exciting week,
Starting point is 00:02:55 because we have been in Soho for a while, for a long time. Before launch, we were in a space that was like a temporary space that we were sharing with, well, Code and Theory had some open space, the design firm that we've worked with a bunch. But now we have our own office and it's all, I think I mentioned this last week, but it's all very white. Now that we're in here, like, I mean, the actual painting of the office, it's all white-white. It's iconic.
Starting point is 00:03:23 And it's iconic, but I think it's like, okay, we definitely need some art. I've got a few things coming in here and there. As you know, I've meticulously designed the office. It's actually not. You got that Paris cat and one of the other things people put up in dorm rooms. Yeah, I got a tapestry of all the rock stars who died
Starting point is 00:03:43 when they were 27. Jim Morrison, Kurt Cobain. Amy Winehouse. Jimi Hendrix. I don't think Amy Winehouse was, was she 27? Yeah, 27 is the year, man. Yeah, that's the year. If you get past 27, you're no longer cool.
Starting point is 00:03:54 That's the new policy. That's my new policy. That's the unfortunate thing I just discovered. Yeah, sucks to be asked. Anyhow, so it's been a, it's been a, and there's been a lot of shit happening this week. I mean, in terms of the news, and we're gonna talk about that in a second, but you know, it's been a big week for Facebook and, and, and things
Starting point is 00:04:12 that I'm particularly interested in. That's actually how I fell into the E! True Hollywood Story hole, because I wanted to not be in this current day. I mean, everything you turn to is involved in this. Yeah, I mean, it is truly, we're in an age of, it's a nightmare. Well, it's a nightmare that seems to be endless, but I believe there might be an apocalyptic extinction event that will happen that would end it.
Starting point is 00:04:38 So I think we're actually in pretty good shape. We can get out of Candycrack. And I'd just take Ragnarok. I think, yeah, I think, like, it may be over, and I think that would be a relief for a lot of people. I would really take it. Anyway, let's talk about the news, and then after the news I'm going to talk to Paris Martineau, who is one of our staff writers, who writes
Starting point is 00:04:57 for The Future section of The Outline, and she's been covering a ton of the Facebook stuff that's happening, the Cambridge Analytica controversy, sort of data-gate or whatever the fuck we're calling this. So we're going to talk to Paris in a little bit. But first, let's get into a couple of news stories and then... Well, The Outline just covered this and I found it fascinating: 1% of Reddit users cause 75% of drama and problems. Yeah, I love this.
Starting point is 00:05:21 Not a shock. Not a shock. I mean, okay, so this to me was great, I mean, it's wonderful to see the data and to know that it aligns with reality. But, you know, I remember, back, you know, even at Engadget, when we'd look at the data on the commenters, and this is true of almost every publication, the commenting community, it may be a little higher now because there's just more people on the internet,
Starting point is 00:05:45 but the commenting community is like about 1% or less of the overall readership of a publication. And yet, they're the most visible and the most vocal. And it does tend towards this kind of thing, this state where a very small percentage of users create havoc for the rest of the users. Yeah. I mean, maybe this is life.
Starting point is 00:06:09 You know, I mean, the truth is, as we all know, hell is other people than well-established in New York City. In New York City. In New York City and everywhere. And in many scientific ways, I think we're all now very aware of that. But I do think it's, I do think I do think it's nice to see the data that just proves that I'm not insane and that the time I spent,
Starting point is 00:06:29 I mean, we had internal conversations every place where we had comments and it's like, okay, get a little bit bent out of shape about the commenters, but don't get too bent out of shape because they actually don't represent your readership. But people, those agitators love to think they do represent your readership. And it's hard when you want feedback
Starting point is 00:06:46 and you get feedback to say that's not real feedback. That's a tough, well right. But it's like, you know, the never read the comments, never, you know, never tweet that sort of policy. I mean, you do look at it and it's like, look, the vocal minority does not make up the total, the totality of the audience. And there's always somebody who's going to hate, there's always going to be somebody who hates what you do.
Starting point is 00:07:11 There's always going to be somebody who's mad about what you do. There's always going to be somebody who tries to fuck with what you do. And I think that the problem with one of the problems in the internet is like, we don't know how important or not important. I mean, you can look at the data obviously,
Starting point is 00:07:24 but there is a feeling you want to respond to it. Yeah. You see somebody say something, you're like, I can't tell you the number of times we've gotten an email and it's like, I'm mad, you fuck this article up and then you read their comment and you're like, oh, you just disagree with our opinion on Jordan Peterson.
Starting point is 00:07:38 Yeah. You think Jordan Peterson's really cool. You know, it's like people who are like, I just disagree with your opinion, and that's very different than getting something wrong. Well, what I was going to say is two things. One, at times in my life when I've commented, occasionally it was because I was extremely knowledgeable and passionate about a topic. I had to contribute. Sure. Most of the time I was really an
Starting point is 00:07:55 unwell college student and I had a Monster energy drink and I got mad on Gawker. I mean, getting mad online. I mean, we've done, you know, pieces on this and it's a well-documented phenomenon of the human condition. You get mad online and you've got to express yourself. I mean, listen, I've been the victim of being mad online. I've caused trouble in my own life and in the lives of my loved ones by being mad on the internet. But you know, Reddit is a pretty vast selection of different communities. And it's interesting to imagine that,
Starting point is 00:08:30 what would a Reddit be if that 1% of users that are causing so much of the stress and the strain could be eliminated from the platform? And I do think this kind of goes back to that, the way these modern tech businesses police their audience. I mean, there's actually I tweeted about the speaking of never tweeting. I tweeted about this last time. We're going to talk a lot about later on. We're going to talk a lot about Facebook and this sort of scandal that's happening with them right now. The ongoing ever evolving Facebook scandal,
Starting point is 00:08:58 which is Facebook itself. But you know, there's this, there's this quote in Mark Zuckerberg gave all these interviews where he basically sounds like a fucking robot. I mean, I'm not saying he sounds like a robot. It's like he's repeating the same for this, and this is the same thing. Yeah, no, when he said, I'm very sorry. The way he said that, I had like Westworld shivers. It was so unwell.
Starting point is 00:09:16 I mean, look, I'm not, you know, the guy has a, you know, authentic. Yeah, he's not authentic, that's very true. But I also think Mark Zuckerberg is like, in no way prepared to do what he does. I think he's a guy who got lucky with an idea and he's managed to somehow continue spinning that idea into fucking money.
Starting point is 00:09:33 I don't think he's like, I'm not saying he's not smart, he's very smart. But I don't think he's like, he studies the human condition and is like a really kind of smart. Yeah, I mean, he may be a brilliant engineer. I don't think he's a good leader. I also don't think he ever went through the wilderness. The way that like a, and I hate, don't love this comparison.
Starting point is 00:09:51 But the way that like a Steve Jobs kind of like went through it with life and shit and- I mean, Steve Jobs is also a terrible person. I was just gonna say, he was still a terrible person, but it gave him certain skills that I think applied to his job. So he could be a terrible person, but also effective in the way he does.
Starting point is 00:10:03 He could fool you into thinking he cared about something. I don't think Zuckerberg does that. No, I agree with that. And I think, um, but this thing with Reddit, it gets to, you know, this idea with Facebook where, oh, sorry, the thing I tweeted about, which is, you know, Zuckerberg's like, I feel like I should not be the person having to make these decisions and that we should really let the community decide. And it's like, listen, dude, you are really deflecting responsibility here. Like, it is your job as the guy who's in charge of this thing. And, look, I hate to say it, but the president of the United States of America,
Starting point is 00:10:42 you know, at this moment, we have a really bad one, but his job is to actually set the tone for the country. And it's why fucking racist terrorists are blowing people of color up in Austin, and why people are driving their fucking cars into people, and why we have a huge surge of neo-Nazism in this country: because we have a president who set the tone. And the tone is, hey, neo-Nazis are fucking okay by me.
Starting point is 00:11:05 And there is a direct correlation between the leadership of a business, and let's call America like a big fucking business, and what its users, or its citizens, or whatever you call them, its consumers, do, you know. And so Facebook, the idea that Mark Zuckerberg is like, well, I don't feel equipped. It's like, well, if you don't feel equipped, get the fuck out of the way. Don't do that job. You know, because there needs to be somebody to drive the bus.
Starting point is 00:11:27 But let me drive this bus. It's like it's cool. Well, like the community decided where I turned. We talked about this last week, but you know, we're the free speech platform. It's like, okay, but speech has limits, free speech has limits. No matter what, free speech has limits. And in a construct like Facebook,
Starting point is 00:11:42 it's not. It's not. It isn't just regular on the street- It's not- It's not- We're not just two people out in public speaking our minds. It is a system that is designed and can be used in very specific ways. It is a fucking game. And they're still doing the work of updating it
Starting point is 00:11:56 and shaping that system. We're asking them to weigh these ethics. And they do that. And so it's like, it's like just look at what you believe is that, you know, what are your ethics? What what you believe is the, you know, what are your ethics? What do you believe is the right thing? You know, what is the, what are the, what are the right ways for people to treat each other
Starting point is 00:12:11 in one of the wrong ways? There's some really simple answers to this, but every nation that exists, whether they have free speech or not, has set rules around where and when and how you can use that speech and what crosses the line into something that isn't free speech, but is harmful or hateful or can become violent. And like anyhow, the point is, the other thing is that this isn't just about the decisions about the community.
Starting point is 00:12:38 It's about what the decisions that the leadership of Facebook that they make, and we're gonna, again, we're gonna talk about this more, what they make behind the scenes with the information that we give them. So this Cambridge Analytica stuff is about, right? It's actually about the decisions that Facebook makes with what they do with our private data that we've given to them, not speech, but like what we do as an activity on their platform.
Starting point is 00:13:01 The thing with Reddit is interesting because it's like, I think this similar issue is true with Reddit and they've always been very slow and we've written about this recently, very, very slow to take action on communities that are abusing their rules of the road, which they do have. Yeah. They do have, they say like, we won't tolerate X, Y, and Z. And it's like, listen, you know, these that 1% of users are allowed to thrive
Starting point is 00:13:22 because the management of the business won't take responsibility for the behavior of their users and is scared to appear biased or partisan or playing, you know, that they're playing favorites. But the reality is when you own a platform, it's a for-profit platform that's supposed to be an open environment. If 75% of your problems are caused by an extremely small percentage of the users, you have a user problem that needs to be dealt with, you know? And I just think it's interesting how everybody, all of these tech companies, led by brilliant engineers who've never spent a second thinking about the ethics of their fucking platform,
Starting point is 00:13:54 are like, so slow or ignorant, slow to recognize or ignorant to understand that they have the power to change the dialogue on their platforms and they can change it for the better or the worse. I think Twitter has done this, they've seen this, I'm not saying they're doing a perfect job, but there's definitely a change in my feed based on like the settings they've introduced from what I see and what I hear and who's able to get to me and how much they're able to speak and like they've had some positive changes. Like, Twitter's a less feels to me, like a less ugly platform in the last couple of months.
Starting point is 00:14:28 Instagram does a phenomenal job of keeping people from posting porn. I don't understand why they can't do the same thing with like hate speech, Nazi stuff on Facebook. Yeah, is there Nazi stuff on Instagram? I mean, I've never really looked. I just think the reporting, both the reporting functions are different,
Starting point is 00:14:42 but also the way that information is spread is built differently. Well, it has also a different purpose. I mean, it's not like every fucking thought that comes out of your mind. It's like, here's a picture of something, which is really good. Anyhow, all right, so, but the red thing is really interesting,
Starting point is 00:14:53 I think that ultimately, you know, the answer to all of this, for part of it's gonna be fucking regulation. There's gonna start to be oversight. Yeah, we get answers to this. There's gonna be oversight of these platforms if they're big enough. But, and that's good or bad, depending on who you talk to.
Starting point is 00:15:06 But I think it will at least be nice if somebody's paying attention. But the reality is, like, these guys have to all these owners of these businesses have to grow up and say, I will accept responsibility for the behavior of the people on this platform. And I'm going to do something to change it. And it's not going to be, I'm not. And it's not even that. It's accepting responsibility for enabling that behavior. Yeah.
Starting point is 00:15:23 You don't have to say, it's my fault that Nazis exist. It's your fault that you let them write whatever they want for a decade. And it's not the view from, you can't have the view from nowhere. You have to have a view from somewhere. All right, well, speaking of taking responsibility and being effective at what you do, Cynthia Nixon is running for governor of New York. Yes.
Starting point is 00:15:39 Former star of Sex and the City, but also a community organizer and activist. So, you know, listen, this is, well, our celebrity is around. So first off, you know, like, Cynthia Nixon seems great as a person. I don't know much about her. Yeah. She's a good actress. Everything I've seen of her seems wonderful.
Starting point is 00:16:03 Yeah. I don't know if she's done any horrible crimes or committed fraud or I who in fucking knows you know because everybody's got some skeleton but so far I think she seems like a nice person and of course I enjoyed all of sex in the city. You know I have problems lots of problems with the show but she's had much better roles than highly entertaining show Well, that's what she's best known for. And look, I mean, we have a, our governor is not, I would not say beloved, I don't think that the governor of New York is.
Starting point is 00:16:34 Almost tough stuff. I mean, you know, Cuomo is like the son of a governor, you know, and he's in the family business. And I don't know if people think that he is perfect in any way. He's done some stuff that's right. He's done a lot of shit that's wrong. And, you know, would a not career politician have a better shot? Like, I'm not going to make a Trump argument here. I think it's like an educated person who understands the people of a state and understands the
Starting point is 00:17:02 laws that govern that state and the federal laws that that have to be grappled with, you know, if they're smart enough to do that, I'm open to the idea of somebody running whether or not their background is political or otherwise. I do think having experience in the political arena, I mean, yes, community organizes a big part of it, but I think having some, you know, some experience with it is useful. I don't necessarily mean, I don't think that means it will make you more successful. I think like, you know, the idea of a try, it's interesting because people will obviously make the Trump comparisons. And I'm, and by the way, I'm, I'm not saying like, yeah, this is perfectly great news,
Starting point is 00:17:38 but it's also not necessarily bad news, but Trump, I think the hope for people who were like, he's actually really a Democrat at heart was always, well, he really will come into this thing with a view that's like not, he's not going to be mired in the bullshit of Washington. He's going to come into it with a view as a businessman and as a moderate and as a guy who's like, you know, gets along with lots of different people and sure, you know, that's possible. But like, I don't think Trump's a good guy.
Starting point is 00:18:06 And I don't think that you're carrying Washington's bullshit around with you. I think you take the job and you realize, oh, it's surrounded by bulls. And ultimately, and ultimately, it is about who you are. It's about who you are as a person. Like, there is a celebrity like Trump who could become president and do a fucking amazing job
Starting point is 00:18:21 because they're thoughtful and they're intelligent and they're carrying and they're caring and they have great reasoning abilities and they have a cool under pressure. They are famous people who've gotten rich and successful for reasons that would be highly applicable to something like the presidency or to being the governor of New York.
Starting point is 00:18:40 Now look, she's gonna have a hard road. She's got her celebrity, which is a big help, but then there's also a lot of pieces that fit. I feel like it's just a neutral thing. I feel like, yeah, it's a big help, but don't you feel like being a celebrity is not necessarily the thing that explains a lot about a person, like we believe that it is.
Starting point is 00:18:54 I don't know, I mean, celebrities are pretty big. It's a pretty big. She seems like a completely different person than him. So is that a badge on Girl Scout? Yes. Oh, I mean, is that like part of her life? Yeah, but there is a, there is a, a unifying, there tends to be a unifying sort of thread with celebrity, which is like,
Starting point is 00:19:13 you're making a decision, a conscious decision. No, I get it. She loves acting. She loves the art of the craft of acting, but acting is, is, is by nature a very public, a non-private, very public display of your talents. And that is, does say something about, I mean, does that say something about us that we're behind-catching? Like Donald Trump, like, says something about that you want to put in. I'm making a choice that like, I'm a fucking loud mouth and I want people to hear my bullshit.
Starting point is 00:19:36 And like, I'm definitely, like, desirous of people who go, Josh, that's so smart and funny and interesting, I really enjoyed it. And I'm, like, extra sensitive, maybe more so than most people, about people who go, Josh, you're a fucking idiot.
Starting point is 00:19:48 Yeah. Although I've heard it enough now, I feel like I really want to come to terms with that and you're really dead. But, but so like, you know, yeah, there is a thread there where it's like you want the spotlight on you to some degree. But like, so does every politician.
Starting point is 00:19:59 Yeah. Yeah, that's the thing. It's very rare for a politician to go... I mean, nowadays, I can't speak to the previous generations, but it's clear that being a politician in America in the 21st century is about being a celebrity. It's about, you need to go on TV and perform,
Starting point is 00:20:13 you need to go, look, the stump speech, which is the cornerstone of American politics, is a performance. And presidents, I mean, presidents have become... I mean, of course it makes sense that Trump is the president, because Obama was the funniest, most interesting. I mean, Clinton, Reagan.
Starting point is 00:20:28 Yeah, I mean, we've had, literally, okay, an actor, then, you know, well, Bush, forget about, you know, Bush Senior. I mean, Bush Jr., also a complete fucker. But we've had some presidents recently who have been, I mean, I just said this about Obama, like unbelievably, unbelievably, just fucking entertaining. Yeah. I mean, just the most interesting guy to listen to, amazing sense of timing, like when it came to humor, you know, like, could basically roll with comedians. That's very unusual, but like makes sense in the context of American politics and how we spend money on politics, how we view politics, how we think about politicians. So anyhow, long and
Starting point is 00:21:08 short of it is, you know, I'm excited to see what her campaign develops into. I'd like to see, I want to know at the end of the day, what are her policies? What are her policies on schools, on guns, on social services? Like, how is she going to help the people of New York? If she's got a plan and surrounds herself with good people, and she is intelligent and sensitive to the needs of people in New York, which it seems like she is, why not? Why the fuck not?
Starting point is 00:21:32 But like, you've got to be all of those things. You can't just pretend, you know, which I think Trump did at some point, you know, and then it was very quickly clear that there is no sensitivity, no pretending, with a guy like that. I think she's different. I mean, there is no comparison between the two in terms of, like, personality. The only comparison is they're both celebrities, basically, and that's not much of a comparison, because there's lots of different celebrities. Ted Nugent's a fucking celebrity.
Starting point is 00:21:54 In other related celebrity politicians, yeah, Putin, in two stories that were treated as two stories, but to me were one story. Putin poisoned a former spy in the UK. He didn't personally. He like he took them out the dinner. He got a task rabbit. I love what you've been doing here. You have a drink. And the next thing you know they died. It was very strange. I mean he personally slipped them a Mickey. No, Putin ordered. We believe ordered the yeah, he got a task rabbit. they built his podcast booth and then killed
Starting point is 00:22:25 someone. But yeah, Putin ordered, we assume, the assassination of multiple people, right, on UK soil, not just one. Yeah, but it was Sergei Skripal who was the former spy. Not the type of analysis. I don't know. We'll allow that. In any event, he did that, and then Trump went around with his do not congratulate instructions and said, congratulations.
Starting point is 00:22:52 The electoral victory. The call had to do also with the fact that we will probably get together in the not too distant future so that we can discuss arms, we can discuss the arms race.
Starting point is 00:23:17 You know, first of all, that's an unincredible statement on there's so many things going on in it. First off, it must be said, I mean, Putin recently was like, we have this new nuclear missile that can like, just, I think they showed blowing up Florida. I think it actually, it may show the nuclear weapon, the nuclear missile being dropped. It could be like close to where Mar-a-Lago is,
Starting point is 00:23:39 I'm not really sure, but, you know, they just had this thing where Russia's like, we can just fucking destroy anything with like laser precision. But like, you know, it's incredible to me, first off, that this man is the president of the United States of America. It truly boggles the mind. I'm thrilled that it still hasn't settled in for me.
Starting point is 00:24:00 Yeah, I was in the Natural History Museum the other day, and I was looking at a basket weaving exhibit, and I'm standing there and we're all very quietly looking at how they weaved baskets in Asia in like the 14th century or something, and I just have this urge to scream at everyone, like, Donald Trump is president! What are we doing? Yeah, what is happening? Why are we looking at baskets? Why are we all acting like... just get out of here? We could do something, we have enough bodies here. We could throw ourselves at the light out.
Starting point is 00:24:25 Yeah, I don't think that would help. Honestly, that was my urge, but I was so thankful that I still have that urge because yeah, you got it. Yeah, resist. You got to keep your eyes out. But here's the thing, you know, I, the statement, I mean, first off, I love this idea that his advisors are and this has been reported widely reported.
Starting point is 00:24:41 His advisors are: do not congratulate. They wrote him a note, all caps, DO NOT CONGRATULATE. We could not stand in high spirits. I'm there like, they have one of those things that scrolls text, like delis have, where the fucking LED text scrolls past. There's like a guy walking through the Oval Office
Starting point is 00:24:57 that says do not congratulate. They feel real life kind of run. Outside his window, they flew planes that did the smoke, the fucking cloud writing, like, DO NOT CONGRATULATE. And Trump's like, congratulations on your victory, great job, you did it! You did it again! Reality TV twist! You gotta congratulate, you're crushing it. You're crushing it. Anyhow. I know, that's my Trump, I guess.
Starting point is 00:25:21 Okay, so like okay, so it's like one, one, it's like, of course he, fuck it. Of course he did, of course he ignored all the people around him. And you know, listen, the thing that is, to me it's like, okay, okay, yeah, there's it, everyone's like, there's a deep state, it's all, you know, all these war huligans are like, well, we don't know.
Starting point is 00:25:40 We gotta, yeah, yeah, there's no collusion, there's no, it's like, fuck you. Like, we're not dumb. Like, I'm not saying that he's like a puppet of the Russians, okay, I'm not saying that there's some mass of campaign that's going on. I don't know all the details on it. I'm saying that, you're not saying that.
Starting point is 00:25:54 Okay, but what I do know is this, the guy is Putin's bitch. Yeah. He's Putin's bitch, it's clear, okay. I mean, that's not a nice thing to say, but like, I don't really know. I don't know another way to like a dog, like a dog. I mean, a bitch like a dog, okay? I mean, that's not a nice thing to say, but like, I don't really know. I don't know another way to, like a dog. Like a dog.
Starting point is 00:26:06 I mean, a bitch like a dog, okay? He is, he is lapping up whatever Putin is doing. He is, he has avoided saying anything negative about it. He's, he's, you know, Tillerson says something negative. He gets fired. I don't know if there's a connection, but it seems weird. He will, you know, sanctions, we're supposed to,
Starting point is 00:26:20 we're supposed to level sanctions against Russia. He's hesitant to do it. In speeches, speech after speech in debate after debate with Hillary Clinton. He would literally not say, yeah, Putin's a bad guy. We got to keep our eye on him. He all you have to say is, he doesn't have to say I hate Vladimir Putin. He can just say, listen, Russia's done some pretty bad stuff. Putin's a guy that we got to keep our eye on. We can't, you know, we can't play ball with these people. We got to make sure that we're, you know, on guard because, you know, Russia has been a bad, they've
Starting point is 00:26:47 been bad actors recently. Nothing. No. He's like, why would not want to be friends with them? It's like, why wouldn't you? I don't know. He fucking murders people with poison that he doesn't like. That's one idea.
Starting point is 00:26:57 He invades countries. You know, he's a, he just introduced a new nuclear weapon that can blow up Florida really well. Like, there's all kinds of reasons why you wouldn't want fucking play ball with them. I'm not saying you have to ice them out, but you could at least act like you are going to be tough on them. Yeah. And he has it.
Starting point is 00:27:14 And it's like, listen, by the way, I mean, Trump doesn't act like he's going to be tough on anybody who's kind of a fucking psychotic dictator. I mean, Trump's thing is like, he's into psychotic dictators. It's like, cool. I mean, everything Trump does is like, I can't believe how fucking obvious it is. You're like, you can't be this stupid, right? Like, you can't actually be this basic,
Starting point is 00:27:32 but then it's always like, he is, and his administration is, as basic as you think. You're like, you can't really have your fucking lawyer, like, take out a loan against his mortgage to pay, I don't know, the fucking porn star
Starting point is 00:28:04 who you're sleeping with, out of this loan that he took. It's like, how could you be this dumb? You can't figure out another way to do it, get a third party, get a check from a third party or something, didn't sign the NDA. Or like, let's just not do this in the first place. Like, it's good enough to get involved. I mean, but obviously, I'm just saying, like, okay, he's a bad guy.
Starting point is 00:28:19 Fine, is he a savvy guy? Is he a savvy bad guy? I mean, I wrote this thing. I wrote this thing many, many months ago when the deep state stuff, it was actually might have been early in 2017. When the deep state stuff started coming up, I think it was called fuck the deep state.
Starting point is 00:28:32 And it was like, it was by the way, Devon Nunez and all these people who were like, well, we don't know, there could be a plot against Trump. And this is at the beginning of the investigation, okay? I mean, this stuff is not hiding, it is there. It's like out there. And they're like, okay, what's more likely? There's an ongoing, massive, you know,
Starting point is 00:28:49 black ops project to undermine Trump's presidency by all of the intelligence agencies in America or Trump's a fucking, like bad guy who's sloppy, who surrounds himself as sloppy, basically like mobsters and like sleazy, like snake oil salesman. And he just fucked up a bunch of times. Like that to me is Trump. Yeah.
Starting point is 00:29:13 He's just a fuck up. He's just a bad guy who fucks up. Like his sons are bad guys who fuck up. I mean, look at Donald Trump Jr. And he had the point is that I thought this is a, it's priceless. It's just another great, it's very minor in the grand scheme of, of Trump's. I feel like to me, it's like I saw you, you used to see for decades other world leaders would fall into line with what the US was doing.
Starting point is 00:29:34 And not to call them our bitch, but that's essentially was the dynamic. And it was partially reassuring because it's us and we were okay. Right. No, I mean, we weren't always like the US screwed up. It's weird that that that dynamic exists. But what the most unsettling thing is I had never seen the US fall into line with some other country just being like, never gonna do something awful.
Starting point is 00:29:54 And we're like, okay, great. Yeah, it's crazy. It was like, I was like watching kind of like Iraq or something in reverse. No, but I mean, but well, but exactly. I mean, Iraq is a great example of a place where we as leaders, we are not always perfectly in a way. Super far from it, far from it,
Starting point is 00:30:10 but it is unusual to see us, to see the country lay down in a weird way. And most importantly, and I think this is the key piece to watch every Republican. For I understand Trump, they may have something on Trump, Trump's a fuck up, Trump's a bad guy, whatever. Like he's not a professional politician, he doesn't know what he's doing, that's all true. But like to see these career Republicans, it really gives you a sense of who they are. It really tells you who they are and who they are, are the
Starting point is 00:30:36 guys who fucking get in line behind bad dudes. And like that is like, look at any... But they've been telling us forever that greed is good, selfishness is good, self-interested like capitalism is the way, like whatever, and we're shocked that there are philosophy as to save themselves. But even deeper than their bad policies preceding this is the fact that their policies don't exist. Is that when it comes to morals or ethics
Starting point is 00:31:00 or, you know, where your North Star is, they don't have one. Their North Star is: how do I fucking stay in power? How do I make sure I can try to get what I want, which is almost always financial, by following whoever will keep me in my position. And so all of these guys, Paul Ryan, Ted Cruz,
Starting point is 00:31:15 all of the fucking guys that Trump took a shit on during the campaign have fallen in line with him and are just his little fucking lap dogs. I mean, as Trump is to Putin, they are to Trump. And to me, it's like, that says, if you go out and vote, if you're even a dyed-in-the-wool Republican who's been standing up for the government, you should want more from your own. Look at this Roy Moore, the Roy Moore election.
Starting point is 00:31:37 Like, this guy's a fucking scumbag. Does he represent the Republican Christian values, as I've come to understand them, that are the kind of cornerstone of that party? Not at all. But they've gotten in line because they're so fucking desperate to stay in power. And like, by the way, these are the people who are like, oh, they're gassing people, well, that's bad,
Starting point is 00:31:54 but like, you know, I'll stay in power for a little bit longer. Like, this is how it happens. It is, I'm not saying it's happening. And the same we're going, heading toward a fascist dictatorship here, but how it happens is little by little, people agree with a bad person and don't fucking stand up in the way of it. And like, little by little, you lose all the shit
Starting point is 00:32:10 that you fucking are. Yeah, it's just following orders. And all these guys are just following orders. Every fucking one of them that sold themselves out, fucking Marco Rubio and all these motherfuckers, same with things like the NRA. They have sold out to these fucking financial interests in this country.
Starting point is 00:32:27 And they should be treated as such when people go to the fucking polls, when people go to vote. Like, look, you wanna vote for people who are bought and paid for by the NRA, and bought and paid for by people under the control of, under the command of, in direct, you know, sort of line of sight of people like Putin? Like, yeah, sure, do it. But like, that's not the country that I'm interested in having.
Starting point is 00:32:47 And hopefully there are a lot of other people feel the same way. And let's say the Democrats are all that great. They'll a lot of fucking bad Democrats, but man are they. There's isn't an equivalent. But there isn't. There is no, there is no. We're talking about like a rest broken bicycle and kind of bad call. No, it's like, it's like, no, we're talking about a, a kind of bad plane.
Starting point is 00:33:04 It's like a, it's like a's like no, we're talking about a kind of bad plane. It's like a, it's like a apartment building that is completely on fire and one that like a needs, lost rent control. Needs like a railing fixed, you know, it's like there's not even a fucking comparison, you know, at the end of the day, there are some like not great Democrats, but the core beliefs of the Democratic Party exists. Just exist and are actually just fucking better than what the Republicans believe. So can we should take a quick break and then get to our guest, Paris? So we're gonna take a pause here, listen to some fantastic ads that I know you're gonna
Starting point is 00:33:38 What can your data tell you? With Google Cloud Platform, use machine learning at scale to build better products. Google's Cloud AI provides modern machine learning services with pre-trained models and a service to generate your own tailored models. The platform is now available as a cloud service to bring unmatched scale and speed to your business applications, and it predicts so your business can thrive. To find out more about the Google Cloud Platform, go to g.co slash getcloudai. So this was a major breach of trust, and I'm really sorry that this happened.
Starting point is 00:34:54 We have a basic responsibility to protect people's data, and if we can't do that, then we don't deserve to have the opportunity to serve people. So our responsibility now is to make sure that this doesn't happen again. And there are a few basic things that I think we need to do to ensure that. One is making sure that developers like Alexander Kogan, who got access to a lot of information
Starting point is 00:35:20 and then improperly used it, just don't get access to as much information going forward. So we are doing a set of things to restrict the amount of access that developers can get going forward. But the other is we need to make sure that there aren't any other Cambridge Analytica is out there, right, or folks who have improperly accessed data. So we're going to go now and investigate every app that has access to a large amount of information from before we lock down our platform. And if we detect any suspicious activity, we're going to do
Starting point is 00:35:54 a full forensic audit. So you just heard a clip of Mark Zuckerberg talking about the Cambridge Analytica data-gate, whatever we're calling it. I'm here with Paris Martineau, who is a staff writer at The Outline, who covers the world of the future, but as of late has been covering what is happening with Facebook. Paris, thank you for joining me from mere steps away. It was a long journey to get here. Yeah, well worth it. Well, I really appreciate it.
Starting point is 00:36:25 Okay. We just heard Zuckerberg talking about, he did like four interviews. All of them are essentially like word jumble. He took all the same words and put them in different arrangements, but largely the same set of apologies and we should have done this. I think it's something interesting in this clip here. He says, we're really sorry. You should be able to trust us.
Starting point is 00:36:50 We're going to get into this. Mark Zuckerberg wrote a post preceding these interviews on Facebook. One of his legendary, like, 3,000-word Facebook posts. I think it was 937 words, actually. But yeah, it's like a long-winded Facebook post and he was like, in 2007 we introduced the social graph, or the app graph, where you could get access to the data. And then there's some stuff, some intervening stuff
Starting point is 00:37:14 he describes happening. It's like in 2014, to stop abuse with the tool we did X, Y and Z. And it's like, what the fuck was going on in this intervening seven years? I mean, I assume you people... Yeah, I mean, seven years is a long time. I think that was the part that I found so funny,
Starting point is 00:37:31 but this is it seemed like he was gonna do this long-winded intro to apologizing for what happened, but instead he skipped over seven years. Yeah, yeah. No, it's like, it's so interesting, but it's like, let's be honest. It's Zuckerberg, if they, if no one had ever raised a flag on this, do you think we would have ever heard from Facebook about Cambridge Analytica or a misuse of data?
Starting point is 00:37:54 I mean, I think the thing that I found so surprising about this was that Facebook knew about this in 2015, and, I mean, of course there had been some reports written about this, but Facebook mostly kept it under wraps. And if Facebook knew about a data breach of this size in 2015, why did it take us three years to hear about it, if Mark is really sorry? So actually, let's back up, that's interesting, I want to talk more about that. But let's actually back up: if anybody's listening to this and they don't know, and I know it's quite difficult, can you give us the minute-long version of what the fuck we're even talking about right now with Cambridge Analytica? Yeah. What are we calling it? Does this have a name yet, by the way? Breach-gate. Data-gate. Fake-
Starting point is 00:38:39 book-gate. Facegate. Facegate. All right, it's Facegate. Facegate is a, it's a long and convoluted tale, but I guess the short version of it is there's this company called Cambridge Analytica, which was revealed more or less over the weekend, news broke that... where do you even start with this, honestly? Yeah, that's what I'm trying... So, I mean, there are so many. It's one of those stories that has been ongoing for multiple years, so it's hard to find a way in.
Starting point is 00:39:12 But I guess the most important part about it is we knew that Trump's team during the 2016 election had hired this company called Cambridge Analytica, which provided them with data that they use to target Facebook ads and Brad Parscale, who was the digital manager for Trump's campaign at the time, and is now his 2020 campaign manager. A lot of them said Cambridge Analytica was great, helped them. We've slowly, over the course of the the last year to learn that Cambridge Analytica got
Starting point is 00:39:48 from private Facebook data that they obtained through this guy named Alexander Kogan, who is a researcher at Cambridge University, which is not related to Cambridge Analytica at all. It's not related? It's completely fucking confusing. I know, it's all weird. This dude, Kogan, he's a researcher, had access to Facebook's private data for research purposes, was ostensibly, like, using it to do this personality app. And so he got access to all this sensitive information, like everything that's in your profile, like status updates, all of that.
Starting point is 00:40:29 Right, essentially your behavior. Your behavior obviously. And all of your friends behavior, which is the like interesting part. So he had all this data. He was just supposed to use it for his personal stuff that Facebook had let him. But instead he gave it to Cambridge Analytica,
Starting point is 00:40:42 which used it to compose these person at like these alleged psychographic profiles, right? So essentially, so essentially would have, so just so I understand this. So at some point there's a quiz on Facebook. So like take this political quiz or something like that. And people are like, hey, I'll opt into this. And when you opt into it, it opens up at that time with the way Facebook, the way they're open graph worked or their social graph. It would open up essentially access to like every interaction, everything you did on Facebook, every bit of data that, or at least a lot of the data that you had sort of like within
Starting point is 00:41:14 the universe of what Facebook saw you do. But also your friends and I don't understand that. Yeah, so I mean whenever you, like one of those stupid quizzes that you take on Facebook, I have to give it. Yeah, I don't take quizzes on Facebook. We have to give it access. I haven't done any of those. I know they're gonna suck your data out. I mean, you don't tell me,
Starting point is 00:41:28 the second you fucking give it access, you know they're gonna suck your data out of there. Oh yeah, I just assume that all of my data is constantly being sucked by everybody. Right, exactly. Two separate Twitter accounts just to allow that. That's smart, I'd like to find the second one. So whenever you give an app or a quiz or something
Starting point is 00:41:42 like this access to your Facebook profile to take this quiz, they really get, well, I mean, in this case, most of the time they're going to be able to get all of your profile information, which is like your posts, your email address, all of your biographical information, but then they also can get all of that information for all of your friends. So what Kogan and Cambridge Analytica basically did is they, through a variety of means, got some people to sign up for this app on Facebook, use it, take a quiz that was supposed to tell you about your personality. But when those people signed up, Kogan and Cambridge Analytica got all of the
Starting point is 00:42:24 sign-ups' information, as well as all of their friends'. Right. Information. So even though they had only a couple... they actually ended up getting like 30 or 50 million people's information.
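(For anyone who wants a concrete picture of the mechanism Paris is describing, here is a rough, illustrative sketch of what an app-side request looked like under Facebook's old, pre-lockdown Graph API permission model, where one consenting user's access token could also expose their friends' data. This is a sketch under stated assumptions only: the token value, field names, and exact permission behavior are illustrative, not a reproduction of Kogan's actual app.)

```python
# Illustrative sketch only: the *shape* of a legacy-style Graph API pull in which
# one consenting user's token also exposed friend data. Token, fields, and the
# old permission behavior are assumptions for the example, not Kogan's real code.
import requests

GRAPH = "https://graph.facebook.com"
USER_TOKEN = "EAAB..."  # hypothetical token granted when one person installed the quiz app


def fetch_profile_and_friends(token: str) -> dict:
    """Pull the consenting user's profile fields, then walk their friend list."""
    me = requests.get(
        f"{GRAPH}/me",
        params={"fields": "id,name,likes,location", "access_token": token},
    ).json()

    # Under the old model, a friends edge could return data about people who
    # never installed the app themselves -- which is how a small number of
    # quiz-takers fanned out into tens of millions of profiles.
    friends = requests.get(
        f"{GRAPH}/me/friends",
        params={"fields": "id,name,likes", "access_token": token},
    ).json().get("data", [])

    return {"user": me, "friends": friends}
```

One install could therefore fan out into hundreds of profiles' worth of data, which is the multiplier effect being discussed here.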
Starting point is 00:42:34 And then they took it, and I think maybe most egregious is that they took the data. The quiz is one thing, you're like, here's a finite thing that I'm doing, I'm taking a quiz, but they took the data, and Cambridge Analytica then used it and, I assume, combined it with other data
Starting point is 00:42:48 that they would be scraping, like that would be matching. Like voter files, voter files, and things like that. And they built these like, psychographic profiles of the users and then used those profiles to target users. It gave that to the Trump organization
Starting point is 00:43:01 or gave that to the campaign to target, or we actually don't know who they give it to do. Yeah, we don't, the thing is they say that it was used for with the Trump campaign to target ads among other things. But who knows? So it could have been given to Macedonian hackers, well. Because I mean, the thing is if you have
Starting point is 00:43:19 30 to 50 million people's psychographic data, I mean, the data isn't exactly the most important part of this. It's the fact that they used that data to build a model or an algorithm that could then predict what political issues you are attracted to the most and how you can be swayed as a voter.
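(To make that point concrete, here is a toy sketch of the general idea, not of Cambridge Analytica's actual pipeline: once profile-derived features have been fitted into a model, the model alone can score new people, so deleting the original rows changes very little. The feature names, labels, and numbers below are invented purely for illustration.)

```python
# Toy sketch of the general idea, not the firm's actual method: fit a model on
# profile-derived features; afterwards the fitted model alone scores new people.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical features per person: [liked_gun_pages, liked_immigration_pages,
# posts_per_week, age_bucket] -- purely made up for illustration.
X_train = np.array([
    [1, 1, 4, 3],
    [0, 1, 9, 2],
    [0, 0, 2, 1],
    [1, 0, 7, 4],
])
# Hypothetical label: did a fear-based ad measurably move this person?
y_train = np.array([1, 1, 0, 0])

model = LogisticRegression().fit(X_train, y_train)

# At targeting time the training rows can be deleted; the fitted coefficients
# are enough to rank fresh profiles by predicted "persuadability".
new_voters = np.array([[1, 1, 6, 3], [0, 0, 1, 2]])
print(model.predict_proba(new_voters)[:, 1])  # higher = predicted more swayable
```

This is why, as discussed below, "did they delete the data" is arguably the wrong question: the predictive model outlives the raw rows.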
Starting point is 00:43:40 Yeah, when this gets right down to it, what we think the result of this is, is that, like, I'm not saying Trump, but let's just say for the sake of argument, the Trump campaign was able to find voters in a specific region where they saw weaknesses in the Democrats' argument,
Starting point is 00:44:01 or weaknesses in those voters where they were like, these people are very worried about immigration in this area. And if you, if these people are afraid of immigrants taking their jobs, they will respond very strongly to it. And they basically crafted ads and who knows what else around this idea that these voters were susceptible to this messaging.
Starting point is 00:44:21 And in fact, Cambridge Analytica, which they've now been very careful to try to scrub this from all of their accounts, but Cambridge Analytica boasted on multiple occasions that their data and the way that they were able to use data could change the behavior of targeted people. Yeah. Right? Like, they're actually boasted like, what we do is change people's behavior using data. Yeah, and I can't remember,
Starting point is 00:44:45 but some director or high-ranking member of Cambridge Analytica, in an undercover video that was put out by Channel 4 News. Was it Nix, the guy Nix? I think it may have been Nix, but I'm not entirely sure, but one of them said, like, oh, the best way to run,
Starting point is 00:45:01 the best way to win a campaign isn't on the facts, but on the emotions. And so basically they're able to, or so they say, figure out the emotions and fears of these voters and target them based on that. Right. And I think the interesting thing here is, you know, we don't even know the depth to which this actually had an impact. We don't know how much it was used.
Starting point is 00:45:22 We don't know what's still being used because the data is now just free. It's out there. I mean, Mark Zuckerberg talks in these many interviews about doing this audit where they're gonna go out and figure out who has all this data. But Cambridge Analytica was just one place that might have received data. There could be thousands.
Starting point is 00:45:38 I think he actually said thousands in one of the interviews of other app makers who might have gotten similar amounts of data, right? Yeah, he said thousands. And the thing is that, I don't know, Facebook and a lot of lawmakers and people online are all calling, oh, we are really worried about this data. We want to make sure they deleted it. But the data, whether or not it's been deleted, honestly, isn't even, shouldn't even be the question right now. Right. Right. Because it's because if these companies have done what we think they've done, they've made models and algorithms out of this data that can then predict these
Starting point is 00:46:11 trends. So the data doesn't need to be used anymore. They can delete the data in front of you and still have the same worrying, predictive election changing power. And correct me if I'm wrong, but the data, this data is not like unspecific anonymized data. This is like a person's name has a psychographic profile attached to them that says, I'm not gonna know the exact machinations,
Starting point is 00:46:34 but in essence it is, Josh Topolsky will respond very badly. You know, if you tell him that there's a, you know, there's a, the squirrel population is rising in his neighborhood, he'll react very strongly to that. Yeah, right. That sort of information. By the way, if anybody knows that the squirrel population
Starting point is 00:46:50 is rising, am I gonna have to? Don't tell him. No, don't, please don't tell me, because I'm gonna need to do some varmint hunting, you know? Is that what they call it? I don't know. Are they varmints? Is it varmint?
Starting point is 00:47:02 Is that the word? No, vermin? No, vermin is, that's a rat, right? No, vermin is a type of rodent, I believe, but a varmint is like a rapscallion on the farm. Ryan, do you know the answer to this? I think what you're talking about is a varmint. At my high school once, a squirrel hunter came onto the grounds. Yeah, because we were by protected forest land, and our whole high school shut down for a
Starting point is 00:47:26 week because there was a search on for a gunman. Wow. And the gunman was a squirrel hunter. Okay, this is a lot more information than I thought I'd be getting, but I will say squirrel hunting is a hobby that people have. Yeah, I think it's fucked up. I love squirrels. But I will say this, I did not think I'd be getting a story from you. That was not planned. I had no prior knowledge of that story and I'm extremely disturbed by it. But anyhow, the point is, there's a profile somewhere that says Josh Topolsky loves and hates this, or will respond to that. Maybe not. I didn't sign up for the fucking quiz, because, you know, I don't take quizzes on Facebook, as I've previously stated. Smart.
Starting point is 00:47:56 So, so what is the issue here? Like, it's not that Cambridge Analytica took the data. It's that Facebook had that data to give in the first place. I think both are issues. I think the fact that Facebook is one of these sort of sites where all this data exists, and it exists for people to take. And for who knows how many years, developers and advertisers,
Starting point is 00:48:18 and probably some unknown third group of people could access this data if they wanted to. I mean, there was a long period of time where Facebook was trying to build a network of advertisers and developers, so essentially, they let these people have access to whatever sort of data they want.
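(Another brief aside, since "they let these people have access" can sound abstract: below is a minimal, hypothetical sketch of the kind of calls a third-party quiz app could make in the pre-2015 Graph API era, when friend-level permissions such as friends_likes still existed. The token, the IDs, the version path, and the exact fields are placeholders and assumptions for illustration; these endpoints no longer behave this way.)

```python
# Hypothetical sketch of pre-2015, Graph API v1.0 style requests.
# ACCESS_TOKEN is a placeholder; the friend-data permissions shown here
# were removed by Facebook and these calls will not work today.
import requests

ACCESS_TOKEN = "PLACEHOLDER_USER_ACCESS_TOKEN"
BASE = "https://graph.facebook.com/v1.0"

def get(path, **params):
    """Helper: GET a Graph API path with the app user's token."""
    params["access_token"] = ACCESS_TOKEN
    resp = requests.get(f"{BASE}/{path}", params=params, timeout=10)
    resp.raise_for_status()
    return resp.json()

# 1. The quiz-taker's own profile and likes (granted via user permissions).
me = get("me", fields="id,name,location")
my_likes = get("me/likes")

# 2. The quiz-taker's friend list...
friends = get("me/friends")

# 3. ...and, under old friend-level permissions like friends_likes, data
#    about each friend who never installed the app themselves.
for friend in friends.get("data", []):
    friend_likes = get(f"{friend['id']}/likes")
    # In the scenario described above, rows like this were then matched
    # against voter files to build per-person profiles.
    print(friend["name"], len(friend_likes.get("data", [])))
```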
Starting point is 00:48:36 Okay, so we did an article on this topic, I think it was yesterday, but like, they're not the only one, Facebook is not the only service that collects data like this on its users. Oh no, basically every social media app or site or website you go on is collecting private information about you and selling it in some way or sharing it with third parties. Oh, okay, so let's talk about Google for instance.
Starting point is 00:49:01 Like I'm a Google user, you're a Google user. I mean, everybody I know has a Gmail account. Most people use their Google account as a kind of primary, their primary sort of center of the internet. But Google doesn't, I mean, Google is not giving the data away to partners. Are they? I mean, explain to me, like, I mean, maybe you know this and maybe you don't. But is it the same for Twitter? I mean, Twitter doesn't have as much data
Starting point is 00:49:30 because you're not doing as much on Twitter, I would assume. But this to me feels like a particularly kind of gross negligence with the data where it's like, yes, we collect a lot of data on you. OK, I get it. For Facebook's purposes, they need the data, right? They want to know more about their users so they can put better ads in front of you
Starting point is 00:49:47 and connect you to people they think you should be connected to and all the other bullshit that Facebook does. And obviously, say to advertisers, hey, we can target really well. You want to advertise to this particular group, this demographic, we have that demographic. You can slice and dice it however you want. Internally, that makes a lot of sense for Facebook
Starting point is 00:50:03 and I've come to understand and believe that that is exactly what they are doing and what their business thrives on. But it's like their lack of ownership of the data, to me, seems like the thing that is the fucked up part. But Google doesn't, like, do we think there is a Cambridge Analytica scenario with, like, a Google that is in existence
Starting point is 00:50:23 or that is going to happen? I don't think so just because Google doesn't have a similar developer platform as Facebook does. But I mean, all of these companies, I mean, when we're using services that if you're using something that's free, there's obviously money being made there. And if you're not paying the company, then you're the product. So then somebody is like paying the company for you or for access to you.
Starting point is 00:50:52 You know? I mean, I guess I'm asking, I know the answer a little bit. I mean, I agree with what you're saying. I mean, I think that, but the question becomes, like, you know, what is trust supposed to look like in the digital age, and I don't know if you have the same question? I mean, I think that the issue is that technology and all these developments are happening so quickly, it is,
Starting point is 00:51:16 I won't even say impossible, but legislators, like, lawmakers aren't keeping up with it, and we aren't predicting these sort of incidents. Right. This thing with Facebook, I mean, Zuckerberg, he made references several times in his interviews
Starting point is 00:51:35 to, I couldn't have imagined these in my dorm room in 2004. But you should have. But that's, I mean. Right. But in 2007, when you introduced this thing, was there no ethicist or anybody you talked to and said like, okay, well, what's the right way
Starting point is 00:51:47 to handle this data, or how much of it should we allow people to have, because it's like people's personal habits, their life, their friends, their family, whatever. Like, I think that there should be researchers and ethicists that are on staff at all these tech companies, or better yet in some centralized regulatory hub that can check these companies. In Europe right now, they have, I can't remember exactly the name of it, but they have stricter data regulations and data privacy and protection laws that are coming into effect in May,
Starting point is 00:52:18 which is part of the reason why there's this professor, David Carroll, who is suing Cambridge Analytica for basically his right to figure out what data they have on him, and all Americans' right to figure that out. He's suing, suing them in Europe? Yes, because technically Cambridge Analytica is owned by this, like, their parent company is called SCL Group, which weirdly is like a super high level British military contractor as well as a political campaign contractor. Which is fucked up in its own way. But because Cambridge Analytica is technically related to this European-based company, the data was processed in Europe.
Starting point is 00:53:01 Right. Because of that, we technically, or so these lawyers are arguing in the suit, all the Americans whose data was processed by Cambridge Analytica have rights under European data protection laws. So in a weird twist of positive fate for everybody, you're saying potentially, like, I'll be able to go, or somebody will be able to go, and say, I want to know exactly what you have on me. I mean, you can do that now, but Cambridge Analytica, the files they're releasing are pretty opaque. So how this whole lawsuit started was David Carroll, the person who's now suing.
Starting point is 00:53:33 He requested, yeah, I interviewed him. Yeah, we have a story on the site. But he, he asked Cambridge Analytica for his data, his voter data file. They sent it to him. And it had quite a few data points, you know, about his beliefs, all that stuff. But it seemed pretty incomplete. Because Cambridge Analytica has boasted numerous times that they have about four to five thousand data points on each person in the US, each American.
Starting point is 00:54:01 But his had maybe like 13 or something like that. So he sued for the entire voter file, because under European laws, if a company is a data processor like Cambridge Analytica, they have to show people, if somebody wants to know what data they have on them, like if I want to know what data they have on me, I have that right. Right. Yeah. Yeah. No, I mean, I think that's, and that's how it should be.
Starting point is 00:54:23 But I do, I do agree with you that there's been this, I mean, we have been in this kind of whirlwind of progress where there's been very little thought to, like, what do you do in a worst case scenario? Or what is the right way for these services to be? What is the right way for, you know, a platform as big and as vast as Facebook. You know, sorry, with as many users and as vast
Starting point is 00:54:51 and, like, capabilities as Facebook, like what is the appropriate amount of, like, data that people should have access to? Or, like, how do you store it? How do you, like, take care of it? How do you make sure nobody abuses it? And I think companies should be thinking about that when they're not as big as Facebook,
Starting point is 00:55:03 you know, when they start out, that should be a sort of thing that people are thinking about at every stage of the game, like even if you're a small startup. But Zuckerberg specifically said, I mentioned this earlier in the show, I tweeted about it last night, like he's like, I don't want to be the person sitting in the room making these decisions. This should be the community's job and, you know, I don't want to be sitting here in an office in California making decisions for the world But well, that's what he's doing so he can't I mean he's doing that whether he wants to admit that he wants to do it or not
Starting point is 00:55:31 Yeah, it's the reality, and I mean Facebook specifically, I think that's such a ridiculous claim, because Facebook had some sort of community rules program, I can't remember the name of it, but at some point there was, like, a page where you could go to and you could, like, post what sort of changes you wanted to have happen on Facebook, like vote on stuff that was happening at an organizational level, but no one participated, because who's gonna do that on Facebook? So they canned it. And Facebook has a ton of
Starting point is 00:55:58 seemingly arbitrary but purposeful rules that exist. Like you can't put porn on Facebook, right? Like, I can't post, even if I want to put my own nude photos on Facebook, like I can't put those up. Which is different from, like, Twitter or Patreon. I mean, Laura, my wife, has written about, you know, a woman on a mothers' group on Facebook posted a picture of her giving birth.
Starting point is 00:56:25 And the picture was taken down because somebody reported it and it's like, oh, they're like, this violates our policies. And it's like, okay, so they're like, we don't want a picture of a woman giving birth on here. They're willing to take that down. And they certainly have policies about content they do or don't like and rules and regulations about how you should use the service.
Starting point is 00:56:40 And the thing that's striking to me is, like, Zuckerberg acts as if these things have just bubbled up naturally out of a community. There must be, I mean, are there ethicists at Facebook? Do you know this? I mean, I don't, I don't know. I would hope so. I mean, I just think that it's such an obvious
Starting point is 00:56:59 and essential component. I mean, both from a user side, and also it's gotta be kind of a plus for the companies, because, I mean, they could, if, let's say, like, Facebook had an ethicist working for them, whose entire job just was to think about these sort of things from the beginning, they might have avoided all of this. Right. I mean, somebody would have said, hey guys, actually, if you're going to share user data, you've got to do it, if you really want to be careful, do it this way.
Starting point is 00:57:23 Right. But I mean, the reason why people don't do that is because, I mean, ethics are kind of on the opposite end of the spectrum from profit in a lot of these situations. Right, and also, like, they're solving engineering problems. I mean, it's like, oh, of course, this makes sense to an engineer to do X, Y, and Z, but, like, when it comes to, like, whether it's good for a human being
Starting point is 00:57:41 or a group of people is actually a question that an engineer can't really answer, right? It's not like a numbers problem. It's not like a code problem. It's a problem about, well how should people be with one another, right? Yeah, and I mean, I think that we should all start thinking about these companies in the same way
Starting point is 00:58:02 If you go to your settings, there's all of those little trademarks, the regulatory marks. So you want to slow, what you're saying is?
Starting point is 00:58:19 So you'd like to slow progress. You'd like to slow down Silicon Valley. You'd like to rip out the beating heart. I just, that's not what I'm saying. You want to rip out the beating heart of innovation. I just want to beat up Mark Zuckerberg. And progress. And then just put a bunch of red tape all over Silicon Valley, thus making it more difficult for them to test on human beings, to take the blood of teenagers, to
Starting point is 00:58:43 create mutants offshore. I mean, you just want to, you're actually an anti-X-Man. Yeah, I see that. You want people to just remain where they are and never be able to escalate their reality to the next level. Mm-hmm. Which I think is very offensive to me and to a lot of my brethren in Silicon Valley,
Starting point is 00:59:01 my man, Peter Thiel, my good friend, Peter Thiel, who I love a lot and I think is so great. And is... You and the New York Times got that in common, right? It's just the best. That's right. We have that in common. Okay, so let me ask you this.
Starting point is 00:59:13 Let's zoom back to the real world for a second. Facebook: dead? Facebook: over? No, Facebook isn't dead or over. You don't think so? No. It's uncool, though. I mean, it's uncool to cool people, I guess.
Starting point is 00:59:26 But I mean, if you go to any group of, like, moms, or maybe, I don't know, do kids like Facebook? I mean, moms, dads, grandparents, grandparents love it, for those people, they'll say it's cool. But that's the end of the line, isn't it? That's like fucking, you know, they also read MSN, you know? Oh god.
Starting point is 00:59:42 No offense to MSN, but you know, my dad's, like, home page is set to MSN. No offense to my dad, I'm a fan of my dad, but, like... But Facebook is too big to be a dinosaur. I mean, like, Facebook is Instagram, Facebook is Facebook Messenger, Facebook is WhatsApp. I do use WhatsApp, I do like WhatsApp.
Starting point is 01:00:09 Yeah. I use Instagram, I like Instagram. Facebook is so much bigger than just Facebook. Facebook is like one of the largest advertising networks out there, you know? But Instagram only gets so much out of me. I mean, Instagram only. But I mean, you were not the target audience
Starting point is 01:00:21 for Instagram. Why, I'm a dad. That's not the target audience for Instagram. I'm an influencer, I'm an influencer. Dadfluencer, slash Josh. What's up, there's gonna be a lot of good pics on there. Lot of tight pics, my abs, working out on the beach,
Starting point is 01:00:34 hanging out. I'm hanging out, hanging out with my bros Jake and Logan Paul, my two best friends in the world. It's everyday, bro. It is everyday, bro, for me, because I'm into that sort of thing and that's what I do on the internet. You vlog? You're a vlogger? I'm a vlogger and I hang with vloggers. That's the other thing about me.
Starting point is 01:00:49 I mean I've said it before and I'll say again YouTube is my life Mm-hmm, bro. Yeah, I mean you're when I think of team 10. I think of you. That's me. I don't know what that is But that is me But like but like I mean okay But that is a bad sign for Facebook, right? Like that, okay, yes, Instagram is popular, but Instagram is not Facebook. I mean, it is Facebook, but Instagram does not get
Starting point is 01:01:13 the same data from its users that it gets from the Facebook interactions, which are like, here's what you can do. Here's things you can do on Instagram, I'm actually gonna run down the list, okay? Post a picture of something. Identify the location of that picture. Identify the people in that picture.
Starting point is 01:01:27 Comment on a picture. Like a picture. Save a picture to a private bookmark area, which definitely is not for butts, but is for something else. It's like for stuff you wanna buy. Here's, as far as- Oh, now you can shop on Instagram.
Starting point is 01:01:39 You can shop on Instagram. The only thing, so I mean, okay, wait, hold on, are there more things? You can search for things on Instagram. There's the explore feed. Message people on Instagram. Wow. Stories. Yeah, direct messages, stories. Stories of locations as well, location stories. Wow, Instagram does a lot. Actually, no, Instagram does do a lot. Instagram is like Facebook, but what's it missing? Does it have groups? Okay, Facebook just has too many random things.
Starting point is 01:02:08 Yeah, yeah. They're big, they're like, the colorful posts. Oh, no, you can do that on, you can do that on Instagram. When can you? Yeah, it's like all those like, like in your, in your, yeah, in the stories.
Starting point is 01:02:20 Oh, yeah, in stories, but not just, like, on an Instagram post. I mean, you could, but I don't think you'd do it in the app. I'm just trying to think of, like, okay, but Facebook, yeah, Facebook has a bunch of random shit. I mean, like Farmville. Facebook has, like, delivery, Facebook has Facebook Watch.
Starting point is 01:02:33 You can pay people on Facebook? Yeah, you can pay, no, every time. They have an AI assistant, that one main screen that's like 27 tabs of random tasks. They have a marketplace. Oh, they have ad trackers that can follow you everywhere, but I mean, that's the same as Instagram. Yeah.
Starting point is 01:02:49 I am, my Instagram is just a bunch of random ads. What, you've written about your ads? You get a bunch of, whatever, like, literally today I got, like, are you depressed? They're like, don't do it. Don't kill yourself. Instagram thinks, Facebook thinks, I want to kill myself. Well, maybe you're depressed. How are you feeling now? I'm fine. Do you feel sad? No, I'm pretty
Starting point is 01:03:09 talk about it. This is an environment, this is actually an environment where a lot of people work through their stuff. This is a very small room. You saved a lot of... Yeah, well, this is temporary, okay? I mean, it's not. The other room is actually smaller than this, so I don't... Is that the phone booth? But anyhow. Okay, so. So what's the future, in your opinion? Of what? You were like, oh my, what a fucking question. Jesus, Paris. You paused. What is the future, in your opinion,
Starting point is 01:03:37 for, like, the evolution of social media? Facebook has been, for the longest, I don't know, you don't have to give me the, but what I'm saying is, Facebook has been like the, you know, the pinnacle of the success that one can have, that a company can have in social media.
Starting point is 01:03:55 change the way we think, or change the way we use these services, do you think, in the long run? I mean, if I'm being optimistic, I'd like to say yes, but at my core, I'm definitely a cynic when it comes to the internet and media, so I don't think anything is going to change. I mean, I think that people, maybe in the grand scheme of things, will start moving towards
Starting point is 01:04:22 platforms that seem to be more enclosed. Like Peach. What is Peach? You don't know what Peach is? What is Peach, yeah, I don't know what Peach is. You know what, I can't. Is that the app that you showed me that one time that seems like the best social network
Starting point is 01:04:35 that's ever been created? I think it's the worst social network that's ever been created. I mean, it's funny, you know, do you remember Path, which is Dave Morin, who's one of the guys who was one of the early Facebook guys who left, he started a company called Path many years ago. And it was like personal social networks. It was like a much more, like, private kind of thing.
Starting point is 01:04:52 He says, he claims, last night on Twitter, he's like, so many people have been asking me to bring Path back. I actually tweeted at him, I'm like, hey, let's talk about your new strategy. He's like, okay. But, like, you know, there are gonna be some new entrants into social media, right? Are there any new ones right now that you think are, could actually, like, Snapchat is what?
Starting point is 01:05:08 Snapchat, they did this Rihanna thing, they're all fucked up too. Like they can't get anything right. Who's doing it right? Who's good? Who's good in social media? Gab.ai? Oh God, Gab.
Starting point is 01:05:19 You're on Gab, you have a Gab account, right? Oh yeah, I Gab every day, I'm a Gabber. Everyday, bro. Yeah, on Gab. I'm basically a Gabber. Everyday, bro. Yeah. I'm a Gabber. I'm a Logan Paul. Logan Paul? Definitely on Gab. For sure, 100%.
Starting point is 01:05:29 Yeah. Maybe iMessage or Slack being the social networks that I haven't had a ton of problems or incidents with, that I use to communicate with people all the time. But they're private. It's private. Yeah, and I mean, I think it's going to be stuff like that that seems...
Starting point is 01:05:43 What's the vinyl of social media? That's the question. Like if Spotify is the pinnacle of... Live to the block. That's not bad. Because if Spotify is like the pinnacle of like streaming media if we think of like, oh, you could share your stuff and follow your friends and blah blah
Starting point is 01:05:59 and, like, you listen to music and it, like, scrobbles. It scrobbles what you listen to for other people and all that shit. But, like, so the absolute reverse would be like, I bought an entire album on a piece of physical media. I put it on a platter. I put the needle down and I listen to all six songs on each side all the way through.
Starting point is 01:06:17 What is that for, what is that for social media? Does it exist? You think it's blogs? God, I mean, is it like we're all gonna write books? For, we're passing notes in math class. Yeah, I feel like it would be like physically writing things. Like a newspaper or something. I don't know, is there like a bespoke farm to table,
Starting point is 01:06:34 kind of granola-like social media network that has yet to exist? Maybe, it's like four people. Maybe a zine. A zine, but a social zine. Yeah, but I don't think it's ever going to, I mean, it's probably been made by some hipster already. Yeah, outline.com.
Starting point is 01:06:48 Check it out. Yeah, I mean, basically, this is an audio zine. The world's first social zine. That's our new tagline. You don't even realize it. Well, because we described the dispatch as an audio zine. Yeah, an audio zine. It is like an audio zine.
Starting point is 01:07:01 I mean, zine is right for much of what we do, I think, in the sense that it's, like, meant to feel made by human beings and not, like, manufactured. Yeah, I mean, I think that that ultimately will be a trend in social media. But is there a zine version of that? Is there a zine in social media, I feel like? Is there, like, a thing that would actually... Sorry, part of the wall just fell down, so things are going really well here. And this is so fucking cheap, we have, like, a sound panel taped to a door with masking tape. I'll point out that we have a, like, ten-thousand-dollar studio booth in the next room, and we're currently using masking tape.
Starting point is 01:07:46 Yeah, this is a real scrappy episode of Tomorrow, and I'm frankly loving it. All right, Paris, any final thoughts, any parting words, anything else that you feel like the listeners of this program should know about the future? Oh, I mean, no pressure, no, no pressure there, just, uh, just predicting the future. Just predict what's happening. I mean, delete Facebook if you want, but if you're going to, no, because for one, I have to use Facebook to, uh, stay on top of Facebook, but also, yeah, you know, it'll be a lonely journey. But also, like, if you're thinking about deleting one social media app, you're gonna probably have to delete them all, or just, just read the privacy policies of everything.
Starting point is 01:08:33 I've been, I've been slowly but surely going through all the apps that are connected to Facebook, and, and definitely do that. Oh yeah, definitely do that. Go through, I'm like, oh wow, the, um, yeah, you can see the apps you've connected to Facebook. The Windows Phone, my Windows Phone connection from 2008, like, that's still there. Oh, they're all there. I still have, you know,
Starting point is 01:08:50 Microsoft siphoning my data. I still have weird-ass, like, personality quizzes from, like, 2008. Yeah, and I want to stress, I've never done a quiz on Facebook. Yeah, but you've done apps, it's the same thing. I'm sorry, I'm not saying that everybody's a dope for doing it, but you gotta be real. You're playing with fucking fire, doing a quiz anywhere, as far as I'm concerned.
Starting point is 01:09:09 It's not, it's not a first party. I took this quiz back when I was naive and did not think that Mark Zuckerberg was out to get me and steal my data. See, that's my business I never- I had hope and faith in the world at one point before it all- Well, that was actually my- That was gonna be my final- And my final question is like, are you feeling very negative and pessimistic about where we're headed or do you, is there a way?
Starting point is 01:09:29 I mean, I think the one thing that I think is kind of positive is, I guess, because I was just talking about this the other day, but in regards to David Carroll's lawsuit of Cambridge Analytica, it has the potential to be kind of a landmark case for data privacy. And it could, like, Europe right now is doing some really cool and interesting things when it comes to data protection law. And they have, like, groups of people, you know, in Brussels that are trying to think about all of the issues that are going to come up in the future in regards to data, like, privacy and the protections that we need. And honestly, I mean, because of the way the
Starting point is 01:10:11 internet works, it's not really separated by country, unless you're in China or Russia, maybe. But if their laws end up, I don't know, really working and actually causing these companies to change, it will change for us too, and we will have more protection and more privacy. Wow. Wow, an optimistic view. Also, it sounds like we should all just move to Europe, is what I'm mostly hearing. Awesome.
Starting point is 01:10:37 I mean, we could get an office in France. We can get one in Paris, mate. Yeah. It's a place. Great, Paris, that's fantastic feedback. I really appreciate that suggestion that we get an office in your namesake town.
Starting point is 01:10:51 I do love Paris. This is great. Thank you so much for doing this. Like this is very insightful. And frankly, that last piece right there, like to me, is actually makes me somewhat hopeful that there can be some change in fact. I think I don't know what else is going to happen.
Starting point is 01:11:04 Like, it's, we've been flying completely blind for so long. We've gotta do something. Yeah, I mean, I just think the thing is that all these companies are really coming out of Silicon Valley. And so American regulators or lawmakers aren't really going to want to put in regulations for these companies, because then they'll be slowing American businesses, like you just said.
Starting point is 01:11:27 Like I want to. But I mean, people who are outside of the country and have no connection to Silicon Valley, they kind of see it like it is. Well, you know, Silicon Valley may not be the fount of innovation for the, you know, the rest of time. Like, it may very well be overtaken by another place, another,
Starting point is 01:11:48 and there'll be other issues too. Paris, maybe, you know, Paris, France, could be the new, what is it, Station F? I don't know. People in France, they really do work hard. I think that's the thing. I mean, don't get me wrong,
Starting point is 01:11:57 I love the French, but they're not, you know, not like nine-to-fivers. That's true. No, they're very much the opposite. They're like eleven to three, ten to two. Ten to two. And not on Friday.
Starting point is 01:12:09 Exactly. All right, Paris, thanks so much. And you gotta come back and do this again. I will make the trek five steps over here at some point. As soon as you uncover more horrendous secrets about the internet, you have to come and talk to me about them. Okay, great. And also put them on the internet, because it's important to us.
Starting point is 01:12:23 Okay, thanks. Thanks. Bye. Well, Ryan, that was a very instructive conversation. I feel like we learned a lot. Are you upset now? Yeah. I think I'm going to have to take a couple more of the Gilligan's Island style binges to get over this. I don't know. The final note, the final mention there, the final sort of thought, I feel like, you know, I have a tiny bit of hope that somehow we can be better as people and, you know, just live together, love one another, you know. Just try to bring some peace and harmony to the world, you know, using the social graph. I think it's the least that we can hope for.
Starting point is 01:13:02 So wield the mighty power of the social graph to control the democracy, and the mighty power of the keyboard to type out a nasty comment. To funnel capital to the top. Someone is wrong on the internet. Alright, anyhow, that is our show for this week. We'll be back next week with more Tomorrow. And as always, we wish you and your family the very best, though I have just been given the data dump on their quizzes. And I gotta tell you, they are all Trump supporters, and they want him to build the wall.
