The Daily Show: Ears Edition - Tristan Harris - Facebook's Danger to Society

Episode Date: October 11, 2021

Tristan Harris, cofounder and president of the Center for Humane Technology, discusses a whistleblower's revelations about the inherent toxicity of Facebook's business model.

Transcript
Starting point is 00:00:00 You're listening to Comedy Central. When 60 Minutes premiered in September 1968, there was nothing like it. This is 60 Minutes. It's a kind of a magazine for television. Very few have been given access to the treasures in our archives. Really? But that's all about to change. Like none of this stuff gets looked at. That's what's incredible. I'm Seth Doane of CBS News. Listen to 60 Minutes: A Second Look, starting September 17th, wherever you get your podcasts.
Starting point is 00:00:36 Welcome back to the Daily Show. My guest tonight is Tristan Harris of the Center for Humane Technology. He's here to talk about Facebook and whether the social media giant can be both responsible and profitable. Tristan, welcome to the show. Pleasure to be here with you. Your face is one that I both enjoy seeing, but at the same time it brings me a lot of, sort of, PTSD, because you said a lot of things in the documentary I think a lot of people would probably know you from, The Social Dilemma, where you just laid out how social media is fundamentally designed to turn my brain into a certain type of mush and then just leave me feeling shitty about myself, but constantly needing to
Starting point is 00:01:16 re-engage with the product. Yeah. Yesterday everything was down, or most of it was down, you know, Facebook was down, Instagram was down, WhatsApp. Were you popping champagne when that was happening, as somebody who has been really hitting on the idea that everything should be reined in? Well, I think it certainly gave people a taste of what it's like to just not have this thing in our lives. And I think it's so interesting that it happened the day after Frances Haugen, the whistleblower from Facebook, came forward, because this is the largest release, I think, that we're going to see in Facebook's history, because after this, there's probably never going to be research at tech companies that's done on identifying the harms of their products.
Starting point is 00:01:57 Why do you say that? Well, because Frances, the whistleblower, basically took photos of all of this research, tens of thousands of documents. And once that happened, it's not just that the harms in The Social Dilemma are true, it's that Facebook knew that they are harming teenage mental health. They know that they drive anorexia and eating disorders in children. They know that it drives political parties in Europe, India, Poland, Taiwan, Spain to go more negative and divisive. So the key of the whistleblower's insights is that they know it's harming society, but they don't change, because they still prioritize profits over safety. So let's try to break this whole thing down. Yeah. You know, I agree with almost everything that you say. You know, I sit there, I see it in my own life, I see it with my friends, I see it in my society.
Starting point is 00:02:50 You know, I see what these apps do to us. I always ask myself the question, though, you know, I think of it like, as humans, back in the day I had to get to you to give you my opinion, which would, I think, generally lessen the amount of conflict between people, because you were in that village, I was in that village. It took a while before our villages had to meet head on.
Starting point is 00:03:18 Whereas now, I can have an idea in New York that can offend somebody in India, just because of social media. Yeah. So is it that social media is the problem, or is it just that humans are the problem, and social media amplifies the problems? It's a great question. I think the problem is that, you know, the worst of human nature exists in all of us. Okay. But the best in human nature also exists in all of us. I mean, look, you've got the Janjaweed and child soldiers in certain places, and you've also got these peaceful tribes that have lived in harmony with nature for thousands of years, you know, in whatever way. But when you look at social media, every day, it points a trillion dollars of compute power at finding the next fault line in society. So when you open up the feed, the way it does that is it takes the supercomputers of those trillion-dollar market cap companies and
Starting point is 00:04:07 calculates, okay, what would most likely get you not just to look at it, or click, or share, but to comment on it. Facebook recently, in these revelations, it was found that they were sorting for negative comment threads. So when you have an AI pointed at finding the next fault line in society, with hyper-microtargeted, personalized precision, the thing that will make you hate your fellow countrymen and women, and then you run society through that for 10 years, it's no surprise that, whether it's vaccines or masks or anything, we would be this divided. And the key is that the more it polarizes citizens, the more it forces politicians to actually cater to a more extreme base that never resolves.
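As an illustration of the ranking mechanism being described here, a minimal sketch in Python, assuming invented posts, weights, and numbers; nothing below is Facebook's actual model or data:

```python
# A minimal, hypothetical sketch of engagement-weighted feed ranking.
# Posts, weights, and numbers are invented for illustration.

def engagement_score(post: dict) -> float:
    # Weight deeper interactions (comments, shares) far above passive likes,
    # in the spirit of "meaningful social interaction" style metrics.
    return (
        1.0 * post["likes"]
        + 5.0 * post["shares"]
        + 15.0 * post["comments"]  # heated comment threads dominate the score
    )

posts = [
    {"id": "policy_paper", "likes": 900, "shares": 40, "comments": 10},
    {"id": "attack_on_opponent", "likes": 300, "shares": 250, "comments": 400},
]

# Rank the feed purely by predicted engagement: the divisive post wins,
# even though far more people liked the policy paper.
feed = sorted(posts, key=engagement_score, reverse=True)
print([p["id"] for p in feed])  # -> ['attack_on_opponent', 'policy_paper']
```

Under this kind of objective, the divisive post outranks the policy paper by a wide margin, which is the dynamic the political parties describe a few exchanges below.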
Starting point is 00:04:50 We don't ever get synthesis, which means democracy... Which means democracy, that's like you just throw a wrench, you know, a wrench into the gears of democracy. And when people see it's all fallen apart, they then look to the most extreme leaders to try to get them out of it, because they feel like the system itself is broken. Yeah, well, I mean, if the system's not delivering results to you, like there's this pothole, or there's inequality,
Starting point is 00:05:17 or there's climate change or social justice. That's the thing. This is a bipartisan issue, because it takes whatever you feel and then it shows you a more extreme version of why you should be angry about that thing. And then, again, that means you'll never elect people who have some kind of synthesis of what should be done. The key evidence in the Facebook revelations was that political parties in Europe said to Facebook, we know you changed your ranking system. Right. And Facebook's like, oh, come on. People had this conspiracy theory all the time. Yeah, yeah, yeah.
Starting point is 00:05:49 Oh, you think we changed it? Tell us what you think happened. And the political parties said, no, we know you changed it, because we used to be able to publish policy papers. And then they said, when you changed the algorithm, we no longer got traffic for those papers. We noticed we only got traffic when we said negative things about our opponents. Interesting. And we don't want to do that, but we don't have any other choice. And what that shows you is, in the political marketplace of ideas,
Starting point is 00:06:16 we don't have an invisible hand where people are just choosing the conditions that politicians have to cater to. It's so interesting that you say this, because I had a conversation with a friend, and then I had another conversation with someone who I just know, who's conservative, right? And what was interesting was, the friend of mine was going like, hey, you know, I was disappointed that when you had this thing on your show, you said this, but then you didn't present, like, a nuanced view on the whole thing. And I was like, what are you talking about? I said exactly that. And then he was like, no, you didn't say that. I
Starting point is 00:06:53 said, no, I did say that. I said, where did you watch the show? He's like, oh no, I saw a clip online. Exactly. So, like, I found, just for me as Trevor, some of the clips that I make, someone will cut them the way they want to cut them. And if that clip really inflames conservatives, that's the clip that will go to them. But then if there's a clip that liberals enjoy, that's the clip that goes to them. This happens to me. Exactly. Well, so this is actually really important. It doesn't just change what political parties do to get elected.
Starting point is 00:07:30 It also changes what publishers or what media do. So when you're doing your show, right, you have so many people who are tuning in right now, listening to you, but then you also know that you get millions more people watching the clips afterwards. Right. So that forces, when the Zuckerberg digital hand says, we're going to reward, you know, negativity, or whatever personalizes to get people angry, each of those clips, just like you said, the thing that outrages conservatives goes directly to that group, and they don't get the context, right? And the thing that goes to liberals, same thing. But again, what it's doing is making us hate each other. It's actually not about censoring conservatives or misinformation. It's actually just viral engagement that's the problem. That's like the thalidomide, the DDT, for our democracy. That's the thing that primes it to blow up. Yeah, essentially.
Starting point is 00:08:21 Exactly.
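To make the clip-routing dynamic in this exchange concrete, here is a hypothetical sketch; the audience segments, clips, and scores are invented for illustration and are not any platform's real system:

```python
# Hypothetical sketch of per-audience viral distribution: each clip is
# pushed to whichever audience segment it is predicted to engage (often:
# outrage) most, so each group sees the other side's most inflammatory
# moments without context. All names and scores are invented.

clips = {
    "clip_that_inflames_conservatives": {"conservatives": 0.9, "liberals": 0.2},
    "clip_that_pleases_liberals": {"conservatives": 0.1, "liberals": 0.8},
}

def route_clip(predicted_engagement: dict) -> str:
    # Deliver the clip to the segment with the highest predicted engagement.
    return max(predicted_engagement, key=predicted_engagement.get)

for clip_id, scores in clips.items():
    print(clip_id, "->", route_clip(scores))
# clip_that_inflames_conservatives -> conservatives
# clip_that_pleases_liberals -> liberals
```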
Starting point is 00:08:48 Okay, but now, let's say if Mark Zuckerberg was right here. First, I would fight him, because, as you said, he's spoiling my life. But secondly, he would argue, he would say, look, I'm not doing that. I'm just trying to get engagement. I'm trying to foster the positive as well. What would your argument against that be? So they want to claim that they're just a
Starting point is 00:09:10 mirror for society. Yes. You know, so first of all, it is totally true that polarization has existed in our society way before them. But when what wins, again, is the most controversy, the most outrage, which do you think is going to happen? Are people going to become more synthesis-oriented, or more combative? They're defining the reality. More than that, we actually have new evidence that Facebook defines the reality; we've been saying that for a while. What just came out a week ago is Facebook's Project Amplify, where they want to sow positive news about Facebook. So they actually show people positive stories, and it can be targeted to you. So if you love horses, you'll see the story about how Facebook helps someone who lost their
Starting point is 00:09:55 horse on a farm. That's amazing. Or whatever. And so again, it's like, you know, I think Mayer Rothschild said, give me control over a nation's money supply, and I care not who makes its laws. But if I'm Zuckerberg, I say, give me control of people's attention, beliefs, and behaviors, and I care not about anything else. In a way, Facebook is a for-profit sort of parallel government, like an AI that's controlling and shaping people's beliefs, thoughts, and behaviors, and again, with artificial intelligence. So it's like it's your brain. Well, I mean, we saw how powerful that was with the election, let's say
Starting point is 00:10:31 Brexit, we saw Brexit, we saw the rise of Trump, we saw that even in, like, Trinidad. It was like all these tools that that company Cambridge Analytica was using, they were using those Facebook tools to shape people's realities. Exactly. They could get people to not vote, ironically, not just vote, but not vote. So then what is the answer? Do you switch to China's model, where China goes, hey, no more than 40 minutes of TikTok in a day, and no video games, we're going to just shape society? Do we switch to that model? It seems dystopian. Right. It's a great question. I would say this: currently we're faced with what appear like two bad options. You allow these business models to continue, you take your hand off the steering wheel, let this keep going, right?
Starting point is 00:11:14 And it basically just breaks democracy into chaos. Or it seems like there's this other model, China, and China basically controls its internet. It almost seems like Xi Jinping saw The Social Dilemma, because over the last two weeks... Yeah, he's been pretty intense. He has, he has, actually. They've changed the Chinese regulations so that for teenagers, if you're under the age of 14 and you use TikTok, you only get to use it for,
Starting point is 00:11:36 I think it's 40 minutes a day. And they show kids science experiments, things like how to be astronauts. Yeah, like how to be a doctor, right. And then in our case, in catastrophe land, we allow TikTok to reward whatever just gets the most engagement, like the devious licks challenge, how to burn down your school, or become an influencer, right? So you get dystopia in the form of oppression, where you're controlling it. The whole point of this conversation, and I think what Frances Haugen, the brave whistleblower who came forward, showed, is that we have to show there's a third way that's not anti-technology,
Starting point is 00:12:12 it's actually democracy plus technology, consciously employing technology to make stronger democracy with more civic participation. That's what we want to create. It's not anti-big-tech, it's how do we make sure technology and open societies allow us to create something that's more humane and positive for people. And that's the change that has to come from this. Yeah, but how's the change going to happen when people in Congress don't even know what an app is? I mean, you saw there was a senator who's like, oh, no, Finsta, are you gonna, are you gonna delete Finsta?
Starting point is 00:12:48 If those are the people making the laws, then I argue that the laws will never be made. And then big tech is influencing which laws are even proposed. Well, I want to say, I feel so hopeful, and it almost made me emotional today, watching Frances Haugen, the whistleblower, because I honestly think that she turned the tide, and we are going to have regulation that's coming from people like her, from groups like Accountable Tech and my organization, the Center for Humane Technology, where people who understand these issues can get us to a place where we're not ranking for what creates crazy town in society. And I think that's what we have to do.
Starting point is 00:13:25 And I think one of the last things to say is that if we don't do this, it's kind of a national security threat. We used to say, if Russia or China were going to blow up the Congress, we have to have continuity of government. We have to make sure Congress keeps functioning. This is almost like an EMP attack on culture and on our ability to have a functional democracy. So I think if we see this as a national security threat,
Starting point is 00:13:52 if we see this as urgent for our children, we can make a bigger change than what's been proposed so far. Well, as you said, I hope that the whistleblower has started a tidal wave that can hopefully, you know, get things moving, and I hope you'll keep talking, we'll keep talking about it as well, because I do not want to be part of a world where people want to punch me for not knowing the full context of what I said. Punch me because I said something, but know what I said in full. Thank you so much for joining me on the show. I appreciate you so much. We're going to take a quick break, but we'll be right back after this.
Starting point is 00:14:31 Yeah, for real, man. I, um. That's great. Thank you so much.
Starting point is 00:15:01 It's like, that's the thing that blew my mind, is where even some of my friends were going like, I can't believe you. And then I was like, what are you talking about? Right. Well, the business model is to take things out of context, because then it spreads farther, and the more personalized it is... So I've had people from one show, literally one show, the same show, and one goes, in this show you said this, and then the other person goes, in the show you said the opposite. I'm like, guys, this was a conversation, it was ideas. And so now I've learned, it's like, yeah.
Starting point is 00:15:33 and they say something that, like, you're like, that seems really wrong. And I see it and I'm like, wow, out of context, even I think that's really wrong and crazy, but I know them, so I know I could talk to them and say, like, I know you don't actually mean this, right? But when we don't have our relationships, like, when we don't actually have a physical relationship, we lose that background trust. Well, you know, one of the funniest things I saw the other day is, I opened up my Instagram, and then Instagram popped up a notification, and it was like, hey, why don't you make a second account so that you can be, like, more personal with some of your close friends? And I was like, wait, hold on, hold on, what are you trying to tell me here? So it's almost like Instagram was going, this account is where you're going to be like, ooh, look at me. And then why don't you make another account, and this one's going to be different. But it was interesting that it was almost acknowledging in and of
Starting point is 00:16:29 itself that, hey, this thing here is not your friend. This is not, these are not your people, these are not your, you know? And so to that point, I want to see my best friend's Instagram, but I want to see my best friend, but they go like, yeah, but DJ Khaled, when he was running, you watched the clip three times; your friend, you just saw it and liked it and you moved on. And they're doing that because they'll just do whatever will keep you and everyone else engaged more. And so if the celebrities, if the people who are further from you, hold more of your attention, they're always going to show that over your close friends. And that's what they showed in the Facebook Files, is that, like, over and over again, they had a choice between doing what was good for people, including, like, showing more friends and family and less of that other stuff, but then if that dropped engagement, it's like, we can't afford to make that change. So there's things that they know. You know what it sounds like to me?
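A hypothetical sketch of the trade-off being described here, assuming invented posts, numbers, and boost factor (this is not the company's real ranking code): a "friends and family" boost is easy to express, but under a pure engagement objective the celebrity content still wins.

```python
# Hypothetical illustration of the friends-vs-engagement trade-off.
# Posts, watch-time numbers, and the boost factor are invented.

posts = [
    {"id": "best_friend_photo", "source": "close_friend", "predicted_watch_s": 4},
    {"id": "dj_khaled_clip", "source": "celebrity", "predicted_watch_s": 45},
]

FRIEND_BOOST = 2.0  # the kind of re-weighting the company could choose to ship

def score(post: dict, friend_boost: float = 1.0) -> float:
    boost = friend_boost if post["source"] == "close_friend" else 1.0
    return boost * post["predicted_watch_s"]

# Even with a 2x friend boost, the celebrity clip still leads the feed,
# because the engagement gap dwarfs the boost; closing it fully would cost
# the engagement the metric is built to maximize.
ranked = sorted(posts, key=lambda p: score(p, FRIEND_BOOST), reverse=True)
print([p["id"] for p in ranked])  # -> ['dj_khaled_clip', 'best_friend_photo']
```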
Starting point is 00:17:26 how much a person drinks. You know, where they go like, you can't just keep serving, if you see the person is, like, blacking out. Exactly. There's some responsibility there. You've got to be like, hey, there's that element of responsibility. And there's a human relationship there. Yes, exactly. Because you see them and you can empathize with them as a person. You see they're falling into their glass or something. Right.
Starting point is 00:17:49 But the opposite is true with the tech companies. First of all, Facebook has never sent me, like... Yeah. And have you ever tried, like, not using it? Oh yeah, yeah, I tried, after watching The Social Dilemma. Okay, so what did you notice after, in that week, what happens? So this is what happens when you do, or if you do: first of all, mysteriously, they'll send you an email. Right, right. I'm off social media, I've seen the documentary, I want to change my life. Then I get an email. I don't subscribe to any emails. Right, right. Then the first thing I do is unsubscribe. I'm like, what are you talking about? I don't subscribe to any emails.
Starting point is 00:18:29 Hey, just so you know, Tristan commented on you. Exactly. But I'm like, no, but I've never subscribed to this. I never get emails. Now all of a sudden Facebook or Instagram is emailing me, and curiosity kills the cat. Exactly. And then I log in, and man, it'll make sure the first picture I see engages me in some way. It'll be something heinous, you know, it'll be like, oh, look at what's happening to these people in this country, where I'm like, shit, this is terrible. Well, they know they need to do that extra spike, because if you haven't used it for a while, they have to show you the thing that they know is going to get you. And like you said, if you don't use it, they start dialing it up like a digital drug lord. Like, well, let me try, like, these five things.
Starting point is 00:19:14 And they'll start emailing you, notifying you, in ways they'd never try to email you before. But when you stop using it, it's like, they're a drug lord. They need you to come back. And they have to figure out, like, oh, you stopped using it, how can I get you to come back a little bit more? Yeah. Yeah, sometimes they're like, hey, just checking on your account. We noticed someone might have... Watch the Daily Show, weeknights at 11, 10 Central on Comedy Central, and stream full episodes anytime on Paramount Plus.
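The escalating re-engagement "dial" described in this exchange can be sketched as a simple policy keyed to how long a user has been away. The thresholds and message types below are invented for illustration, not any platform's real policy:

```python
# Hypothetical sketch of an escalating re-engagement nudge policy:
# the longer a user stays away, the stronger the lure chosen.
# All thresholds and message types are invented.

def pick_reengagement_nudge(days_inactive: int) -> str:
    if days_inactive < 1:
        return "none"                              # still active, no nudge needed
    if days_inactive < 3:
        return "push: 'Tristan commented on you'"  # social-obligation hook
    if days_inactive < 7:
        return "email: 'See what you missed'"      # digest of high-spike items
    return "email: 'Just checking on your account'"  # security-flavored lure

for days in (0, 2, 5, 14):
    print(days, "->", pick_reengagement_nudge(days))
```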
Starting point is 00:20:14 This has been a Comedy Central Podcast.
