What Now? with Trevor Noah - Control Your Scroll with Jiore Craig

Episode Date: August 1, 2024

When it comes to disinformation, our first instinct is often to ask is this real or fake? But disinformation expert Jiore Craig says that way of thinking misses a much bigger problem. Trevor, Christiana, and Jiore examine how political powers and corporations use information channels to push their agendas, and how we can start pushing back on the systems enabling lies. Learn more about your ad choices. Visit podcastchoices.com/adchoices

Transcript
Starting point is 00:00:00 Let me be the one who said there are some really fun Bible verses. Oh, Trevor, that's how they get you. That's how they get you, Trevor! Though I walk through the valley of the shadow of death, I will fear no evil. I used to say that as a kid when I'd go to the bathroom late at night. That's what I would say, because I was terrified of going to the toilet late at night. And then my mom was like, this is the scripture that you will recite. And then I would say it.
Starting point is 00:00:22 And I was like, I will fear no evil. And I feared evil. Let me tell you something, even the fact that I had to say I fear no evil made me fear evil more, because I was like, I wasn't even thinking about evil, I was just gonna go pee, but now I feel like I'm like taunting the devil and whatever demons might be waiting for me on my little late night pee quest. It was a very, very stressful time. Sometimes I'd just stay in bed and like pee myself. I'd just be like, you know what? I shall fear evil on this occasion and I'm not going to go anywhere. This is What Now with Trevor Noah. Well, happy podcast day, Christiana. Happy podcast day, Trevor.
Starting point is 00:01:16 Where are you in the world? You're always in LA, right? It's like it was London. I was in London and then I was in Oslo. Oh yeah, actually, yes, yes, yeah. Also, there's like a recurring bit because people keep messaging me that listen to the podcast. Someone stole all my lemons again, Trevor. So I'm not too happy. No, you're lying.
Starting point is 00:01:33 Raided, raided my lemons. Raid. It's actually really terrible. Did they do this when you were away or did they do this? They did it when I was away and I got back and I thought my trees were sick. That's one of the downsides of people knowing that you live somewhere and then also knowing that you have something valuable. Yeah.
Starting point is 00:01:49 Yeah. And it's just like... So now like they, when you're like, you're in Oslo, they're like, it's lemon time. Just go with that woman's lemon trees. And they're all going, it's so greedy. You know, it's just like, why take all of them? And the ones they left are really high up. And I'm like, do you know the optics of a black woman climbing a tree?
Starting point is 00:02:05 Like, I don't want to do that. So apart from like all my lemons being gone, I'm good. Where are you? So I just traveled from, I was performing in Monaco for the first time ever. Yeah. Monte Carlo, the casino of Monte Carlo. Yeah, and then I traveled now, I am now in the, I wouldn't say beautiful city of Berlin, when you're looking at it, but honestly, I think it's one of my favorite cities in the entire world. Interesting, and why is that? I think it's, you know what it is, first of all,
Starting point is 00:02:41 there is more history in this city than most places in the world. True. Obviously, it's a history that is fraught with a lot of pain and suffering, and you know, like if ever there was a cautionary tale, Berlin is like the home of one of the most important cautionary tales we can ever talk about, you know. Like it was even interesting going on like a tour here with one of the tour guides, and they were talking about how Germans have been struggling to have conversations about what's happening with Israel and Palestine. Because there's many people who feel like the Israeli government right now is on the wrong path, but then we're afraid to say something, because if we say anything against anyone who's Israeli or Jewish,
Starting point is 00:03:26 then it brings up what we did. Absolutely. So do we have to support them? They're so afraid of repeating their previous mistake that now they're almost unable to warn somebody that they are making a terrible mistake. I get it. Do you know what I mean? Yeah, I feel like they should probably just sit it out.
Starting point is 00:03:42 You know, maybe just maybe sit this one out, leave it to France and England and you know, countries that also have complicated histories, but like their hands aren't as recently bloodied. I'll tell that to them when I meet them, when I talk to more people. I'll just be like, my friend Christiana says, sit it out. Just yeah, Klaus, you just sit it out. And if someone asks you about Israel, Palestine, you just say, you know what, I'm just focusing on becoming a better German. And, you know, I just don't want to make any more mistakes in life, yeah? Just keep it moving and, you know, hey, cool, cool, I hope you're enjoying the Olympics.
Starting point is 00:04:17 Hey, another topic. You know, it's actually funny. Being here has taken on a different meaning for me because Berlin is also one of the historical hubs of misinformation. And that's really what today's podcast is about. It's like, not just what is true and what isn't true, but really about like where misinformation lives, and sort of trying to dissect or trying to understand where it flows from, as one of my friends who is, um, I wouldn't say chronically online... but I guess I just did say chronically online. You're kind. No, just say I'm chronically online, I'm gonna own it. I am, I've got a problem. I wonder, do you feel in control of the information that you see online? Oh, no, I don't.
Starting point is 00:05:06 I feel constantly bombarded because I think even if I'm not scrolling on Instagram or TikTok, I'm getting it in the group chats, I'm getting it in my text messages, I'm getting it in my family chat. It feels like I can't avoid being constantly inundated with information. That's interesting. And I'm not necessarily deciding where that information comes from. My mom is sending me something that's been forwarded 4,000 times.
Starting point is 00:05:29 I can't even say where the source is. So I feel like I'm bombarded all the time. You know, I've been on this, I guess, a little bent recently where I've tried to go online without an opinion. All I'm gonna do is look at what is online and then see how people are responding to it. And it's amazing to see how many things online are now basically either true because people want to believe in it or fake and false because
Starting point is 00:05:59 it doesn't support something that they've previously believed. It's almost like we live in a world now where people are not open to the idea that something could come from somewhere to manipulate them. Like I don't even know if fake information sums it up anymore. Because sometimes it's not fake, but it's who's sending it to you for the specific reason that they're sending it to you that makes it more concerning.
Starting point is 00:06:22 Yeah, everything is like super editorialized. And then it's like, let's give the example of like, since Kamala's going to run for president, there's this like very online discussion about her record as a prosecutor, right? So one side are like, she locked up all these black men. She did these horrible things to parents. She's a cop. We can't vote for her. And the other side is like, well, actually no, she was a really progressive prosecutor.
Starting point is 00:06:48 Both to me are like overcorrections, because like a progressive prosecutor is kind of an oxymoron. It's just like, oh, the compassionate killer. Yeah. They kill very compassionately. The compassionate killer! Come on. Yeah. Of course. Like the job is punitive. To me, that's an overcorrection.
Starting point is 00:07:07 But the other side is probably not looking at her record effectively and how she talked about the fact that, you know, black people want to feel safe in their environment. They want law enforcement that lives with them and comes back home to them, et cetera, et cetera. But now I'm looking at both sides and I'm like, who's right? They both seem very informed and they both have like the soundbites. People include like things that the person has said, sometimes without context. So I'm in a place where I'm like, so what is her actual record as a prosecutor? I couldn't confidently tell you right now.
Starting point is 00:07:40 And did she say it? I think that's what's a little scarier to me. Like one of the big, you know, one of the big tweets that's now causing chaos is Elon Musk, who is notoriously neutral and he only recently decided to support Donald Trump after he was shot, which by the way, can I just say for the record, bullshit. You know, like if you're gonna support somebody, I know this is a tangent right now,
Starting point is 00:08:01 but indulge me on this. If you're gonna support somebody, have the balls to support them. You know what I mean? Just say, I support Donald Trump. Don't like look for like, Elon Musk is like, oh, I'm gonna stay out of the election and I don't believe in either side. And then someone tries to shoot Donald Trump
Starting point is 00:08:18 and you're like, now I support him. So please explain to me how somebody getting shot at has improved their ability to be president of your country. Like just say you wanted to support him and then after he was shot at, now you felt like you could support him publicly. I just think it's dumb, but anyway,
Starting point is 00:08:36 he posted a video, or he retweeted it, right? And it was a voiceover of Kamala Harris basically talking about why she's running now and how she took over from Joe Biden because he's senile. And it's not her, but the voice is, if you did not know any better, you'd be like, this is her. Is it an AI voice or is it some, like,
Starting point is 00:08:58 Yeah, it's an AI voice. No, it's an AI voice. And yeah, and then that video's online and it looks like a campaign ad, but it's obviously disparaging. And there are a bunch of people who think it's real. They go like, wow, how could she say this about Joe Biden after he was so kind to her? It's so interesting to see it because you go, we are a few months away from an election and here is a person who is, you know, the owner of one of the largest town squares, if we're going to call it that.
Starting point is 00:09:27 And he is actively posting that video, not saying this is parody, this is fake, this is a joke. He's just like, this is amazing. But isn't it scary that he was fooled? Do you know a man- I don't think he was fooled. You don't think he was? I don't think he was fooled. Because he's been fooled by a lot of things. And he actually has. Do you know what I mean? I'm seeing a lot of people who are relatively savvy, people working in cybersecurity, working in tech. I'll give you an example.
Starting point is 00:09:50 There was this video of Mark Zuckerberg that went around, this photo, and he looked really hot and everyone was thirsting over it. Are you talking about the beard pic? Yes! It's the beard and the chain. Yeah, and everyone was like, ooh, Mark, those billions are really making you look good. And it was like a fake AI, like they made him look better. Oh, wow.
Starting point is 00:10:11 People were lusting after an AI, Mark Zuckerberg, that doesn't exist. And these are pretty savvy people who are online all the time. And that's the thing that makes me very worried. It's actually funny that you say Elon Musk has also been fooled because I'm constantly having conversations and I'm always arguing about like where the responsibility lies. And more often than not, people will say, hey, verify what you see online. If you see something online, try your best to research it and figure out where it's from and who's sending it.
Starting point is 00:10:46 And this is all beautiful-sounding advice, but I'm sorry, there is no human being who can verify even a single scroll of information on their timeline. Just literally scroll your finger once, count all the videos or the tweets or the posts or whatever you count them as, and good luck verifying them. Good luck verifying when that person was shot. Good luck verifying when that person was fired, when that person was punched, when that city was blown up. It is impossible. And I realized social media or the tech industry has taken a page out of the oil industry's
Starting point is 00:11:24 books. And what they've done is they've tricked society into shifting the blame onto us as individuals instead of keeping it on themselves. Right, like one of the most sinister things we learned about the oil industry is way back in the day, you know, they're making oil, they're pumping it out, and then they start realizing they can make byproducts,
Starting point is 00:11:43 right, and one of the byproducts is plastic. And so they go, well, how do we get people to use plastic instead of using paper bags? And then what they come up with is, well, we say that this is good for recycling. Because if you use paper bags, you're killing trees. If you use plastic, it's good for the environment. And they run this whole campaign. And then as we all know, we live in the present, there's plastic everywhere. It's not recyclable, it's not reusable.
Starting point is 00:12:07 And then they turn it around on us and they hire consultants to switch the narrative to now be, you need to recycle. You need to move your bottles and your caps, and you need to make sure that the plastics are in the right containers because it's you. You are the reason that the turtles have a straw in their nose.
Starting point is 00:12:25 It's not us who produce it, it's you. And that is genius because the likelihood of every individual coming together to fix pollution and recycling and everything that's happening in the oceans and landfills is almost zero compared to us just going to the source and saying, hey, you're not allowed to produce this kind of plastic anymore.
Starting point is 00:12:45 Tech companies have done the same thing. Tech companies have been brilliant in making it about us. On the app, they tell you, you should be the one who looks at whether or not your kids are getting fake information. It's really slick how they've shifted it to us. Trevor, do you know what it makes me think of? It makes me think of racism because like pre-civil,
Starting point is 00:13:04 during the civil rights movement, the focus was always like systems and policy. It wasn't about treatment so much. It was like we want to integrate schools. We want to make sure that anyone can go to any water fountain they want to go to, black people to have access to the spaces that are public spaces. And then post-civil rights, when some of those gains were won, white people made it about individual interactions. Yes. Right? So, what should I say and what should I do? Yeah, that's brilliant actually.
Starting point is 00:13:35 Hey guys, we haven't dealt with the aftermath of redlining. We're talking about wealth and reparations and the systems and the policies that actually govern a sinister thing are more important than how like individual actors behave. And like something like racism, we've just made it be like, okay, let the nice white liberals who feel guilty have book clubs when really it's like, oh no, this is about like public policies, about education policies, about real estate.
Starting point is 00:14:01 It's all about these like bigger things that are like really the government's job. It's exactly the same thing. Racism, recycling and retweets. That's what we're going to call it. That's what it is. Well, I'm really excited because today we're going to be sitting down with Jiore Craig. And Jiore is a resident senior fellow at the Institute for Strategic Dialogue.
Starting point is 00:14:19 But more broadly, she's an expert on disinformation and how big tech policy and social media influence elections. And yes, there are elections coming up. But don't worry, this is not all doom and gloom. It's actually not defeatist. It's a very sober look at what can be done. So here it is, Jiore Craig, and of course, my good friend, Christiana Mbakwe-Medina, who is now lemonless thanks to you, thieves.
Starting point is 00:14:54 So, Jiore Craig, welcome to the podcast. It's really good to have you joining us. Let's start with your title. Do you just say, expert in disinformation? So, you know, in terms of my title, it's whatever anyone needs it to be. But I try not to use disinformation as much as possible because of exactly what you alluded to, which is the framing of this whole discussion. So it's good for us to talk about what disinformation is,
Starting point is 00:15:21 but when we're talking about disinformation narratives, we're not talking about the networks and the tactics that are behind the disinformation. And so actually, it's kind of a distraction. I'll give you an example. So if disinformation is an intentionally false or misleading narrative used to deceive or harm, we get focused on the intentionally false part, and we start arguing about like which meme has more truth to it, instead of the second part of that definition, which is used to intentionally harm or deceive. And it's that sort of motive that we should really be talking about. So just to clarify, I guess if we're using the
Starting point is 00:16:05 'mis-', misinformation is really just like a piece of information that wasn't correct. Oh, I saw Brad at the bar on Thursday, and Brad actually wasn't there, it wasn't a Thursday, or it wasn't Brad. That's misinformation, right? Disinformation is me specifically going out of my way to say that I saw Brad at the bar so that Brad's wife divorces him, so that I can take a shot at Brad. Yes, malicious intent. The intent behind the information. Okay. Okay. And then wait, wait, wait. And then what is malinformation? Malinformation is a sort of subcategory. It's supposed to be a false or misleading statement with the intent of harming that has a piece of truth in it. Okay, so if I break that down,
Starting point is 00:16:51 that's like every lie in a relationship, I think. Yeah, it is. Well, I wouldn't know, no lies in my relationship. But I would say that that is often what we're experiencing. And in a way, you know, certain forms of comedy and satire have elements of malinformation, depending on what it's being weaponized towards. So I reject that statement. I reject that. Well, comedy is only truth. You know, it's actually funny when you break these things down because, you know,
Starting point is 00:17:26 Christiana and I, we always talk about, you know, growing up in an African family and, you know, having family in Africa. And you know, Christiana, I'm sure you're the same, but like, I don't know if it was exactly the same in Nigerian communities, but in South African communities, there was a class of grandmother or like older aunt, and they were essentially the KGB of misinformation and disinformation in, like, the village. Like they, I mean everything, they would tell you things, like it was like a wide range. It was everything from somebody's gonna get fired or they are fired or they're, all the way through to like somebody
Starting point is 00:18:08 is bewitching another person and they're using lightning to strike down their family. And this would spread by the way, and you'd be like, wow, this is real. Did you hear that, you know, like that family is now consulting with witches to use lightning against those. So like, is that a big
Starting point is 00:18:26 thing in Nigeria, by the way, Christiana? Is it like, is it also like the grandmothers? Every family has a witch, you know. Oh, yeah. Okay. Yeah. Which I truly believe. As you can see, this is why I'm somebody that's maybe susceptible to misinformation. But something you said earlier that I'm really curious about, because it's funny, the way you describe what you do, and you talk about networks, and you say they, and you say malicious intent, for somebody like myself who maybe spends too much time on Reddit, and around people who are like, not consuming news, but have strong feelings about the news, that sounds conspiratorial. So is there a way you can give us like a really stark example of misinformation in the digital age by these bad actors? Because I'm really curious about like, what does this feel like? What is this in a very concrete way?
Starting point is 00:19:20 Yeah, who is the they as well? Yeah, sure. The they changes, first of all, but I'll give a concrete example. And then maybe I'll give another one. So in 2020, there was a vintage picture of LeBron James showing up in people's newsfeeds. It was a targeted ad. And this ad was being run targeted to voters in the South. And it was run from a page called Protect My Vote, pretty vague
Starting point is 00:19:47 language. And it was claiming that LeBron James had said something that validated the idea that mail-in voting wasn't safe. LeBron James didn't do this. He, in fact, launched an organization to fight voter suppression. But it took this quote out of context and directed voters to a website that said mail-in voting is not safe, under the guise of this organization, with no information about who's behind Protect My Vote, who's funding Protect My Vote. Damn. It was a now-defunct super PAC, FreedomWorks, that was actually propping up this Protect My Vote website.
Starting point is 00:20:25 And it was being targeted to black voters in the South. And there was no data transparency from the platforms sharing with them any information about sort of who was paying for that and who was behind them. So you have a couple different things happening there, right? You have the network that is this sort of like extreme think tank funding this ad that's meant to get people to stop people from voting in a very discreet way. You have the social media company itself, which is allowing this, even though it violates its policies. You have the social media company that has an incentive not to provide transparency with its users on why something's
Starting point is 00:21:03 showing up in their feed, both who's paying for it, where they come from, but also what data about you did it use to tell you that this is relevant for you. Yeah. So that is one example of a they, and it's both the companies that are involved, it's also these people who don't want certain voters to have access to the ballot. So it can be different theys, and that was a useful example because like LeBron was able to say, hey, I didn't say this and ESPN was able to do a push notification, which gives it a far better chance at ever reaching the people who were targeted in the first place than if it had simply been only written up in the Washington Post.
Starting point is 00:21:43 And I like to use that example because it exhibits a lot of the problems. We have a problem both talking about what happened because we only focus on the claim. And then we also usually have a problem reaching the people targeted in the first place because of the way people are consuming information today. But the point is to shift the conversation
Starting point is 00:22:00 away from truthiness and toward the motives behind what's being pushed. I'm curious as to how you do that, because most people are just more concerned with truth. And I say that as, like, you know, black people are always being lied to, right? We're being lied to by doctors, we're being lied to by teachers. Like, to be, I think, to be black anywhere in the world, you feel like you're living a conspiracy. And we have like a long history of things that they said weren't happening and then it's like oh yeah, well that did actually happen. So I think for a lot of people, is this true or not is like a big stakesy question for them rather than who's
Starting point is 00:22:35 spreading the lie, if that makes sense. So how do you shift people away from being like, hey, whether it's truth or not doesn't matter as much, even though emotionally that resonates with you, rather than focusing on like these bad actors that are spreading lies? I would actually say that people aren't necessarily so focused on truth. People don't like to be scammed. And so if you actually think about, you know, what did you see on social media in the past week? How much truth verification were you doing? Our brains are like wired mostly to want a story. So I would say that I don't find it, I don't find my task to be getting people to shift away from caring so much about the truth. I find my task to be getting the conversation off speech and on to
Starting point is 00:23:20 systems. Right, right, right. Don't go anywhere because we got more What Now after this. If you say we're going to put systems in place that open up or regulate what social media companies can do and then also how companies can spend money. It seems like what you're saying is that's where we sort of get to the crux of the problem now. Yes, tech accountability and having some guardrails around how these companies are set up is certainly one of
Starting point is 00:23:57 about a handful of things that could really improve the way we're all experiencing information and the way we're feeling. If you take it to a different place, look at how the companies interact with parents, okay? Social media companies interact with parents and they say like, we've got all these great tools for you.
Starting point is 00:24:13 We've got all these really great tools for you to keep your kids safe on our platforms. But to your point, why aren't they just safe? Why aren't they just safe by design? And they're like, you know, here, so on each app, you have to click a different set of steps, sometimes three, sometimes 12, to get to the parent control settings.
Starting point is 00:24:31 So you're really busy now. If you think about how many kids are on social media, it surely isn't logical that only kids with good parents should be safe from harm on social media. The companies have a vested interest in being like, but look at all of our tools. We're making it so we're on the same page as you. We want safety.
Starting point is 00:24:48 And that is not even about false content. That's about being addicted to scrolling. That's about having exposure to content that might lead you to have an eating disorder or do a viral choking challenge. I mean, it's not even about truth now. Now they can't even say, well, we don't want to be biased. We want to have free expression on here.
Starting point is 00:25:08 This is just now about our kids. And so that's even more sinister when it comes to those actors in particular. Christiana, I'd love to know what you think about this as a parent, but I'm always fascinated at how social media companies will always present themselves as being helpful but also helpless at the same time. They're like, look, we cannot control everything that happens here. They'll make it seem like there's no way they can hone in on a specific idea. But go to their advertising section, pay to create an ad, and you will be shocked at how powerful their tools are. Like they'll tell you, we don't even know who's really using the platform. There's no way we could know where a message goes or doesn't go.
Starting point is 00:25:56 But go to the paid section, you pay, and say, I would like to send this out specifically to black men over the age of 62, and they will be like, yeah, we got you. And you're like, in this area, they're like, yeah, we got you. You're like, in this block. They're like, yeah, we got you, dog. We got you. Don't worry about us. We got you. But I wonder as a parent, Christiana, like, do you feel like the pressure's on you? Like, what are you even going to do? Cause I mean, your kids are young now, but like, are you gonna be a no-phone parent? What are you gonna do?
Starting point is 00:26:32 I'm hoping to be, but I was also a no-screen parent, and my son woke up one day and was like, hey guys, don't forget to subscribe. That was it, his first words in the morning. That tells you I'm not. And he could spell Blippi before he could spell his own name. Blippi is like, he's like a demon, but he's also helping teach my kid about excavators. And like, Obi was like, B-L-I-P-P-I. So like, I think there's what you do in the abstract, ideal version of yourself as a parent. And there's the reality of parents being like overstretched, underpaid, under-supported. And the easiest thing is to, if you have a teenager, here,
Starting point is 00:27:04 go on TikTok. You know, and the easiest thing is to, if you have a teenager here, go on TikTok. You know, it's just like most people, parents don't have the latitude to make quote unquote good choices. But I think my thing is that I still feel so powerless, right, because even if I protect my kid, what about other kids? And in like this capitalist hellscape,
Starting point is 00:27:23 I don't feel like I can take on Meta. I can't take on Twitter. I can't take on TikTok. I feel powerless ultimately, because like even if I do protect my kids, they're gonna go to school with children who have access to material that's inappropriate. And I read somewhere recently that the average age
Starting point is 00:27:41 a child is exposed to pornography is seven. Damn. Right, really scary. And it's so easy. You know, it's like, it's not children seeking this stuff out. When I was in school, the first time I got access to porn was like, I think it was like 14, 15 maybe, which is much better. These kids have it good these days.
Starting point is 00:28:01 Let's go back to the days. You know how hard I had to fight to see porn when I was young? Do you understand the lengths I had to go? I had to go to a store. There were these like what they called cafes, but they weren't cafes in South Africa. And you would go in there and they had like magazines and candy, like bodegas. Essentially it's a bodega. Let's think of it like that.
Starting point is 00:28:22 Corner store bodega. And then they would have, as they say, lewd magazines in the corner. And that was the only place I could see it. And you couldn't even see anything because they had pasted things on the nipples on the cover. But I would just go there and act like I was really interested in Popular Mechanics while staring out the corner of my eye.
Starting point is 00:28:40 And now you're telling me seven years old? Yeah, because it's that easy and it's that prolific. And it's that kids are seeing things that they can't even really process. But then I can't take on big tech. Do you know what I mean? It's like these systems that you're talking about. But wait, can you, Jiore? Can you? Like, we're making these assumptions.
Starting point is 00:29:01 Well, you know, not by yourself. There are some really promising movements of parents right now organizing and in the UK and in the EU, there's been a lot of success and the companies have had massive fines slapped on them out of Europe for failing to meet safety standards. So there is a move in that direction. You know, you can see the Surgeon General in the US has now made this a national advisory that there's a mental health national advisory, there's a loneliness advisory, and social media is playing a role, which has given some heft to attempts in the US to organize.
Starting point is 00:29:35 There's been some state legislation passed as well. So you can, and we can, but I think even shifting the conversation, when parents are sort of feeling like, what do I do and it's all on me, even just introducing that fact helps. Because most people don't perceive the companies as having any role in what they're experiencing, in what's hard about parenting, until it's prompted. You know, they say, it's bad. It's the influencers, it's the advertisers, it's the teachers, it's me, or it's bad parents, or it's, you know, it's not my kid, it's other kids. It's like,
Starting point is 00:30:04 it's so amazing how the companies are able to stay invisible. And that is not just true when it comes to parenting, it's also true when it comes to the companies and others who benefit from this fractured information environment that we're in. There's what parents are going through and what their kids are going through. And then there's what we're all experiencing with the wild way we are forced to consume information today. And when you think about who benefits from that, from us feeling confused and overwhelmed, it again, you can kind of draw a direct line
Starting point is 00:30:35 to people who are trying to concentrate power and make a lot of money. So let me ask you this, then: what do you do to address the concerns of people who feel like this is an attempt to limit free speech? Because obviously, you know, people like Elon Musk, he's the most prominent one now, and for obvious reasons. But he's like, look, look, look, it's not about me and whether I like it or not. But at the end of the day, free speech is the most important thing that we've got.
Starting point is 00:31:01 We've got to make sure that everybody can say whatever they like. And people who are trying to stop this, well, what they're trying to do is they're trying to censor us. And then a lot of people buy this narrative. They go, you are trying to decide who can and cannot say something. And some people will say, yes, even if somebody's going to lie, they should be allowed to lie, because that is free speech. And if we limit people's ability to lie, then we're also going to limit people's ability to tell the truth.
Starting point is 00:31:25 So how do you respond to that? You know, those people are especially loud here in the States, because our First Amendment is something we hold near and dear. In terms of Elon Musk and, you know, him in particular, my next response would be, if you care so much about free speech,
Starting point is 00:31:41 why did you sue the people who criticized you and try to get them to take all the stuff they said about you down? That's like censorship 101. And all the headlines that say, Elon Musk, so-called free speech champion, I'm like, don't put the words free speech and Musk in the same sentence, because people are scrolling and they just see his name and free speech. What we should say is, Elon Musk, censor. Elon Musk, bully. That is what his behavior demonstrates. And when I'm not talking about Musk, I ask people, do you feel in control of what information you consume
Starting point is 00:32:15 every day? Who's in control on that platform? Are you picking the content that shows up in your feed? It's already a curated speech environment. It is not an environment where we had free speech and it went away. We don't have control over what's going on. If we do say something, we don't have control in even making sure that it goes to every single friend of ours. When I post something, it doesn't even go to all my friends. So I think making the conversation about what can exist and what cannot exist, that's an important conversation to be having, because all around the world there are people weaponizing
Starting point is 00:32:52 digital policy attempts to censor people. It's happening, but it's not happening in a lot of the cases where people are saying it's happening. And so I point that out, and people don't feel in control of what they see. And most people, like, I ask people all the time, do you spend too much time on your phone? And they say yes. And then I say, do you blame yourself? Are you having this conversation with yourself where you're like, oh, I'm not disciplined, you know, like I didn't do a good job? You're down on yourself. And I'm like, cool, the platforms make so much money from that whole interaction. You know, it's just not about us or free will. The examples you've used so far have kind of been right-wing, authoritarian, populist,
Starting point is 00:33:34 capitalistic people using misinformation for their gain. And if I'm a person on the right who's kind of libertarian listening to this, I'm going to say, well, are there any examples of people on the left who may be, like, anarchists, extreme progressives, you know, environmental-type groups doing similar things? Like, first of all, do you have examples of people on the other end of the political spectrum using these tools with malicious intent? And if you don't, why does that seem to be the case? So for sure, not even just the extreme left, for sure people on the left and in the center and in all different places are guilty of engaging in some of this.
Starting point is 00:34:19 It's more that there is an entire dedicated apparatus that is behind the MAGA and Trump push. It's all outlined in Project 2025, which I could go on and on about, to use disinformation as just one of about seven prongs of how to backslide democracy, and they're pushing it toward a theocratic authoritarian space. I guess my pushback is, I'm curious about, like, the systems and the theys. Can you categorically say that is not coming from the left?
Starting point is 00:34:53 Because if I'm just being a political animal and I'm on the left, I'm like, okay, we're going to beat them at their own game. We're going to have to do it too. I'm going to have to put up some memes, Mitch McConnell memes. Yep. When they go low, we go lower. Get people to vote differently. Like, I guess I'm just talking about, like, Machiavellian,
Starting point is 00:35:09 you know, just realpolitik in, like, the cruelest and most horrible sense. If we're in this misinformation war, one side seems to be too noble to engage in that, which I respect, but they're losing. And I'm just curious about, like, the calculus behind it, because you're talking about systems and theys. And I'm like, well, people that believe in the truth, aren't they going to have to be as nasty? Or are they just going to have no rights over their bodies
Starting point is 00:35:33 and no freedoms, and their kids are going to have to say a Bible verse in the morning every day, which doesn't sound very fun, you know? The point, Christiana, that you're making is a really good one, and I have bad news and then I have some good news. Part of the bad news is that, let's say the left, or let's say not the Trump-backed Christian nationalists, right? They sort of had this
Starting point is 00:35:58 digital moment with Obama, where they were like doing the magic digital stuff, and it was like, oh, the left is so good at digital. They're so advanced. And then they got pretty lazy, and a lot of those people made money off of, like, the tech of 2008, and they didn't keep innovating. And in the meantime, the right was, like, pissed about having Obama as president. And they were like, we are going to focus. And they started building up, you know, they already had Fox, and they started building up a whole online apparatus of infrastructure
Starting point is 00:36:30 and they started experimenting. They started doing what Facebook was telling them to do, doing these experiments. In the meantime, you have on the left, you know, these people who are just using the same systems they were using for Obama, but tech has moved on. So they were not doing all the things they could have been doing.
Starting point is 00:36:44 And there's a real asymmetry online between the digital apparatus of the right and the equivalent digital apparatus of the left. When I think of what the left did, it was actually like citizens up. It was Me Too. It was BLM, it was like kind of these very powerful social movements, but that seemed like just a response. I saw Trayvon and I tweeted or I went down to Ferguson. But on the right, to your point, it seemed like a machine rather than just like voices of the people type of thing. Totally. And social media, I mean, when the Arab Spring happened, and we were all talking about social media as this democratizing force, it was in many ways because it was empowering
Starting point is 00:37:29 these organizing movements. And at that time, social media was kind of more social than media, right? It was sort of prioritizing social connection groups, being in communication with your friends. And then there were changes that happened at the platforms. And those changes started to prioritize sort of the media side and the information consumption side and having our experience online, not being about going and talking to our friends, but actually how long we'll stay on those platforms. So that shift started happening.
Starting point is 00:37:56 And in the meantime, certain actors were more clued in than others. So a great example, you know, let's talk about BLM. In 2020, you know, right after George Floyd was murdered, if you looked at the keywords online and you looked at who was talking about certain keywords, when it came to the words Black Lives Matter, George Floyd, the conversation was coming from mainstream actors and people in the movement. But if you fast forwarded to about end of July and you typed in any of the keywords being used by the left,
Starting point is 00:38:26 you'd get taken to right-wing content, because they had focused on the repetition and gaming of those keywords, and the platforms were then rewarding that by giving that content preferential treatment in feeds. And so it's a longer game that some of these actors are playing, and the companies have aligned their features to reward that kind of long-game, profit-seeking, power-seeking mode instead of the connection component. And one thing that I know you all have talked about before is the way this plays into, like, what we're feeling post-pandemic, what we're feeling in terms of loneliness and connection to each other. Like, you know, attachment theory is a really sexy thing online, like, you know, understanding your attachment style.
Starting point is 00:39:09 And, you know, I'm not saying I'm into it, I just study it. So, you know, like, I'm certainly not consuming anything, I just need to know about it for my work. So attachment theory, there's this quote from attachment theory, and it's like, human life is best organized as a series of excursions, short and long, from a secure base. Some people claim that, like, liberalism doesn't have an answer for the secure base, and, like, we're all kind of just spiraling without any home to come back to.
Starting point is 00:39:38 Oh, that's interesting. Wow. Yeah. There's no, like, you know, common sense-making institutions anymore. Like church. If modern-day preachers are male podcasters, Trevor included. Sounds like Trevor. Trevor says this stuff all the time. These podcasters, you know, sort of today's pastors, preachers, whatever, are basically filling the role. We don't, like, go get donuts and talk to each other afterwards, so there's no, like, common
Starting point is 00:40:04 sense-making space and connectivity. And the platforms, I think in all of their marketing, they sort of portray themselves like this place for connection and this secure base place. But actually they're like an excursion that never ends. They're like an excursion that you can't get off. And they're contributing to, in my opinion, the lack of like secure base community connection, hub, home, belonging,
Starting point is 00:40:28 the stuff that makes us feel warm and better and living life in a way that we want to continue living it. All of that is getting eroded. All of this just plays into how we're feeling and takes advantage of a lot of the void we feel, especially post-pandemic. We'll be right back after this. Okay, so two follow-up things. Please give me the good news, because that made me feel very afraid. And the second thing is,
Starting point is 00:41:06 for the people at home, I don't know if you know, but the right have been saying that they're being shadow banned, and they're saying that these social media platforms are suppressing their content. My question for you is, is shadow banning another form of misinformation? Because now I'm very confused because I'm like, well, they have this sophisticated infrastructure,
Starting point is 00:41:27 but they're claiming that the social media sites don't reward them, but you're saying empirically that's not true. I'd love your take on that. Great, I'm going to start with shadow banning. So shadow banning is another way to say my content's being throttled and sort of not going as far as I want it to go. And at each platform,
Starting point is 00:41:47 they have different language for what makes this happen. It happens to a number of creators. The reason the right is saying it in this moment is because right now, especially on Meta, Meta has made a bunch of changes to throttle and depromote all political content. So all political content on Instagram, for example, you have to opt into getting political content, and you're automatically opted out. And so they're mad about that, because their content's political, and they enjoyed actually a lot of free amplification for a really long time. During COVID, the companies were struggling. They were trying to figure out how to not direct people to sources that were not verified. And so a lot of actors, including the right, were saying they were being censored.
Starting point is 00:42:38 And in a lot of cases, that is an example where that was true. I think there are two things to consider, you know, when trying to dissect those arguments. Number one, I think we actually do have to come back to religion to develop an understanding of how that part of the system works. So one of the main things that is necessary in a religion, and you see this oftentimes abused in churches or by pastors who are trying to scam their populations. There are many good pastors, many good churches, but you'll see it when it's abused. Basically, you convince people that they are victims regardless of their station or their situation in life. And so an organization that is, you know, right-wing and making millions and millions will still say
Starting point is 00:43:29 they're being censored and nobody can hear them. I mean, like, Trump is a great example of this. While he was president, he was the most powerful man in the world. And yet he complained like he lived on the bottom floor of an apartment building where everyone was stomping on top of him at 2 a.m. I think that that's part of the appeal and the allure, is that they're selling victimization. They're going, no, you are oppressed. I know you have money, but, man, they're coming for you.
Starting point is 00:44:01 And I know that you have a great job, but they're coming for you, and all these things. And it's interesting, you know, Jiore, when you're talking about what they put in our feeds, I wonder if we'll one day get to a point, and maybe Europe is inching towards it a lot faster, but I wonder if one day we'll get to a point where we start to think of social media and digital media as a whole, even television.
Starting point is 00:44:27 We start to think of it the same way we think of food, where we ask, what are we allowing companies to put into us? Because we will fight for that as human beings when it comes to food. And again, ironically, like, the parallels between America and Europe are apparent even in the foods. You know, there are foods that exist in both places, but in Europe, they're like, hey, you can't put this chemical in Oreos. We don't care what it does to them. You can't put it in. And then in Europe, Nabisco will be like, all right, we're going to have to figure out another way to make these biscuits tasty or make these cookies more delicious. But in America, unfortunately, we've seen that companies have a certain amount of latitude
Starting point is 00:45:09 and they're given free rein on us. In all of this, by the way, whether it's social media or food or anything, I find it particularly interesting that the same analysis isn't applied to drugs, right? Because drugs aren't part of the capitalist system, like, the government doesn't have a hold on the drug industry, on cocaine and all of these things. So then they speak about it differently. So they go, these drugs are killing our kids.
Starting point is 00:45:37 These drugs are killing our communities. People are getting addicted. And you go like, well, why don't you legalize drugs? They're like, no, you can't just legalize it because if you do, people could get addicted and they won't know when to stop. And I'm like, ah, interesting. So what you're saying is whether or not people have self-control, you also have to acknowledge the ability of the actual thing that they're consuming to affect
Starting point is 00:45:59 their ability to live a healthy life. And so couldn't you then apply that to food or as you're saying to social media? I think the way you framed it really brings it back around to this idea of where we put our focus. Are we focusing on whether or not a singular food, social media, a singular tweet, a singular story is true or false or healthy? Or are we thinking about the system that we put in place
Starting point is 00:46:24 and how we hold the companies accountable in terms of what they're allowed to do and what they're allowed to put in and how opaque the system is? Yeah, I mean, totally. We have screen time counters on our phone. And I don't think any of us feel like we're in control of that screen time number. Like if we all looked at our phones at the screen time, it would be higher than we want. I'm sure, well, I won't speak for you. You all, I'm sure, are totally regulated. Can I tell you how bad it's got for me?
Starting point is 00:46:52 This is not an ad. I bought a thing called a brick. I don't know if you guys have heard of it. Those things are interesting. Yeah. Yeah. I bought a brick because I was like, I'm out of control. As Trevor has explained, I'm reading about what's happening in Mozambique.
Starting point is 00:47:05 I don't know anyone there. Up at 3 AM, I'm reading about Mozambique. I'm like, even the screen time, when it's like, you've reached your max for Instagram, I'll be like, cancel, and I'm still scrolling. So I had to buy the brick. When you want to go back to the apps, you have to unbrick it.
Starting point is 00:47:20 There's something about the psychology of it, like, oh, you lose if you unbrick it. So that's the thing that's kind of diminished my time on Instagram and Twitter. And TikTok is the worst, because it just knows I'm a crunchy mom of a certain age with certain anxieties, and it just feeds me all the things that keep me on. But yeah, I feel like these social media apps should come with cigarette-style warnings.
Starting point is 00:47:43 When you see a pack of cigarettes and it says, can cause cancer, I wish the app would give me that reminder every four scrolls or whatever, because it feels like there's no way of getting off them, personally. Yeah. It comes back to, one, these apps should be designed in a way that doesn't really hurt us as a society. Okay. So they were not always designed to keep us on for longer. They used to be optimized for other things, mostly collecting our data, which is not, like, a totally great alternative,
Starting point is 00:48:18 but it's not keeping our eyes and attention on those apps, and you can think of all the ways that's hurting us in a disconnected society. So we can do better. They can be required to optimize toward other things that are not that. That's part of the good news. The other part of the good news is that little competition you alluded to, you and your brick, like, you know, it is the brick, and am I going to open the brick or not? People don't like to be scammed, and people do like to sort of at least feel like they have control over their own lives. So what I've found a lot of success in is talking to people
Starting point is 00:48:57 about, like, make what you consume earn your attention. Make them earn it. Whether it's, like, a funny dog video, make sure it's really funny. If, like, you didn't think that video was that funny, get off the app. Make them earn your attention. If someone tells you that something just happened in Mozambique, make them earn it. It better be a really good story with really good reporting, with really clear sources, and you better know who's funding that outlet.
Starting point is 00:49:30 they're coming for the bad mood you're in, and you better make them earn it, because they are making money. They are making so much money, and all you need is to see the picture of Mark Zuckerberg riding his fancy surfboard water thing. Just have that in your head. He's out there on his surfboard, and you are in a bad mood and stressed and don't know what to believe and overwhelmed and not spending time with your kid who needs help with their homework, and maybe you're driving while distracted.
Starting point is 00:49:57 Like, come on, you know? And I have found that that, like, make-them-earn-it message is so much better than trying to be like, okay, in order to detect an AI-generated image, you're gonna look for the funky ear. Like, that's just never gonna work. That just sounds like hard work. Instead, I really do encounter my information as, like,
Starting point is 00:50:15 you know, as decided by me, how much attention should I be giving this? And that includes traditional media. They've got to earn it too. You know, when you were saying earlier, Christiana, I like Reuters, I like AP. What I would encourage you to do is tell people why. You have a background and so you know what kind of reporting standards they have. But most people, we are not learning about what kind of standards even exist.
Starting point is 00:50:39 And so I encourage people to tell them, why do you believe what you believe? What are you using to get someone to earn your attention? So that has worked, kind of, enough. It's working. We're hoping. So that's sort of good news. As humans, we don't like to be scammed. We don't like to be taken advantage of.
Starting point is 00:50:58 We don't like to be profited off of. And that is the story here. I like that. I like that on a macro level and on a micro level. I think it's wonderful to say to people, keep pushing politically, try and get to the point where these companies and these organizations are regulated. And Europe has shown us that it's possible, by the way. So it's not an impossible task.
Starting point is 00:51:20 I like this one a lot. Make them earn it. Yeah, I love that. Make them earn it. I love it. Okay. Like, look at the big guys who are making all of this money and trying to control how we live. And you know what this reminds me of, actually? Something I just learned recently. If you've got a problem with mosquitoes, you shouldn't try to kill all the mosquitoes that are coming into your house. What you need to do is get, like, a special type of trap
Starting point is 00:51:49 that lures the mosquitoes in, in particular not just to kill them, but to get them to lay their eggs. And when they've laid all of their eggs in there, it kills all of the eggs, and then the ecosystem of mosquitoes around your house dies. And so you may get the occasional mosquito, but because you've gone for the system, not each little misinformation or false story, you spend less time doing that.
Starting point is 00:52:15 So yes, this has been really great. Thank you so much, Jiore. Hopefully one day we're having this conversation and you're like, oh, I have nothing to do, because we were successful in our quest. We eliminated lies. I can focus now. I can focus on my attachment theory and relationship content, which I definitely don't need. This is wonderful. Thank you, Jiore. What Now with Trevor Noah is produced by Spotify Studios in partnership with Day Zero Productions and Fullwell73. The show is executive produced by Trevor Noah, Ben Winston, Sanaz Yamin and Jodie Avigan.
Starting point is 00:52:58 Our senior producer is Jess Hackl, Claire Slaughter is our producer. Music, mixing and mastering by Hannes Brown. Thank you so much for listening. Join me next Thursday for another episode of What Now?
