On with Kara Swisher - The Fog of War: Navigating Disinformation Now

Episode Date: October 23, 2023

Social media has been inundated with disinformation about the Israel-Hamas war — from a flood of graphic visual content, to unsubstantiated claims and opportunistic content generation (and monetization) by third parties to this conflict. To make sense of this fog of war, we turn to a panel that brings together a reporter, a researcher and a former Facebook/Meta insider: Shayan Sardarizadeh is a senior disinformation journalist with the BBC, Renée DiResta is a research manager at the Stanford Internet Observatory, and Katie Harbath spent 10 years as the public policy director at Facebook. Together, they unpack how we got here – and how we might seek clarity in a moment fogged by intense emotion, unfolding information and immense complexity. Questions? Comments? Email us at on@voxmedia.com or find us on social media. We’re on Instagram/Threads as @karaswisher and @nayeemaraza

Transcript
Starting point is 00:00:00 Support for this show is brought to you by Nissan Kicks. It's never too late to try new things, and it's never too late to reinvent yourself. The all-new reimagined Nissan Kicks is the city-sized crossover vehicle that's been completely revamped for urban adventure. From the design and styling to the performance, all the way to features like the Bose Personal Plus sound system,
Starting point is 00:00:23 you can get closer to everything you love about city life in the all-new, reimagined Nissan Kicks. Learn more at www.nissanusa.com slash 2025 dash kicks. Available feature, Bose is a registered trademark of the Bose Corporation. Support for this show comes from Constant Contact. If you struggle just to get your customers to notice you, Constant Contact has what you need to grab their attention. Constant Contact's award-winning marketing platform offers all the automation, integration, and reporting tools that get your marketing running seamlessly,
Starting point is 00:01:02 all backed by their expert live customer support. It's time to get going and growing with Constant Contact today. Ready, set, grow. Go to ConstantContact.ca and start your free trial today. Go to ConstantContact.ca for your free trial. ConstantContact.ca. the Vox Media Podcast Network. This is On with Kara Swisher, and I'm Kara Swisher. And I'm Naima Raza. Kara, you sound under the weather. This is my October cold, which comes into a November bronchitis, and then maybe a December flu. When you have toddlers, this is what happens.
Starting point is 00:01:56 Yes, tis the season. And when you work with Kara, this is what happens. I think I'm just at the beginning, just at the early stages of this. Thank you, Cara, for our Monday hangout. Thank Clara and Saul and their preschool class. It's true. On Thursday, we had Christiana Ompour on the show to discuss the conflict in the Middle East. And this is obviously a hot war, but it's also an information war. And that's what we wanted to talk about today, that organizations and people are trying to garner support for their cause. There's a flood of information, and there's also a flood of disinformation that's entered the sphere.
Starting point is 00:02:26 Yeah, every war has information battles within it, whether it's the radio for the Nazis or, you know, television in Vietnam. That certainly played a big role when people saw the pictures coming over. And this one's even sort of more problematic because you don't know what's real and what's not, and you don't know where it's coming from. And people also are making money from it. Yes, the monetization is a real issue. And also just any war presents opportunity for opportunists. So you see this in physical conflicts where people come in for arms, and you're seeing here a number of Islamophobic bot accounts of Indian origin coming on, as well as Chinese and Russian propaganda here
Starting point is 00:03:05 trying to wedge and monetize. It really just is propaganda, just writ large and using AI tools and all kinds of digital tools. And the problem is most people are now getting their information right now. And it can be used in all kinds of ways with video from previous things, with genuine video from another conflict that is used here. There's even people on TikTok who are not in the region who are putting on helmets and pretending they are. It's really quite sick in a lot of ways, and at the same time makes perfect sense. Even in this hog of war over the hospital strike, the conversation all of a sudden moves to this one thing. It moves to this very focused conversation about the hospital,
Starting point is 00:03:44 and it's hard to broaden out. It is, especially because there was some real things that happened, real things, which is including trying to get aid in there, which is, I think, that's really important. And so, look, this is, it was a tragedy, absolutely, but what it becomes is not the actual thing. It becomes an argument over the thing. And it's not really about people losing their lives. It's about, you know, it's about scoring a point for your team. And that's really, really not the point. It's that people died. And not just that people died in this one blast, but people are continuing to die. And if we just focus on trying to understand one thing, any one item of this war, we can actually be, you know, completely confused. It's like all social media.
Starting point is 00:04:29 It's a distraction. And again, I don't mean to minimize it, but people do this, go down these alleys on every topic known to man. Some of them are very benign, like cooking or whatever, like how to cook this particular thing. But some of them, really, people go down these alleys, and it's designed to make you crazy, and that's what it does. Yeah, and it's also a time where there's so much disinformation and so much awareness of disinformation that this phenomenon, the liar's dividend, occurs. If anything can be fake, then nothing can be real, and so anything can be disregarded as fake. Right, which is the whole point of this, is to create, you know, it's called the fog of war in the old
Starting point is 00:05:04 days, and now it's really, the fog of war in the old days, and now it's really, really foggy. And we should mention here how the platforms have changed dramatically in the last year, what pressures they're under. They've pulled back quite considerably. They've had such a hard time. And listen, let me give them one thing. It is really hard, but this is the business they're in. They have to take responsibility for what's happening here, and they just have never wanted to. And they have more tools, for sure, but they've cut back on staff. But it's become clear it's a flood of information, not just by real people, but by bots. And again, this genre of AI has lifted this into a quantum level of difficulty to deal with.
Starting point is 00:05:41 And they really don't have the intent to. They pretend they're not media companies, but they are. And media companies which make mistakes, look, a lot of media companies have made, traditional ones have made mistakes here, but they self-correct pretty quickly and get better. Meta had to apologize after inserting the word terrorist into the translation of bios of some Palestinian Instagram users. They said that was an error of auto-translation. They have issues all the time, but they're really having problems here, as reported by the BBC and by The Guardian and others. And then per reporting from Mike Isaacs of The New York Times and others,
Starting point is 00:06:13 thousands of users posting pro-Palestine content have also reported that their posts have either been suppressed or removed, even when they're not in violation of the platform standard. Yeah. Mehta, of course, is under extreme pressure here and had a response to this issue. They had a typical Facebook statement, long and confusing, but they said the part that's
Starting point is 00:06:32 important is we apply these policies equally around the world and there's no truth to the suggestion we're deliberately suppressing voice. We can make errors and that is why we offer an appeals process for people to tell us when they think we've made the wrong decision so we can look into it. I mean, that's like the alley to end all alleys. I don't describe them nefarious things. It's usually that they can't handle it. Yeah, they're also under extreme pressure from the Europeans on the DSA to get their hands around disinformation.
Starting point is 00:06:57 They're trying to avoid, I presume, violent content being on the platform. I've asked whether it's possible that those policies have disproportionately affected certain groups unintentionally, and we're unable to get specific answers to those questions. You know, everybody has a problem with these platforms because they don't do a good job. That's really the situation. And so that's going to always come up as a problem because they're inadequate to the task, which I've been saying for about a decade now, but we'll see. But we brought on a panel of great guests today to speak to us about this and help make sense of it. Renee DeResta is with the Stanford Internet
Starting point is 00:07:32 Observatory. She's been researching how disinformation spreads online for years and has previously worked with the Senate Select Committee on Intelligence, as well as Congress and the State Department. She's also under attack by a lot of mostly Republican groups about the work she's doing. And it's part of a broad pattern of trying to chill academic research into the area. Katie Harbeth is our second guest, is a former Republican strategist who spent a decade at the public policy team at Facebook, now Meta, where she led the team that managed elections on the platform, I think, till 2021. She also has her own policy advisory firm called Anchor Change. And our final guest is Cheyenne Sardarizadeh.
Starting point is 00:08:09 He's a senior journalist at BBC Verify, where he covers disinformation, conspiracy theories, and extremism. He has not slept a lot recently. He's someone whose work I've followed for a while. It really takes an international lens and has helped debunk misinformation, disinformation everywhere from Iran to Europe and has a very global perspective. Yeah, a lot of media companies now have disinformation reporters because it's almost, it's impossible to keep up. He never probably speaks because there's always an issue
Starting point is 00:08:34 that you have to look into. We'll take a quick break and we'll be back with the panel with Renee, Katie, and Cheyenne. Fox Creative. This is advertiser content from Zelle. When you picture an online scammer, what do you see? For the longest time, we have these images of somebody sitting crouched over their computer with a hoodie on, just kind of typing away in the middle of the night. And honestly, that's not what it is anymore. That's Ian Mitchell, a banker turned fraud fighter.
Starting point is 00:09:17 These days, online scams look more like crime syndicates than individual con artists. And they're making bank. Last year, scammers made off with more than individual con artists. And they're making bank. Last year, scammers made off with more than $10 billion. It's mind-blowing to see the kind of infrastructure that's been built to facilitate scamming at scale. There are hundreds, if not thousands, of scam centers all around the world. These are very savvy business people. These are organized criminal rings. And so once we understand the magnitude of this problem, we can protect people better. One challenge that fraud fighters like Ian face is that scam victims sometimes feel too ashamed to discuss what happened to them.
Starting point is 00:09:58 But Ian says one of our best defenses is simple. We need to talk to each other. We need to have those awkward conversations around what do you do if you have text messages you don't recognize? What do you do if you start getting asked to send information that's more sensitive? Even my own father fell victim to a, thank goodness, a smaller dollar scam, but he fell victim and we have these conversations all the time. So we are all at risk and we all need to work together to protect each other. Learn more about how to protect yourself at vox.com slash Zelle. And when using digital payment platforms, remember to only send money to people you know and trust. Support for this show is brought to you by Nissan Kicks. It's never too late to try new things. And it's never too late to reinvent yourself.
Starting point is 00:10:47 The all-new reimagined Nissan Kicks is the city-sized crossover vehicle that's been completely revamped for urban adventure. From the design and styling to the performance, all the way to features like the Bose Personal Plus sound system, you can get closer to everything you love about city life in the all-new, reimagined Nissan Kicks. Learn more at www.nissanusa.com slash 2025 dash kicks. Available feature. Bose is a registered trademark of the Bose Corporation. Katie, Renee, Cheyenne, thank you all for joining me. Thanks for having us. It is on. the first weeks of the Ukraine war, and why you think that is. Katie first, then Cheyenne, then Renee. To me, it seems like more, particularly given the coordination Hamas had on social media
Starting point is 00:11:52 and ready to be pushing out their own videos and images of this, plus all of everybody kind of jumping on it of adding videos and images from past conflicts and other things. So it just seems like from what I'm hearing from others that the the volume is more, though Rene and Cheyenne may have different opinions on that. Cheyenne? I think it would be difficult to, because, you know, I don't have all the sort of data from both conflicts to be able to compare and say, well, this one was more, this one was less. I think what I would say is, it's definitely been overwhelming, because I was doing the exact same thing in the first few days of the Ukraine conflict and there was a ton of misleading videos and images and it's not just by the way visual posts, it's also completely unsourced,
Starting point is 00:12:36 unevidenced claims. When something breaking is developing and in this case being a war, that can actually have consequences. So I would say it's been quite overwhelming. I would say probably more or less similar to the Ukraine war. I can't definitively say which one was more, but both of them were bad enough, basically. Renee? I'd say we don't know. Reason being, we don't have Twitter API access anymore, right? Right. Explain what that is for people who don't understand that.
Starting point is 00:13:07 Yeah. So back in the olden days, there was really good researcher relationships between academic institutions and Twitter, and we had access to what we might call the firehose, right? Various types of firehoses. I'm not going to get into the details, but ways that we could build tools, create dashboards, and just ingest data directly from the company. And because it was an academic project, we didn't have to pay for it. The kind of data access that we had now costs over $42,000 a month. So a lot of academic institutions have backed out of observing Twitter, which means that our focus has really been on Telegram.
Starting point is 00:13:42 That's not entirely bad, right? That's where a lot of the content that is for the impacted populations, they're not necessarily sitting on Twitter. So spending our time on Telegram isn't a bad thing. Elon very significantly changed curation. So there's what's visible and then there's volume and those are not necessarily the same things. What's visible is really decidedly different. And I think maybe Cheyenne would agree of often a very significantly worse quality in terms of accuracy. Cheyenne would agree of often a very significantly worse quality in terms of accuracy. Cheyenne, we've seen video games and TikToks
Starting point is 00:14:10 from old concerts repurposed as misinformation online. We've also seen rumors, as you just noted. Give us a rundown of what you think are the five most viral pieces of dis and misinformation and tell us what you know about them. Obviously, a lot of attention was given to the unsubstantiated claims that 40 babies were beheaded, but I'd like you to pick out your own. I would say that the sort of the most viral stuff that I've seen, that I've been logging in the last
Starting point is 00:14:36 two weeks has been basically video that is either unrelated to what's been going on in the last two weeks underground in either Israel or Gaza, you know, could be from the past conflicts. Obviously, Israel and Hamas have been involved in several conflicts just in the last 10 or 15 years. So either from those past conflicts or from the war in Syria or from the war in Ukraine or from military exercises. In the case of TikTok, actually, it's become really, really fashionable now that when a conflict happens somewhere, you just say you're running live streams of that conflict.
Starting point is 00:15:06 And you either use video of past conflicts or you use a YouTube video of military exercises and actually make money off of it. And put on a helmet. I've noticed some people are not there putting on helmets, correct? Absolutely. And actually make money off of it. That's the important bit. So most of the stuff that I have seen, and I've seen stuff from both sides, by the way. It's not been one-sided at all.
Starting point is 00:15:26 I've seen claims from both sides, from both directions, also supporters of both parts of this conflict, sharing all sorts of completely untrue material. And the most important thing for me personally is that this is not fringe stuff, and this is what people need to know. Meaning what? We're not talking about stuff that is being shared by 50 people or 100 people and, you know, 100 retweets, 200 likes. We're talking about material that's been viewed tens of millions of times on platforms like X, formerly Twitter, TikTok, YouTube, Facebook, Instagram. You know, we're not living in the 1950s and 1960s anymore. People these days don't necessarily sit in front of a TV and watch their sort of nightly bulletin to find out what's happening around the world. They go on the internet, they go on social media, they look at their feeds.
Starting point is 00:16:09 They want to get updates constantly, particularly when there's an event of this magnitude. So quite a lot of the visual evidence that they've been getting, unfortunately, online in the last two weeks has been completely false. There's also been quite a lot of video that's been shared that's actually genuine and from the last two weeks has been completely false. There's also been quite a lot of video that's been shared that's actually genuine and from the last two weeks and has been helping us, journalists who want to investigate what's going on, for instance, with what happened at the hospital two nights ago. Right. We'll get to that in a minute. Everything that we've done has been based on footage that's been shared online that is genuine.
Starting point is 00:16:38 But the point is you have to verify first that that footage is genuine. And then while you're doing verification, quite a lot of stuff is actually untrue. And you see, you know, a piece of video that is from the Arma 3 video game, which is a military simulation video game, has 4 million views on TikTok. So Katie, there's talk of this being a TikTok war. That's according to Bloomberg, showing a new role for the platform, which has more moderation than the other platforms. Now, of course, you worked at Facebook, now Meta for 10 years. Explain the platform shift and if and why it matters. Well, I think first and foremost,
Starting point is 00:17:09 it's that for many, many, many years, platforms like Google, Facebook have built up their defenses in trying to be able to find some of this content. They're still having a lot of challenges as well in terms of doing that. And as Cheyenne was mentioning
Starting point is 00:17:23 and Renee too, verifying this content and working with fact checkers to verify it and then to decide as well in terms of doing that. And as Cheyenne was mentioning and Renee too, verifying this content and working with fact checkers to verify it and then to decide whether you're going to de-amplify it, are you going to take it down? Who's doing it? What's their intent in sharing it is also a very hard thing to do. And so a lot of these newer platforms are having to grapple with a lot of the questions that some of these legacy platforms have already kind of worked through and spent the years of refining their policies and their algorithms and everything to find. Cheyenne, I do want to ask you about how you identify what is disinformation. How do you account for that? Well, I think the first and most important thing is to clarify,
Starting point is 00:17:58 me and my colleagues, we only go for content that is viral. And then when we have a piece of video or an image or basically just a post online that is a claim that is either inc And then when we have a piece of video or an image or basically just a post online that is a claim that is either incendiary or has implications, what we want to know first of all, when was this piece of video filmed? Where does it come from? Who's the original source? Can we actually source the video to the person who filmed it, the platform that way it was first shared? Because obviously you put something on Telegram or on WhatsApp, then it travels across platforms. So just because you put something on Telegram or on WhatsApp, then it travels across platforms. So just because you see it on TikTok or on Instagram
Starting point is 00:18:27 doesn't mean that's where it came from. You have to source it. You have to find out who filmed this piece of footage, who first posted it online, and then you have to contact them and talk to them because they probably have more context. Then the second thing is, is this actually the entire footage? Is there a longer version?
Starting point is 00:18:43 Have other people been at the scene where this video was filmed? Meaning trying to manipulate it to look. Well, exactly. Has it been edited? And then the next thing is, is this actually current footage or is it old? So we have to go online and look on platforms like YouTube, Instagram, TikTok, you name it, and try to find... You do reverse image search, correct?
Starting point is 00:19:04 Yes. So we take screen grabs of pieces of video with images. You don, you don't need to do that, the image is there, you can just reverse search it. But with video, we take 5, 10, 20 screen grabs of a piece of video, and then we go online on several reverse image search tools, including, you know, Google, Yandex, Bing, and then we try to find whether there are other examples of this video shared online in the past. Then if it is from these past two weeks, then you then want to investigate it properly and find out what it actually shows, who actually filmed it, where it came from,
Starting point is 00:19:35 because sometimes you have genuine pieces of video that are either edited or deceptively manipulated or taken out of context. So talk about the sources of disinformation on both sides and where they're coming from, because a lot of other actors have also gotten involved. There's a lot of anti-Palestinian disinformation and generally more Islamophobic content around this conflict coming from India. So where are the sources? I would say the vast majority of misinformation that I've seen
Starting point is 00:20:03 has come from people who seem to have nothing to do with the conflict directly. So it's just people online who are farming engagement, farming followers, farming influence, and in some cases trying to put as much outrageous, shocking content as they can to make money off of it. When it comes to two sides of the conflict, being the government of Israel and Hamas, we expect the two sides involved in a war to actually try, because wars these days are not just fought on the ground. There's an information war as well that you need to win. So we expect the two sides to try to put whatever they can online, regardless of whether it's factual or not, to win the information war. So that's expected. And also for people who live either in Gaza or in Israel, again, because of all the atrocities that have happened in the last two weeks, you expect them to be emotional and obviously taking, you know, taking a side.
Starting point is 00:20:51 Yeah, having opinions, etc. You absolutely expect that. And people, you know, obviously people have seen horrific stuff. So, you know, I don't pay too much attention to somebody who's emotionally affected by this conflict putting something out that is misinformation. somebody who's emotionally affected by this conflict, putting something out that is misinformation. What is important to me is people who are not directly related to it, say somebody sat in America or sat in Great Britain or in China, and they're posting content that, in my view, is just for getting influence and engagement online. Now, apart from that, there's also more sort of nefarious misinformation or disinformation that is put out for political
Starting point is 00:21:23 gain. One good example of it, last week, there was a video that was posted online that quite a lot of people saw that had the branding, logo, and style of BBC News. This one was actually fake, 100% fake. We didn't produce it, but it looked genuine, and it said that we had reported that the weapons that Hamas militants used on the 7th of October had come from the government of Ukraine, or was weapons that had given to the government of Ukraine by Western powers that had been smuggled out of Ukraine and ended up in the hands of Hamas. There's zero evidence for it. We have not reported it. And then Dmitry Medvedev, who is the former Russian president, put out the same baseless claims online. So you have to think, why would someone go through that effort to produce a fake
Starting point is 00:22:04 BBC video to say, well, Hamas militants got their weapons from the government of Ukraine that has nothing to do with this conflict? Yeah, yeah. Why would they do that? I wonder, Cheyenne, I wonder why they would do that. I don't know. Maybe I have some ideas. I want to get into one specific attack, the blast at the Al-Ali Baptist Hospital. None of us are experts on the airstrikes, but the blast and questions around it are metastasizing online. Katie, you've worked on elections where anyone can build whatever narrative suits their purposes. Is this common, what's happening here? It is, but I think one of the things that I'm seeing that's different is also just the confusion amongst mainstream news organizations in terms of,
Starting point is 00:22:46 you know, the New York Times had a headline that I saw that, you know, initial reporting, trying to be first at this. And that helps to add to the confusion of what Cheyenne was saying was then when people are making fake videos of this, of using the branding and stuff of these news platforms, it continues to contribute to people just not trusting and not knowing what is true because we're all trying to figure it out in real time. And so that makes it much harder to verify what is or is not true from a social media company standpoint. But I think everybody, what you choose to amplify or not amplify and trying to figure
Starting point is 00:23:23 that out while it's all happening in real time. To me right now, this just feels like it's just coming faster and higher volumes and more things happening that it's just a lot more facets to have to deal with than what I've necessarily seen in a particular election situation, unless it's something like January 6th. But even then, this feels like even higher stakes because the amount of gruesome images. And that also has an impact on the people trying to moderate this, trying to cover this.
Starting point is 00:23:52 There's an emotional aspect to this as well and a burnout that continues to happen. This continues to go on longer and longer. Yeah. So, Renee, how do you look at this? Is this common from your perspective? You've seen hundreds of these over the years, I would assume. We have. I think what I would say is most different is the widespread democratization of generative AI.
Starting point is 00:24:15 That's what's really different here, right? That was what wasn't the case in February of 2022 during the initial Russia-Ukraine invasion. There was, you might recall, there was a lot of, you know, there were rumors, there were stories, you might recall Snake Island, right? So, if you remember Snake Island and the way that that was reported and then they hadn't died, but, you know, you had a, they had been taken hostage, but there was a whole narrative that was built around. It was used for heroism, yeah. Right, the heroism narratives, right? So, you do see the governments come into play, right? The Ukrainian government was quite good at war propaganda in the early days, really just kind of like riling people up about that. I followed that story. I remember that with something like this, though, what is really distinctly different now is the liar's dividend piece of this where content that is real, you can say is faked because of the existence of the technology to fake it. Right. And so it's this question of, you know, I see it as like the kind of collapse of consensus reality.
Starting point is 00:25:11 Right. You can pick which ones you're actually going to trust. You can pick which ones you're not going to trust. You can dismiss all the rest of it as like, oh, that's AI generated. And that's really, I think, for me, you know, I did follow actually the beheaded babies rumor quite closely, right? Again, because we were in the first responder telegram channels and, you know, you see images in there and you're like, well, it's not a, you know, there were very specific claims that were made, 40, right? That's a very, very specific claim. There's no evidence of that, right? You know, and so you're trying to piece together, well, something clearly happened here. Like, here's this image.
Starting point is 00:25:44 trying to piece together well something clearly happened here like here's this image and um and then you know of course the the government tries to put out something showing like here are a few instances of this happening no it wasn't 40 but here is this here is this image and then what wound up happening was um ben shapiro tweeted retweeted the image and if you all kind of followed this micro controversy that happened where state of is, you know, official channels put out one of these images and Ben Shapiro shares it, right? Yeah. Saying this really did happen. Look at this. Huge following. And he has a massive following. And then people who didn't like Ben Shapiro, I think actually also on the right, which was sort of funny, went and took the image and ran it through AI or not.com. And those detectors are not particularly reliable. A lot of the time,
Starting point is 00:26:24 there are a lot of false positives and false negatives. And so what wound up happening with that was there was a pixelated section of the image where they had blurred out a logo or something and the AI or not.com initially, you know, returned AI generated. Now, if you did a closer crop just of, you know, unfortunately the kind of majority of the content, right, the sort of deceased person, then it returned that it was not AI, right? So if you had this one section in there. So this was the kind of example of then people. Yeah, everybody's an expert. Everybody's an expert.
Starting point is 00:26:57 Yeah. And it turned into a whole thing. Oh, Ben Shapiro shared an AI generated image. You know, oh, the government of Israel faked an image of a dead baby. Right. generated image, you know, oh, the government of Israel faked an image of a dead baby, right? You know, and so there's like all of these different channels of conversation around this where depending on who you trust or what you trust. So even in trying to explain it, you can get caught in the thing, especially if you have a feeling about Ben Shapiro. In any case, Cheyenne,
Starting point is 00:27:21 your news organization has published an explainer on the blast, moving back to the blast, via BBC Verify, which is updated continually, which I appreciate. Explain what it is and what's concluded so far, but will it make any difference at all? Well, yeah. Yeah, I mean, since that blast happened, we've been working sort of nonstop, sort of gathering every piece of video that we can and all the images. And we've been analyzing them. Well, I've slept like, I think, three, four hours since that blast happened. We've been trying really hard to get to the truth of what happened because dozens and dozens of people have died. We don't know the exact number. Obviously, the Palestinian authorities say about 500.
Starting point is 00:28:04 You know, that's their claim, but clearly, dozens of civilians died. So every piece of video that we could, we've gathered. We've tried to analyze them. We've tried to geolocate them, make sure where this piece of video was actually shot. We know the blast from the live feed that went out on the Al Jazeera channel, and it is 100% verified that the blast happened at around 1859 Gaza time. So we've tried to match that with every piece of footage that we've got, to make sure that the footage being shared online was taken at that time and definitely shows the hospital, footage of which we obviously have from satellite imagery, and then try to find out what exactly happened. And we've contacted something like 20 different experts, you know, with different views,
Starting point is 00:28:48 different ideas. We've showed them not just the videos that were published at night, because in some of them, you know, obviously you can't see everything, but also yesterday morning there were tons of images and videos that came out. We know that the blast happened at the courtyard of the hospital. It didn't impact the main building.
Starting point is 00:29:12 So whatever it was, it was in the courtyard of the hospital, in the car park. So there's a crater there that was in some of the images and videos that we saw. It was a small crater. And also there are signs of the impact of the blast on the cars and on some parts of the hospital, on the windows. So we showed all of that that we had independently verified to experts. You know, at the moment, what we've been told, and obviously more evidence will come out, we also had a BBC reporter on the ground who's been to the scene and spoken to eyewitnesses. At the moment, it seems inconclusive. From what we've been told by not all, but most of the experts who have spoken to us, it doesn't at the moment seem consistent
Starting point is 00:29:45 with the damage you would expect from an airstrike. But, you know, which direction it came from, who exactly was responsible: at the moment, the experts that we've spoken to say it's still inconclusive. When something like this happens, and obviously people are outraged, they want easy, quick answers. It's important to say there are no easy, quick answers in a war zone with limited access for journalists. So just be patient. Facts will hopefully come out. And in some cases, we may never know all the details and all the facts surrounding it. Renee, once a conclusive answer is established, if possible, as Shayan says,
Starting point is 00:30:17 what responsibility do social media companies have to stop or at least de-amplify the reach of misinformation around such an event, given it has such real-world repercussions? It creates upset, and there are protests all over the place. So what's their responsibility? Because this has so many echoes of incidents in Myanmar or elsewhere in the past. It reminds me, again, I feel like I'm on constant loop. Well, I think the challenge is, you know, some people are going to see something go by once. They're going to form an opinion on it. They're not going to spend a whole lot of time thinking about it, and then they'll just move on. Other people are going to really follow this story.
Starting point is 00:30:55 I think in the particular case of the, you know, bombing or missile misfire, whatever it turns out to be, a lot of people are following that, and a lot of very prominent accounts are following it. So I think it is going to stay kind of at the top of the feed as people debate what happened around it. I think with social media, one of the real challenges is that in a conflict situation like this, people are looking for it, right? And so you're going to have to return something. And so I think the best thing that you can do is try to return authoritative sources, try to return people who are verified to be in the region. Right. And that's hard to do. That really takes effort, and X no longer has the staff to do any of the things that ordinarily would be done, right? To say, okay, who are the credible sources? Who are the people who are in region? Who are the official accounts? Maybe we shouldn't surface paid blue checks for this one. Maybe the armchair opinion of some rando commentator who paid eight bucks is not the person that we should be putting at the top of the feed. That's a crazy idea,
Starting point is 00:32:09 but that's the sort of thing that, you know, Twitter would have done in prior environments. They would return authoritative sources through badges, fact checks. Right. And, you know, I think Community Notes is a great concept, but it's better when you're surfacing corrections for things that are established. It's not equipped for this. They're not journalists. They're researchers, maybe, or commentators who are clarifying a fact or a connotation, maybe something a politician says. It's great for slow-moving stuff like that. There is nobody sitting on their computer typing up a community note who is on the ground in Gaza or on the ground in Israel who has any idea of what the actual facts are. So you're winding up with community notes, quote unquote, checks that have also been shown to be wrong several hours later. And that, again, you know, is not a dig on citizen journalism. It's that that's not citizen journalism. That's the whole point, right? You should be surfacing citizen
Starting point is 00:32:59 journalism, but then it requires the actual effort of going and figuring out who the citizen journalists are in this particular case, who the channels are that, you know, are authoritative and should be returned first in search. We'll be back in a minute. Do you feel like your leads never lead anywhere and you're making content that no one sees and it takes forever to build a campaign? Well, that's why we built HubSpot. It's an AI-powered customer platform that builds campaigns for you, tells you which leads are worth knowing and makes writing blogs, creating videos and posting on social a breeze. So now it's easier than ever to be a marketer. Get started at HubSpot.com slash marketers.
Starting point is 00:34:25 I want to move to the platforms. Katie, let's start with you. You were 10 years in public policy at Facebook, which owns three of the platforms I'm going to mention. I'd like you to stack-rank the major social media platforms from best to worst in terms of content moderation and their ability and willingness to slow down the spread of misinformation, and what tools they have available. That would be TikTok, Facebook, Threads and Instagram (I guess I'll just mash those together, unless you want to pull them apart), Twitter, Reddit, and YouTube.
Starting point is 00:35:02 Rank them and explain your reasoning. Yeah, and I think it's a little hard, it's a little apples to oranges in each of the cases, because how things appear is a little different on each of the platforms. I will say, at least in terms of what's been announced, I've seen announcements from Facebook,
Starting point is 00:35:19 from Twitter, from TikTok. I've not seen detailed explanations from YouTube yet, or any of the others, on what they're doing. Facebook has a lot of nuanced tools that they're putting out that I'm not seeing from others. They've got ways for people to lock down their profiles. They're making it so that people can only have comments from people that follow them or their friends. It looks like they're trying to give people more tools to deal with this. Whereas TikTok also put out a myriad of things that they're doing, but it's not necessarily as detailed. It's a lot of very focused,
Starting point is 00:35:51 "We have these policies. We have teams that are doing it." What we don't know is how well they're executing on those policies and those tools. X/Twitter is just in a totally different world, a different universe. It's where we see stuff spreading the most, after Telegram, right? And we're not talking enough about Telegram. I'm going to ask about that in a second, but go ahead. I was just going to say, I know the instinct is to talk about
Starting point is 00:36:15 the platforms that we know, but a lot of these efforts are moving to those other ones, where they have very lax content moderation. Telegram would be very last. X is maybe a little bit on top of that. And I just think all of them are having a big challenge with what we've been talking about: trying to figure out what to do
Starting point is 00:36:33 while people are still trying to verify this information. And they're being asked to surface authoritative sources, but authoritative sources are getting it wrong. Who do we pick? What is a good eyewitness on the ground account? What is not? Should we take more stuff down to be on the safe side, but then run the risk of actually suppressing legitimate speech? And I think that it's just by virtue of having to do this a lot more, your Facebooks and your YouTubes are more equipped for it. X doesn't have the staff,
Starting point is 00:37:01 nor do I think they have the leadership that's ever gone through this before to really understand these nuanced questions. And then you just have the platforms like Telegram that just don't care. Renee? I think I would agree with most of what Katie said. I think you see a lot of commentary on Threads, for example, of people just saying, I just want some facts. I feel like Twitter is a firehose of bullshit. I just want to kind of understand what's happening. The journalists are here now. I'm just going to kind of assume, you know, I was sort of joking around about it, but it's really true that I actually just started going to news sites. I was like, okay, what does wallstreetjournal.com have? What does the Times have? And then triangulate. Yeah. Right. Go to Al Jazeera.com, right. You know, what are the different sites, who's reporting this from
Starting point is 00:37:40 different angles, and where are the actual journalists on the ground? Again, I don't want to see commentators who paid for checks. Whereas Threads, it's a much cleaner feed, and so it didn't have that kind of stuff. So just in terms of curation, and I think curation is different than moderation, but in terms of what's being surfaced, it's a better experience. Telegram doesn't moderate. It's a different kind of content spread, right? Things will hop from channel to channel. And so, you know, when we look at shares of content on Telegram, we will actually look at who's sharing content from what other channels to try to identify new and emerging channels. Because, you know, sometimes you get a channel that's created and it says something very salacious that you've never seen
Starting point is 00:38:28 anywhere before. All of a sudden, their content is disseminated. We saw this constantly in Russia-Ukraine, right? Some random channel would make an allegation, it would get pushed into mainstream ones, then they in turn would get a bunch of new followers. And then, you know, that would be how they would amass an audience. Shayan, what do you think? Yeah, I think Telegram, I would say, is probably the most important news-gathering platform for us, and has been for quite a while. You know, from the war in Ukraine, you know, Telegram is one of the most popular apps in both countries.
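The channel-to-channel tracking Renée describes can be sketched roughly like this (the channel names, records, and threshold are invented for illustration, not drawn from any real dataset):

```python
# Sketch of the forward-tracking idea Renée describes (all channel
# names and records here are invented): build a "who forwards from
# whom" view of Telegram posts, then flag unknown source channels
# that several established channels are suddenly amplifying.

def emerging_sources(forwards, known, min_amplifiers=2):
    """Sources outside the known set amplified by >= min_amplifiers known channels."""
    amplifiers = {}
    for f in forwards:
        src = f["forwarded_from"]
        if src not in known:
            amplifiers.setdefault(src, set()).add(f["channel"])
    return sorted(s for s, chans in amplifiers.items() if len(chans) >= min_amplifiers)

# Each record: an established channel forwarding a post from some source.
forwards = [
    {"channel": "big_news_1", "forwarded_from": "new_rumor_channel"},
    {"channel": "big_news_2", "forwarded_from": "new_rumor_channel"},
    {"channel": "big_news_3", "forwarded_from": "new_rumor_channel"},
    {"channel": "big_news_1", "forwarded_from": "known_outlet"},
]
known_channels = {"big_news_1", "big_news_2", "big_news_3", "known_outlet"}

print(emerging_sources(forwards, known_channels))  # ['new_rumor_channel']
```

A previously unseen channel that multiple mainstream channels start forwarding from surfaces immediately, which mirrors how the Russia-Ukraine rumor channels she mentions amassed audiences.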
Starting point is 00:39:00 And also with the conflict in Israel and Gaza at the moment, in both countries, Telegram is quite... Particularly, you know, Hamas is obviously banned from major platforms because it's a proscribed organization in most Western countries, so most of its content comes from Telegram. They post on Telegram, and then it travels from Telegram to other platforms. I also have a particular interest in Telegram because I also cover conspiracy theories and people who start conspiratorial narratives. Most of them are on Telegram, because many of them, during COVID and also after the 2020 election in the US, were either suspended or their reach was limited on major platforms, so they went on Telegram. So that is the most important platform for me, I would say. There's not much moderation on Telegram, and it's not
Starting point is 00:39:49 because they sort of set themselves up as some sort of free-speech-defending platform. It's just, I don't think they care that much, to be perfectly honest. I think it has to be really egregious content, like proper, you know, terrorism content. We're talking about actual neo-Nazi people, or we're talking about, you know, Islamic State, or really, really horrific stuff like child sexual abuse images. If it's that type of content, yeah, they will take it down. And they have to, by the way.
Starting point is 00:40:17 They try to, well, they have to, because some of it is obviously illegal. But when it comes to other stuff, you know, you can post whatever you like on Telegram. Right. All right, Katie, speaking of which, Thierry Breton, the EU commissioner in charge of enforcing the newly passed DSA, has started an investigation into Twitter/X and sent warnings to other social media companies. Elon Musk has threatened to pull X out of Europe, for example. They warned Twitter last December. Again, they've warned them a lot, threatening fines. Yeah, I think, well, first, this is, I think, Breton seeing this as his first chance to, like, really enact the DSA, right?
Starting point is 00:40:52 And I feel like he's taking that opportunity to be like, I'm going to show that I'm going to take this seriously and I'm going to hold these platforms accountable and make them do this. I think there's a lot of concern. Civil society organizations sent him a letter about abusing this power and not being very specific in terms of what they're doing and the chilling effect that this can have on the platforms
Starting point is 00:41:12 in terms of speech and content that is on there. Like, this was not meant to be his bully pulpit, and we wanted to have a more thoughtful process in terms of doing this, and I don't know if these letters are the right way to do it, but we'll have to wait and see how this unfolds. But the other thing is that Telegram's not a part of this at all, because they're not registered as a VLOP.
Starting point is 00:41:31 So we're sitting here saying Telegram's one of the most important platforms of where this content is spreading, but I don't know of anybody that has any regulatory power over Telegram to actually investigate them. So they're going after Twitter, but not this one, right? Does it matter? Correct. And it's right to be asking Twitter these questions.
Starting point is 00:41:51 And I do think Threads is not in Europe yet either. And so it's going to be interesting to see as these government regulators and these tech CEOs start to stare each other down of being like, you know, we saw it too in Canada with Meta News. Yeah, and they just pulled out. They pulled out. I think this is the EU trying to show
Starting point is 00:42:09 that they are going to take the DSA seriously. They are going to enforce it. Now it's going to be: how do the companies respond? We're just going to see this back-and-forth happening while we're also trying to dig into what is actually true or not and what's happening with the content we're seeing. Shayan, how do you look at it? As we were noting before, a lot of this stuff gets
Starting point is 00:42:29 monetized and it's good for these companies. And at the same time, these regulators are trying to get their arms around it. In Europe, more aggressively, US, not even slightly. Does it matter to what you're doing to stop disinformation or does it not matter? Look, two things. First of all, I would say, you know, I'm a journalist, and as a journalist, I value immensely the right to free speech and expression. You know, I rely on freedom to be able to report. You know, I've reported in countries where you don't have that right, and it's not fun. So I would like to say, you know, I don't want people to get censored or their views to be blocked. I want everybody to have the right to express themselves freely.
Starting point is 00:43:07 Then I know, also as a BBC journalist, I don't think it's for me to tell platforms what sort of policies they should come up with or what the European Union should say to these platforms. All I would say is these platforms have certain terms of service that are clear. And as long as they stick to those, as long as they can stick and have the ability to stick to those, things would be much better than they are now. The problem is, because of the volume of content that is posted on those platforms, particularly when something like this happens, when there's just such a torrent of content being posted and so much interest, they just cannot physically keep up with it. And the tools, particularly the automated tools, that they've designed to deal with this type of content cannot keep up either. Now, the problem is, when you design policies specifically, and also when you design your algorithms specifically, to reward content that is outrageous, to reward content that is shocking, to reward content that is posted for engagement,
Starting point is 00:44:07 then obviously this is going to happen. But the platforms are not going to change their algorithms. The platforms are not going to change any of that because that's how they make money. So the best thing we can do is for ourselves to, first of all, try not to amplify content that is unverified and not accurate. And that's the most that we can do.
Starting point is 00:44:28 And for somebody like me, I can just go on and try and find the most viral pieces of footage and tell people this is not true. Renee, there are a lot of other policies on other platforms that are specific to monetization. Where again, this question of your right to be on a platform versus your right to make money spreading, you know, your right to post something that's bullshit versus your right to monetize the bullshit. Other platforms have differentiated between those two things. Well, and there's a scale issue too, right? You can have the policy that says you want to demonetize them, but then finding those people and making that decision in a quick manner is going to be difficult for any platform.
Starting point is 00:45:06 Okay, we have a couple more questions. There's a couple more. Katie, social media users accused Facebook and Instagram of suppressing pro-Palestinian posts. Meta executives said it was due to a bug on the platform related to reshares and posts and that it affected all users equally. Thousands of users are complaining about continued suppression.
Starting point is 00:45:24 Again, this has gone on for a long time with all these social media sites, about suppression and shadow banning, etc., etc., which you're familiar with. Any thoughts? What could be happening? I've generally found that this is a mixture of things. There have been many times where it's legitimately a bug. But then there are also more systemic things that I think we're seeing as part of this
Starting point is 00:45:45 because there have been complaints in the past about Facebook suppressing Palestinian voices. There was a human rights assessment that was done that showed that fewer resources were put into machine learning classifiers for Hebrew versus Arabic. There were mistakes in terms of which dialects they were using, how many resources they had in terms of content moderators and human capital on this, that I think have also played a part. And so, usually when these things are happening, there's a lot of different parts
Starting point is 00:46:16 that are playing a role in determining the contours of these problems that the platforms have. Some of it's the choices they made on resources and where they're putting them. Sometimes it is just that a bug has happened, and it's just a coincidence of what happened here. It's hard to know beyond what Meta actually said because, again, we don't have a ton of transparency to be able to verify what they are or are not saying is happening. But I've usually found it's not any one thing. It's usually a combination of a whole bunch of things that contributes to something like this. Renee, speaking of which, actual access to these APIs from the platforms,
Starting point is 00:46:50 on top of layoffs, you're getting limited access, as you mentioned, and Meta has let its social monitoring tool CrowdTangle fall apart. It's harder for researchers to track misinformation online in the United States because of attacks from conservatives, including free speech lawsuits and efforts to block any information sharing between the government and the platforms. Explain the chilling impact of how this plays out. Now, I know you and Alex Stamos are personally named in lawsuits. I'm not sure how much you can tell me, but talk about the chilling issues here. Well, I can't comment on the pending litigation. So, you know, that is a little bit of a chilling effect by itself. There's two different things there. The first is just the
Starting point is 00:47:31 technological access and prioritization, right? So Facebook is building a new researcher API. It's taking feedback. TikTok is also building a researcher API and taking feedback. So some platforms are still doing things voluntarily. This is the part of the DSA that I like, actually: the researcher data access piece is the part that I've been supportive of. It basically says that since these tools are so powerful, since they have such social impact, the ability to understand what is happening on them should be made available to researchers, to qualified civil society and journalistic organizations. And by qualified, that means you can take certain steps to protect privacy. You have certain capacity to
Starting point is 00:48:15 analyze data. You're not just going to ask for something and then go dump it on the internet somewhere. And so that piece I think is actually a good regulation. And I really wish that we would have the Platform Accountability and Transparency Act passed here similarly. Yes. As far as the chilling effect. They don't have a Speaker of the House, so they can't do that. But go ahead. True, true, right?
Starting point is 00:48:35 I mean, well, this is where I'm like, yeah, sure, we'll regulate someday when we have a functioning government. But they can't order lunch right now. So it's very hard for these people. But the other piece, right, is that information-sharing piece. You know, we hit a point where misleading, you know, nonsense, candidly, made by rent-a-quote fake think tanks was laundered through conservative media to create an impression of some sort of vast collusion operation between academics and government, or government and platforms. There were certain times when governments engaged with platforms in ways that I think
Starting point is 00:49:08 were counterproductive, right? That jawboning line: were they trying to coerce them to take down content? That's where I think another, again, transparency law, to have government requests be logged, is a very, very useful and necessary thing. If they ever get a Speaker of the House, they can work on that too. But the government does need the ability to communicate with tech platforms. And what we see a lot of the time is tech platforms reaching out in the other direction too. You saw, during COVID, Facebook reaching out to the CDC proactively, saying, hey, we see this rumor going by, what should we know about it? We want to surface authoritative information. And again, the same thing in conflict zones: you want them to be able to be in touch with governments in the region. We
Starting point is 00:49:50 parse so much of this through the stupid American culture war and polarized domestic politics, but the chilling effect impacts their ability to set policy and engage internationally, and that's one of the things that's happening here, right? It creates norms, and it creates a fear that they're going to engage in some way and then get dinged for it, in a regulatory sense, in some country or another. And that's actually very bad. You do want that channel of communication open, particularly where terrorism is concerned, or where atrocities or violent events are concerned, because you might remember back in the olden days of 2015, when this was happening with ISIS, those communication channels were not
Starting point is 00:50:30 really there and it wasn't great. Yeah, absolutely. The importance of research is really critical to these companies, the ability to talk among and between themselves. They make small, stupid mistakes. That's very different than creating this ridiculous conspiracy theory. And it is nonsense. You cannot say that. But it is. It's absolute nonsense. And we are all concerned about overreach of government. But in this case, it's being used for another reason altogether. That's my opinion. So I'll just say that. Okay, one last question, very quickly for each of you. I'd love to know what sources of information you think are most credible right now and why. What's your media diet? Shayan, let's start with you.
Starting point is 00:51:11 Well, obviously, I work for BBC News, so I consume quite a lot of BBC content. But, you know, I would say there are plenty of diligent, hardworking, dedicated journalists on the ground. And, you know, when a conflict like this happens, and, you know, saw the same thing with the conflict in Ukraine before that, the war in Syria, there are dedicated journalists who are not actually employed by major news organizations and do not have access to all sorts of support that journalists who work for major news organizations have access to,
Starting point is 00:51:42 who risk their lives with basically limited support and just go to war zones and try to report accurately, without any sense of partisanship or any bias. You know, apart from major sources of news that try to actually, at least, verify information and not be partisan in one direction or the other, just trust journalists who are, try to trust journalists who are on the ground reporting, on the ground risking their lives. You know, they're doing a huge service to... And is there any non-journalist organization that is journalist-adjacent that you like to use?
Starting point is 00:52:19 Well, I think the ones that I, I don't want to name names, the ones that I follow closely, they all work independently, like they do it in their own time and then they sell their content to major news organizations but basically they're doing it off their own volition you know, they haven't been asked by anybody to go and they haven't got support, they just do it themselves
Starting point is 00:52:39 Yeah, not Uncle Harvey at his house or in a bar or wherever, although those exist everywhere. Katie? So I would say hard news. I've got CNN on. I mean, I think they just do a great job in crisis situations like this of hard news reporting. New York Times, the Post. For analysis of what's happening online, the Atlantic Council's Digital Forensic Research Lab does a fantastic job of doing all of this.
Starting point is 00:53:07 And so I'm a non-resident fellow there, and so I've been talking to a lot of them, following what they're putting out. And then, Threads has been where I've been reconstituting my journalism and my journalist feeds, and kind of getting some of those stories and stuff from there. And seeing that community rebuild has been a really interesting thing. And then I do just get a lot of my news through newsletters, whether they're Substack newsletters or those coming from news organizations,
Starting point is 00:53:32 because it's just hard for me to curate. So being able to get that through my inbox to find those stories, whether it's analysts or hard news, is sort of like what my go-to day-to-day is. Okay. And Renee, let's finish with you. Bellingcat, one of the, I think the most useful. I saw some, you know, you always have to have the meta narrative about
Starting point is 00:53:54 is media collapsing? Did disinformation researchers know nothing? Of course, that's been all over X. And one of the things that I thought was funny was seeing that thread, and then literally there's Bellingcat's post next on down the list. And I'm like, oh, you know, you could actually just follow these guys. Media's
Starting point is 00:54:06 been dying for centuries. There it is, dying for centuries. So, I do, you know, I am in the Telegram channels, right? I mean, I follow very closely all of the stuff as it comes out. I think for me, it's just a matter of: take it, parse it, but let it sit for a couple of days. The world doesn't need my commentary, you know, it doesn't need my outrage. And I think that ability to, you know, let something emerge, the facts may change, is the kind of critical life skill that we need in this new social-media-mediated environment. It's true. Renee, you're never going to be a tech bro then, because you have to have an opinion about everything. In any case, I really appreciate it. Thank you. Thank you.
Starting point is 00:54:53 What is your media diet? I have a very good media diet. I read all the major outlets, you know, including cable and across the world. I look at Twitter largely on quick things I know they do well, like what's going on in Congress right now, they're the best at it. And I read everything. I use Artifact to find stories that I might not have been aware of. And I mostly rely on friends who have recommendations wherever I find them. I obviously read a lot of authoritative sources. I also think a good add to the media diet is to read international news, not just in wartime, to experience and understand the world, not just in these extremely hot moments, but to have a view of a situation of politics on the ground,
Starting point is 00:55:31 how they affect everything that plays out in these heated moments. Yep, absolutely. It's important. But most people aren't. Most people don't read anything. They read the cereal box and then go to Facebook to yell at their uncle. That's really what they do. Most of our listeners are probably reading stuff. I think a lot of people, I think the United States has a terrible media diet. I have a friend, Walt Mossberg, who's working on the Media Literacy Project. I think it's the single most important thing we can do around media, which is get people literate in it.
Starting point is 00:55:59 And now it's more important than ever because you have to add the online element. Well, I thought one of the most interesting things to come out of that conversation was Renee's point that this might actually lead to the revival of homepages, going to newyorktimes.com or wallstreetjournal.com. This irony that somehow in the great mess of Twitter and X, Elon has helped resuscitate the media business, the legacy media business. Oh, I don't know. I think Twitter gets a lot more attention than it deserves in that arena. I've run media sites and Twitter is never the highest, never,
Starting point is 00:56:28 ever. It's one of the lowest, actually. We used to get more referrals from Instagram, a lot from LinkedIn, original Google, you know. And so I don't know if it's Twitter. I just think a lot of journalists are on it. And so maybe across the world, that's a little different. And I suspect it is. But I think Facebook continues to be the most important purveyor of information. I think all the studies bear me out on that. But Twitter X now is definitely worse than before, right? Oh, terrible. The verification.
Starting point is 00:56:54 It's very hard. I mean, there are so many journalists on there. And it's historically been, I mean, look at the Arab Spring. Historically been a way that you could see what's happening on the ground, that you could understand. And you could curate your feed, you know, through verification to some extent. And now? I think a lot of what they were saying
Starting point is 00:57:10 is that now mainstream media dances to the tune of social media, and that's a problem because social media is really bad at their job. And so they should just stick to doing the reporting, and that's it, because you cannot rely on the stuff. You can some of it, but not much of it, unfortunately.
Starting point is 00:57:26 There's also all these independent groups verifying things. Yeah, the growth of Bellingcat and the other things. Those are great. Those are fantastic. That's different than these things. These are not, those are not online publications per se. They're just really good at journalism. Yes, they are.
Starting point is 00:57:40 Do you think that, by the way, here's a question. Do you think that journalists right now, given the mess that is Twitter and Axe, and part of it being what this panel is describing, the algorithm preferring the random comments of somebody who paid eight bucks to Elon, mean that media companies should be paying right now? Should media companies be paying to get through the noise? No. Or journalists? Absolutely not. Look, NPR came off and they're like, yeah, it didn't make a difference. They just took the verification badge off New York Times. It doesn't matter.
Starting point is 00:58:09 It just doesn't. This is, this is, it just doesn't. It doesn't matter for the sales of, or for ads, for getting people onto their shows, but it might matter for shaping conversation. I don't think it does. I just think it's overblown because journalists are there
Starting point is 00:58:23 and Elon Musk makes a spectacle of himself. But in the real matters, I do think broad social media absolutely has an impact. Broad social media does 100%, not Twitter necessarily, but broad social media. And not just broad social media, but niche social media, Telegram, WhatsApp. Telegram is more influential. WhatsApp is more influential. All these different sites in India, there's tons of them, are more influential. So I would say, yes, broad social media can have an impact, including creating riots, real riots,
Starting point is 00:58:55 due to misinformation that people see online. Or even just information. Yeah, it's through their phones. But the most valuable moment for me of that conversation was really when Cheyenne described the process for verification. 100%. How he does his work, really. And I think that journalists, many of us, we do that work when we see a piece of information, but we cannot expect that everybody on the internet is doing that work all the time before sharing. They aren't at all. They aren't at all. I spent a lot of time many years ago trying to prove Hillary Clinton wasn't a lizard.
Starting point is 00:59:23 It was just one of these things I was doing. Did you know i i did i ran down i went to the person who was putting it up and i i ran down every one of their sources and it was all wrong even coming back to them they were like i don't believe that and it was incredible it's just pointless it was ultimately it was pointless but a lot of the internet is a rochard test right it's like people see what they want to see out of it, out of the fog. Yeah. Well, I think the most important thing is when you're trying to deal with these problems, again, something Debra Roy has talked about, you don't talk about them opinions and facts.
Starting point is 00:59:54 You talk about personal experiences with them. And you can start to get to the truth through that because you remove the anger part or this fact, that fact. And unfortunately, facts have been weaponized. And opinions are, you know, there's an expression, every asshole has one, you know, pretty much. So that's the problem. So three predictions. Is this moment going to change how platforms moderate? No. No, not at all. I don't think they care. Then second, do you think the DSA, the Europeans, will have any teeth? No, they will a little, but, you know, they got to collect.
Starting point is 01:00:28 And do you think this moment or the proximity of the election is going to change the way that the U.S. government looks at or U.S. stakeholders look at regulation of tech companies? I don't know. We'll see. Look how they're doing right now. They're doing terribly. They can't decide anything. Can't decide on lunch, these people. And so, no, no, they're inadequate to the task. So, no, sorry. It's always so pessimistic. I'm not pessimistic. I'm realistic. You're realistic. It's the way things are. They don't care. They don't care enough. And at some point,
Starting point is 01:01:00 they'll be made to care. But an insurrection in the United States Capitol wasn't enough for them. So I don't know what to say. All right. Well, they can't decide on lunch, but we will go have lunch. Kara, do you want to read us out, please? Yep. Today's show was produced by Naeem Araza, Christian Castro-Rossell, Megan Burney, and Claire Tai. Special thanks to Kate Gallagher.
Starting point is 01:01:21 Our engineers are Fernando Arruda and Rick Kwan. And our theme music is by Trackademics. If you're already following the show, clearer days ahead. If not, you're stuck in the fog. I know I am right now. Go wherever you listen to podcasts, search for On with Kara Swisher and hit follow. Thanks for listening to On with Kara Swisher from New York Magazine, the Vox Media Podcast Network, and us. We'll be back on Thursday with more. All-new Reimagined Nissan Kicks is the city-sized crossover vehicle that's been completely revamped for urban adventure. From the design and styling to the performance, all the way to features like the Bose Personal Plus sound system, you can get closer to everything you love about city life in the all-new Reimagined Nissan Kicks.
Starting point is 01:02:19 Autograph Collection Hotels offer over 300 independent hotels around the world, each exactly like nothing else. Hand-selected for their inherent craft, each hotel tells its own unique story through distinctive design and immersive experiences, from medieval falconry to volcanic wine tasting. Autograph Collection is part of the Marriott Bonvoy portfolio of over 30 hotel brands around the world.
Starting point is 01:03:01 Find the unforgettable at AutographCollection.com.
