The Daily - The Mosque Attacks in New Zealand

Episode Date: March 18, 2019

A gunman opened fire at two mosques in Christchurch, New Zealand, killing at least 50 people. The massacre was partly streamed online. We look at why the attack was, in some ways, made by and for the internet. Guest: Kevin Roose, who writes about technology for The New York Times. For more information on today's episode, visit nytimes.com/thedaily.

Transcript
From The New York Times, I'm Michael Barbaro. This is The Daily.
Today: The death toll from a mass shooting targeting Muslims in New Zealand rose from 49 to 50 over the weekend, after officials found another body at the Al Noor Mosque, where most of the deaths occurred. Kevin Roose on why this attack was made by and for the internet.
It's Monday, March 18th.
Farid, would you mind just telling us one more time what happened in the mosque?
When the shooting started, it started from the hallway. So I could hear... Then the magazine finished, then he reloaded again and came back again. And I saw all the plaster coming down from the wall and the ceiling. I attack here, and I come here, and I find the gun somewhere here, and a dead body here as well. I feel now, I repeated the story a lot, but this is a good idea to say.
Over the weekend, through dozens of interviews with survivors, a story began to emerge of what happened on Friday inside the mosques in Christchurch.
Was that your regular mosque or were you visiting that mosque?
Regular mosque, yeah.
The shooting began at the Al Noor Mosque, where Farid Ahmad and his wife Husna, who had moved to New Zealand from Bangladesh, were attending afternoon prayer.
The ladies' room was on the right-hand side. So all the ladies were there. And my wife is always a leading person for ladies. She had a philosophy. She always used to tell me, I don't want to hold any position, and I want to prove that you don't need to have any position to help people. She was like a magnet. And exactly the same thing happened. The shooting started. She started instructing several ladies and children to get out. And she was screaming, come this way, hurry up, this and that, you know, she was doing all these things. And then she took many children and ladies into a safe garden. Then she was coming back, checking about me because I was in a wheelchair.
Do you mind me asking why you're in a wheelchair?
I was run over by a car.
He was a drunk driver, and it was 1998, and it happened.
I'm sorry.
It's okay.
So she went out of the mosque and then she came back in?
Yeah, she was coming back, and once she was approaching the gate, then she was shot.
She came back in to fetch you?
Yes, yes.
Farid learned hours later that Husna was one of the 42 people police say were killed at the mosque.
So she was busy with saving lives, you know, forgetting about herself. And that's what she is. She always has been like this.
Six minutes after firing the first shot, and as police raced toward Al Noor Mosque, the shooter drove to a second mosque, the Linwood Mosque, four miles east.
Tell me, I'm sorry, what was your name?
Abdul Aziz.
And the mosque that you go to, is it mixed Pakistani?
I mean, that mosque, who was there that day?
That mosque, we got from every race. From Malaysia, from Philippines, from Afghanistan, from every sort of country.
My colleague Damien Cave spoke with Abdul Aziz, who was praying at the Linwood Mosque with his four sons when he heard gunshots. Aziz ran toward the shots, grabbing the first thing he could find, a credit card machine, which he flung at the attacker.
The shooter dropped a gun, and Aziz picked it up.
And I pick up the gun, and I check that they had no bullets, and I was screaming to the guys, come here, I'm here. I just wanted to put more focus on me than go inside the masjid. But unfortunately, he just got himself to the masjid. Then I heard more shooting sound and I see he's shooting inside the masjid.
Moments later, when the gunman went to his car to retrieve more weapons, Aziz followed him.
He tried to get more gun from his car. When he see me, I'm chasing with a gun, he sat on his car. And I just got that gun and throw on his window like an arrow and blast his window. And he thought I probably shot him or something. And the guns come back and just he drive off.
Aziz used the gun to shatter the gunman's car window, which many witnesses believe is what prompted him to speed away rather than re-enter the mosque and kill more people.
Anybody would do the same thing. If you was there, you would have done the same thing.
Have you, um, can I ask you...
Minutes later, video shows the suspect being pulled by police from his car, two and a half miles down the road, where two more guns and homemade explosives were also found.
I want to speak specifically about the firearms used in this terrorist act. There were two semi-automatic weapons and two shotguns.
On Sunday, New Zealand's Prime Minister Jacinda Ardern said that the suspect, an Australian citizen, would be tried in New Zealand and that her government would meet today to discuss the country's gun laws.
I can tell you one thing right now. Our gun laws will change.
Funerals for all 50 victims are expected to be held in the coming days.
As the police commissioner confirmed this morning, 50 people have been killed and 34 people remain in Christchurch Hospital, 12 of them in the intensive care unit in critical condition. A four-year-old girl remains in a critical condition at Starship Hospital in Auckland.
Islamic burial rituals typically require bodies to be buried as soon as possible, usually within 24 hours. But New Zealand authorities say that the process of identifying the victims and returning them to their families could take several more days.
It is the expectation that all bodies will be returned to families by Wednesday. I want to finish by saying that while the nation grapples with a form of grief and anger that we have not experienced before, we are seeking answers.
We'll be right back.
Kevin, I want to talk to you about the moments before this mass shooting began.
What do you know about those?
Well, what we know comes from a video that was live-streamed on Facebook by the gunman while this was all happening. He taped himself in the car on his way over to the mosque, listening to music, talking. And right before he gets out of the car and goes into the mosque, he pauses and says, remember, lads, subscribe to PewDiePie. And when I heard that, I just, like, I knew, oh, this is something different than we're used to.
What do you mean? What is PewDiePie? And why does that reference matter?
So PewDiePie is this really popular YouTube personality. He has the most subscribers of anyone on YouTube. Some people think he's offensive. Some people really like him. He's got this whole fan base.
And a few months ago, his fans started sort of spamming this phrase, subscribe to PewDiePie, in an attempt to kind of keep him from being eclipsed by another account that was going to have more followers than him. So it sort of became this competition, then it became this joke. And now subscribe to PewDiePie is just kind of like a thing that people say on certain parts of the internet. It's just kind of like a signifier, like, I understand the internet, you understand the internet, this is how we're going to signal to each other that we understand the internet.
And this is what he's signaling in saying that.
Yeah, so I have that in my head. And then I see all these other signs that something is weirdly kind of internet-y about all of this.
Like there's this post on 8chan, which is kind of like a scummy message board that lots of extremists and weirdos go on. And in the post, the gunman links to the Facebook stream before it happens.
The Facebook stream that he will record of the massacre itself.
Exactly. And then he pastes a bunch of links to copies of his manifesto. He has a 74-page manifesto that he wrote. And some of the stuff was fairly standard, hard-right ideology. Very fascist, very white nationalist. Muslims are kind of like the primary target for white nationalists around the world, calling them invaders, saying they're taking over. You know, this is a sort of classic white nationalist trope. And then there was all this kind of meta humor, like saying that he was radicalized by video games, which is another thing internet extremists love to sort of troll the media with. Like, you know, he posted previews of his gun on Twitter. The whole thing just kind of felt like it set this shooting up as almost an internet performance. Like it was native to the internet, and it was born out of and aimed into this culture of extremely concentrated internet radicalism.
But underneath it all is white nationalism, white supremacy, whatever you want to call it, a kind of racism that has always existed. So why does the internet's role in this feel especially different to you?
I want to make clear that, like, this is not just a tech story, right? There's a real core of anti-Muslim violence here, Islamophobia, far-right ideology. That's all very, very important, and we should focus there. But I think there's this other piece that we really need to start grappling with as a society, which is that there's an entire generation of people who have been exposed to radical extremist politics online, who have been fed a steady diet of this stuff, transformed by the tools that the internet provides. So I've talked to a lot of white nationalists, unfortunately, and, you know, when I ask them how they got into this, a lot of them will say, I found a couple videos on YouTube. And then, you know, I found some more videos on YouTube, and it kind of started opening my eyes to this ideology. And pretty soon, you're a white nationalist. And that's different from how extremism has historically been born.
What do you mean?
You know, if you go to the library and you take out a book about World War II, right as you're about to finish it, the librarian doesn't say, here, here's a copy of Mein Kampf, you might like this. There's not this kind of algorithmic nudge toward the extremes that really exists on social media and has a demonstrated effect on people.
Walk me through this algorithmic nudge. I want to make sure I understand what you're referring to.
This is pretty specific to YouTube, but that's where a lot of this stuff happens.
So on YouTube, there's this, you know, recommendations bar, and after a video plays, another one follows it. And historically, the way the algorithm that chose which video came next worked is that it would try to keep you on the site for as long as possible, to maximize the number of videos you watched and the amount of time you spent, which would maximize the ad revenue, right? Maximize lots of things. And so it turned out that what kept people on the site for longer and longer periods of time was gradually moving them toward more extreme content. You'd start at a video about, you know, spaceships, and you'd end on something that was questioning, you know, whether the moon landing was a hoax. Or you'd start at a video about some piece of U.S. history, and, you know, five videos later, you're at kind of a 9/11 conspiracy theory video. Just these kind of gradual tugs toward the stuff that the algorithm decides is going to keep you hooked. And in a lot of cases, that means making it a little more extreme.
And what's the white nationalist version of this nudge?
I mean, there's a ton of white nationalism on YouTube. YouTube, you know, from the conversations I've had with people in this movement, is sort of central to how these ideas spread. Like, you start watching some videos about politics, maybe they're about Trump. Then you start watching some videos by sort of more fringy, kind of far-right characters. And all of a sudden you are watching someone's video who is espousing open white nationalism, and you're not exactly sure how you got there, but you keep watching, and for some percentage of people, you internalize that.
So it's a kind of computer-driven on-ramp, or onboarding.
Yeah, and this has been studied. Like, this is a well-documented phenomenon, and YouTube has done some things to try to fix the algorithm and, you know, make it so that it's not sending you down these rabbit holes, but it's still a pretty observable effect.
And what's your understanding of why these platforms didn't act years ago to police, to delete these hate-filled videos, this content that, through these algorithmic nudges you described, directs people further and further toward extremism?
They had no reason to. I mean, they were making a lot of money. They saw their responsibility as providing a platform for free speech. They were very hesitant to kind of seem like they were censoring certain political views. They were committed to free speech. And I think that's kind of the original sin that's baked into all of this. That's part of how this was born: this idea that we just provide the platform, and if people, you know, signal to us that they like something, we'll show them more of it. And maybe we'll show them something that pushes the envelope a little bit more. And we're not optimizing for truth. We're not optimizing for things that we think are healthy for people. We're just giving them what they want.
And they're trying to change that now, some of them. There's a reckoning now where these platforms have come to understand that this is the role that they've played, and they're trying to correct it. But there's a lot of people who have already been sucked up into this world, who have been radicalized, and who may not be coming back. It's going to be very, very tricky to slow the thing that has been set into motion. And I don't even know if it's possible.
At this point.
Yeah. These platforms played a pivotal role, have played, are playing a pivotal role in how these extremist groups gather momentum and share their ideas and coalesce into real movements and grow. And, like, that's the part that I don't think they've completely reckoned with, and that I don't think we've completely reckoned with. I think we're still sort of coming to terms with the fact that there's this pipeline for extremism, and we know how it runs. We know where it happens. We know who's involved. And we know that sometimes it has these devastating, tragic consequences.
What I've been thinking is just how inevitable this feels.
What do you mean?
I've been watching these people in these kind of dark corners of the internet multiplying and hardening and becoming more extreme, and it was inevitable. This is the nightmare, right? This is the worst possible version of something that could happen and be broadcast on the internet. And it's not getting better. And it's going to be with us for a long time.
But it also strikes me that, in a way, and in a pretty awful way, this gunman and the way he has approached this massacre is kind of reflecting back how the internet functions. Because I'm thinking about him making a video of this attack, which in a sense means he's making content that feeds that loop we're discussing, perhaps feeds this algorithm that possibly fed him, that he's basically putting something back into the system.
Yeah. And I saw this happening on these platforms, like, in real time.
So you mean on Friday?
Yeah. So if you went on to 8chan, which is the website where, you know, all this stuff was posted, the comments below this post were all about, you know, let's save these videos so that we can re-upload them somewhere else in case 8chan gets taken down. Let's spread this.
Let's seed this all over the internet.
I mean, there's no doubt in my mind that this guy was very aware of how his video and his manifesto would kind of filter through the internet and get refracted and picked up and analyzed. This was a very deliberate act, not only of murder and violence, but also of media creation. I mean, this was, in a way, like, engineered for internet virality.
And then it did go viral.
Yes. Twitter and Facebook and YouTube, all the platforms, tried to take down the video as soon as it popped up, but it just kept popping back up. It's very hard to contain.
So it's still out there.
I mean, yeah, I'm looking right now at something posted, you know, six hours ago.
It's the video of the shooting, and it's still up. And I don't think it'll ever fully disappear.
Kevin, thank you very much.
Thank you.
Here's what else you need to know today.
Ethiopian officials say that information retrieved from the data and voice recorders on the Boeing jetliner that crashed last Sunday establishes further similarities to an earlier crash of the same Boeing model in Indonesia. The officials did not specify the similarities, but the disclosure is another indication that the causes of the two crashes may be related.
The Ethiopian crash led to the worldwide grounding of the Boeing jet, a 737 Max, whose automated flight control system is believed to have contributed to the Indonesian crash. That system is now a focus of the investigation into the Ethiopian crash.
And the Times reports that a campaign by the Trump administration to prevent foreign governments from using Chinese telecommunications equipment, especially equipment from Huawei, is failing. Several U.S. allies, including Britain, Germany, and India, have rejected the White House's argument that Chinese technology poses a national security threat that could potentially allow China's government to disrupt their communications, and are refusing the U.S.'s request to ban the equipment in their countries.
That's it for The Daily. I'm Michael Barbaro. See you tomorrow.
