Radiolab - The Internet Dilemma

Episode Date: August 11, 2023

Matthew Herrick was sitting on his stoop in Harlem when something weird happened. Then, it happened again. And again. It happened so many times that it became an absolute nightmare—a nightmare that haunted his life daily and flipped it completely upside down. What stood between Matthew and help were 26 little words. These 26 words, known as Section 230, are the core of an Internet law that coats the tech industry in Teflon. No matter what happens, who gets hurt, or what harm is done, tech companies can’t be held responsible for the things that happen on their platforms. Section 230 affects the lives of an untold number of people like Matthew, and makes the Internet a far more ominous place for all of us. But also, in a strange twist, it’s what keeps the whole thing up and running in the first place. Why do we have this law? And more importantly, why can’t we just delete it?

Special thanks to James Grimmelmann, Eric Goldman, Naomi Leeds, Jeff Kosseff, Carrie Goldberg, and Kashmir Hill.

EPISODE CREDITS:
Reported by - Rachael Cusick
Produced by - Rachael Cusick and Simon Adler
with mixing help from - Arianne Wack
Fact-checking by - Natalie Middleton
Edited by - Pat Walters

EPISODE CITATIONS:
Articles: Kashmir Hill’s story introduced us to Section 230.
Books: Jeff Kosseff’s book The Twenty-Six Words That Created the Internet (https://zpr.io/8ara6vtQVTuK) is a fantastic biography of Section 230. To read more about Carrie Goldberg’s work, head to her website (https://www.cagoldberglaw.com/) or check out her book Nobody's Victim (https://zpr.io/Ra9mXtT9eNvb).

Our newsletter comes out every Wednesday. It includes short essays, recommendations, and details about other ways to interact with the show. Sign up (https://radiolab.org/newsletter)!

Radiolab is supported by listeners like you. Support Radiolab by becoming a member of The Lab (https://members.radiolab.org/) today. Follow our show on Instagram, Twitter and Facebook @radiolab, and share your thoughts with us by emailing radiolab@wnyc.org.

Leadership support for Radiolab’s science programming is provided by the Gordon and Betty Moore Foundation, Science Sandbox, a Simons Foundation Initiative, and the John Templeton Foundation. Foundational support for Radiolab was provided by the Alfred P. Sloan Foundation.

Transcript
Starting point is 00:00:00 Hey folks, just a quick warning before we get started, this episode contains some swear words as well as some frank discussions about sex and suicide. You're listening to Radiolab. Radiolab. From WNYC. Hey, hey, this is Radiolab and I'm Simon Adler, sitting in for Lulu and Latif this week. Yeah, how do you feel that the B team has been sent in for this? Yeah, they're like, you know, the understudy of the understudy was out today, so you're going to have to take just the...
Starting point is 00:00:59 The usher. The ushering. Because a while back, our reporter-producer Rachael Cusick, she sat me down in the studio to tell me a story about both how beautiful we humans are, but also just how downright awful we can be. And the tricky business of deciding who should be held responsible when that ugly part of us shows. Let's hit it. Mm-hmm. So, okay, so we are gonna start on a stoop in Harlem.
Starting point is 00:01:31 Over in West Harlem in what, 2016? Yeah, 2016. With this guy. Matthew Herrick. Wait, do you go by Matt or Matthew? Most people call me Matthew. Cool. At the time, Matthew had recently moved from LA to New York City.
Starting point is 00:01:46 It was definitely a punch to the face, if you will. Trading palm trees and sunshine for a smelly city stoop. Yeah. Anyway. I think it was around October, mid-October. I was sitting on my stairs, which is like in the front of my building. What's this guy look like, do we know?
Starting point is 00:02:04 Like tall, muscular, salt and pepper hair nowadays, I think probably just pepper back in the day. And I was having a cigarette and a gentleman walked up and stopped and stood in front of me. And you know, it's New York, so you have a lot of fucking weirdos. So I figured it was just some weirdo being weird. Like whatever, I'm just gonna ignore them.
Starting point is 00:02:23 Yeah, like it's all good. So I'm like kind of avoiding eye contact, but then I realize that they're standing there for, you know, longer than a standard period of time. So I like looked up. And it's someone that he doesn't know, someone that he doesn't recognize, but this guy, he went, hey, Matt.
Starting point is 00:02:40 And I was like, hi, like, how the hell do you know my name? And he says, it's so-and-so, we were talking on Grindr. Grindr, a dating app used primarily by gay men. And so this stranger tilts his phone towards Matthew, and he's like, look. And it's a profile on the app with, you know, a picture of me. And I was like, that's not possible. Like, that is a photo of me. But that's not me.
Starting point is 00:03:11 That is not a profile that I made. I am not on Grindr, you know? So he looks up at this guy. Like, I don't know how to explain this. I don't know who you're talking to. I'm very confused right now. Like, you need to leave. And, like, I got up and I went inside. And I remember I looked at my roommate, Michael, and I was like
Starting point is 00:03:28 someone just showed up looking for me from Grindr. Like how weird is that? Yeah. Little did he know. Because after that, another guy came. And then another. People started showing up. It just keeps happening. Sometimes I would be home or sometimes I'd be leaving the building and there'd be people outside. Each time a different man. You know, one to two a week. All saying the same thing. Like, we were talking on Grindr, let's have sex.
Starting point is 00:03:57 So he reports the profile to Grindr, got the automatic reply, we'll get back to you soon, or whatever the hell it said. But he doesn't actually hear back from anyone at the company. And meanwhile, people were showing up to my home in large volumes. And like, I'm living my life, leave me alone. Can I get a break?
Starting point is 00:04:15 And finally, one night, he's annoyed, he's fed up. I stood up and said, fuck this. And he decides he is going to get to the bottom of this. Yeah, so I downloaded Grindr. He makes a profile without a photo. And then logs on. And I saw the fake profile of myself in very close proximity. Grindr actually has this map feature where you can see where other people are who are on Grindr.
Starting point is 00:04:42 And this person who had Matthew's name and his photo is like right there. Right outside his apartment. So I walked outside and I remember looking on the street and he took off running and I went and chased after him. And while he's running, Matthew is looking at the Grindr app, scanning for the fake him. Because you can refresh it and it'll tell you how far that person is away. And this fake Matthew, he was like 20 feet away.
Starting point is 00:05:17 So I'm walking along Morningside Park, refresh the app. Then 15. He's walking and I was walking. Then 10. And I remember I looked down and I was like, he's five feet away. How is he five feet away? And I stood on the park bench
Starting point is 00:05:33 and I looked over the fence, and laying face down in the bushes was JC. His ex, JC. And I screamed, I fucking caught you. I caught you, I knew it was you. JC started yelling back at him, Matthew ran away, JC chased him, the cops got called, it was a mess. So an ex-lover made a fake profile for the purpose of terrorizing him? Yeah, Matthew and this guy, JC, had started dating not long after Matthew arrived in the city.
Starting point is 00:06:10 And we dated for, I want to say, eight to 10 months. Matthew saw some little red flags and ended things. And that's when these people started coming. I think once Matthew broke up with him, he was like, if you don't want to date me, then like screw you. I'll make your life a living hell. Anyhow, once he knew it was JC, I ended up getting an order of protection against him.
Starting point is 00:06:31 I told him, and so JC couldn't go anywhere near him in real life. But an order of protection doesn't really apply when JC is sending other people to his door. There were no ramifications for what he was doing. There wasn't anything the courts or the cops could do about it. So Matthew contacts Grindr again, and is like, this is the guy, shut him down, please. But still nothing, no acknowledgement. We reached out to Grindr for comment, didn't hear back. Anyhow, with Grindr doing nothing, JC went on the offensive.
Starting point is 00:07:05 He actually made multiple fake Matthew profiles. There were two to three existing on the platform. And that's when, you know, the gay zombies started coming for me. Do you call them gay zombies? Yeah, it's because it's like everyone's like mad. Just like, must get sex now. Yeah.
Starting point is 00:07:24 I would leave at six o'clock in the morning to walk my dog, there would be somebody outside waiting for me. I would come home at night, 11:30, 12 o'clock at night, there'd be somebody waiting for me. And literally every single day of my life. And it wasn't just a lot of awkward but harmless encounters, because these profiles said I was looking for rape fantasies.
Starting point is 00:07:43 Matthew would try to explain the situation to people calmly, but then the profiles were telling these individuals it was part of my turn-on, so to stay and then approach me again. Just diabolical. Yeah. And so again, he tries reporting the profiles. I had friends reporting profiles, family reporting profiles, sending emails to the company. Again, nothing. I was slammed against the wall. Oh my god. There was someone who broke into my building and physically assaulted my roommate.
Starting point is 00:08:13 Trying to get to me. He goes to the cops. Filed police reports, and they turned me away. They were like, we don't understand. I don't think anybody really could grasp what I was actually talking about. JC started making profiles that promised people crystal meth and said they should show up at the restaurant where Matthew worked.
Starting point is 00:08:29 I was taking an order at a fucking table and I remember this guy is tapping on my shoulder, saying my name, high out of his mind. And I'm looking at the people that are sitting down and they're looking back up at me and they're like, what is going on? And my eyes are just welling up with tears because I'm like, oh my god, and I'm like, how do you want your burger cooked? You know what I mean?
Starting point is 00:08:51 And this went on for months. Jesus. Did you hate hearing your name at that point in your life? Oh, I hated it. I hated everything about existing. I hated it all. Like, I remember sitting there saying to myself, like, I either want to fucking blow my brains out or throw myself off a building.
Starting point is 00:09:13 And then one day, Matthew's talking to his lawyer. My family court lawyer, she said, hey, there's this woman named Carrie Goldberg. She might be able to help you. And at that time, I was so just beat to a pulp. I just heard the word help. And I was like, yeah. So he takes the train to downtown Brooklyn. I sat in her office and she told me a little bit about herself.
Starting point is 00:09:37 I mean, as background, I started this law firm after I had been the target of a malicious and creative and relentless ex. Attorney Carrie Goldberg. One of the most malicious things that my ex was doing was blackmailing me with naked pictures and videos that he had, contacting everybody in my life. He's making false IRS reports against my family. Now, at that time, Carrie was already a lawyer herself, but she really only did family law stuff, wills, guardianships, things like that.
Starting point is 00:10:12 And I had so much difficulty during that process finding a lawyer who knew what to do, and like, was at the intersection of intimate partner violence and internet law and the First Amendment, and knew how to get an order of protection. I was really desperate. And so after this all ended, I became the lawyer I needed, a lawyer for people like Matthew. So she told me her story and I told her mine, and before I could finish, she said, I would like to represent you. I think we can slow this attack on you. And the way to do that, Carrie said,
Starting point is 00:10:49 was to go after Grindr. Take them to court and argue that this guy, JC, used Grindr essentially as a weapon, that Grindr knew all about it and did absolutely nothing to stop it. And so, they have the hearing. Carrie and her team file in, sit down, confident, looking at their little table. We're pretty badass litigators. And across the aisle is,
Starting point is 00:11:12 of course, Grindr's lawyers. And as the hearing begins, the Grindr guys, they stand up, and I'm imagining they do that thing that men do, where they tuck their tie inside their jacket, and they're like, Your Honor, we don't have to do anything because of Section 230. Section 230. Yeah, and the judge is like, you're right. We don't have to go any further. That's the end of this. Okay, so let's start there. Okay. Section 230 is a provision passed by Congress in 1996. That's not a typo, 1996, right?
Starting point is 00:12:12 Attorney and justice correspondent at The Nation magazine, Elie Mystal. So that's how old this law is. Now, it's worth pointing out that most of the rest of the law is no longer good law. It's been amended, it's been shaped, it's been overtaken by kind of newer, better laws that take into account what the internet has actually become, but Section 230 is the beating core that remains. And 230, it does one simple thing. It relieves internet companies of liability for illegal things posted on their websites. Meaning, he says, not only in Matthew's case, but in others like it, when someone lies about someone else or threatens them or even tries to do them some kind of harm using an
Starting point is 00:13:02 app or a website, these tech companies, they get off scot-free. That's exactly what's happening. Section 230 is fundamentally, at its core, a liability shield. A shield that no other industry gets, except the tech industry. In short, Section 230 makes the tech world untouchable. It's just not fair. So unlike Matthew, Carrie already knew about Section 230.
Starting point is 00:13:29 She knew the Grindr lawyers would use it against Matthew. And so she had actually been trying out this way to get around 230 by arguing that Grindr was a faulty product that harmed Matthew as a consumer. But the judge wasn't buying it. And with Matthew, appeal after appeal after appeal, the case just kept getting dismissed, each time because of Section 230. And so Section 230 is my nemesis.
Starting point is 00:13:55 She hates it. I can only talk about it once a day, because I get so aggravated that I then can't do my job. I think it can be totally decimated and thrown into the sun. And weirdly enough, this hatred Carrie feels... As you know, Google enjoys a special immunity from liability under Section 230 of the Communications Decency Act. Is shared by a lot of people. A lot of Americans have concerns.
Starting point is 00:14:29 Conservative lawmakers like Ted Cruz. Congress to get rid of special immunity for social media companies in this country. President Joe Biden called to have it removed. Section 230, we have to get rid of Section 230. Politicians. And so did former President Donald Trump. That is the thing about Section 230. It's kind of built this, like, king-sized mattress of strange bedfellows who are all teaming up and saying, we want it gone. It is literally just like a mushrooming
Starting point is 00:14:56 monster. But Matthew... I don't think they should get rid of it. ...is not one of those people. Because even though this law lets companies like Grindr completely ignore what happens on their platforms, without Section 230, we couldn't live the way we do today. It is the law that makes the internet possible.
Starting point is 00:15:21 And so now we're really getting into the heart of Section 230. That heart, and what our lives might look like without it, after the break. Simon, Rachael, Radiolab, and we are back, and we're gonna go backwards. All right. To a time not that long ago, when what the internet would be, what it would look like and feel like, was a very open question. A time when sort of anything seemed possible. It's the world of the split set. Sorry, what year are we in?
Starting point is 00:16:03 They're in the lab. Okay, so 1992, things are starting to happen. Back when getting on the internet required somebody else in your house to get off the phone. The internet has evolved from this thing that really only academics use. It's taking us five years of real hard work to develop a system like this. To something niche and nerdy communities are playing on in the form of chat rooms. To finally… Something that everyday people like you and me were using through these bulletin boards. And although it may have been primitive, you had access to information all around the world. And as amazing as that was, as more and more people were logging on to get world news,
Starting point is 00:17:11 or share recipes, or share their opinions about financial markets. Prodigy needs your attention. You have new mail. These bulletin boards, they began to get... You're done. He did. Did message. Don't make fun of you. Good bye. And so guys like Chuck, Chuck Epstein, moderator of the Prodigy Money Talk bulletin board, were brought in to turn down the temperature by removing posts that went too far.
Starting point is 00:17:41 So I just took down swear words, derogatory words, racial slurs, etc. And it's just you there, you're all by yourself. That's correct. I was the only one who had the software, the moderating software, and there were, oh, easily a couple thousand posts per day on the Money Talk bulletin board.
Starting point is 00:17:59 Stocks, bonds, real estate, equities. So, it was exciting. And so, Chuck, he managed to create this little neighborhood where people could connect and say what they wanted, but where he could also be a kindly curator, make sure that no one got out of line. Until... Well, one evening I was at my house, took my poodle out the front door for a walk. Lovely little fella.
Starting point is 00:18:23 He was a miniature French poodle, Bo, and we walked down the street about, you know, 40 paces. Bo does his business, Chuck stretches his legs. A man literally jumps out of the bushes. Oh my god. It was like from the spy movies. I didn't know what this guy was doing. And standing there under a street light, he says, like, Mr. Epstein, I said, yes,
Starting point is 00:18:48 and he hands me a piece of paper in an envelope. It was an envelope. And he says, uh, thank you, have a nice night. I thought you were going to say he said, I'm here to have sex with you. I've been sent. I am a gay zombie of yesteryear. This is a weird old story. Yeah. Anyhow, so Chuck turns around, walks home quickly. I went back in the house and opened the envelope. The first thing that he sees on this piece of paper, it says, Stratton Oakmont versus Prodigy in big letters at the very top,
Starting point is 00:19:20 And I was like, what the hell is this? Turns out that Stratton Oakmont was suing Chuck's employer, Prodigy, claiming that someone had used Chuck's bulletin board to smear their company, saying that their president was a thief involved in some scams, and Stratton Oakmont was a criminal organization. Basically attacked the reputation and the financial acumen and honesty and the ethics of Stratton Oakmont. And that these posts constituted defamation. In this $100 billion lawsuit. Now, as would be discovered years later, these posts were not defamatory.
Starting point is 00:19:57 In fact, Stratton Oakmont and their president were doing so many illegal things that they would one day inspire Leonardo DiCaprio. Was all this legal? Absolutely not. In the film The Wolf of Wall Street. We don't work for you, man. Yeah, my money taped your boobs. Technically, they work for me. Yeah, Jonah Hill's character was actually based on the guy who cried defamation. But at the time of the suit, nobody knew anything about that.
Starting point is 00:20:26 And so the lawsuit went ahead, and the thing went to trial. These wolves of Wall Street, they argued that because Prodigy employed people like Chuck, moderators who left posts up and took posts down, that they were responsible for every defamatory post they left up. And this judge agreed. The judge ruled that Prodigy was responsible.
Starting point is 00:20:53 Now, the irony here is that right around this time, there was another company offering access to the internet. Enter CompuServe. CompuServe. And CompuServe did not hold itself out to be a family-friendly bulletin board. They were just like Prodigy, but they had no moderators, no Chucks, didn't take anything down. All the swear words, defamation, racial slurs, all of it stayed there.
Starting point is 00:21:15 And when they went to trial in a very similar online defamation lawsuit, they won. And so, weirdly, in this situation, if Prodigy had never set out to be a family-friendly place, if they said whatever you want, have at it, they would not have lost this lawsuit. Well, that seems completely ass-backwards. Yes, yes. What the law was saying is that if your approach is anything goes, then you'll be scot-free. But if you attempt to have rules of the road, then we're going to make you responsible for every piece of content that's uploaded by every single one of your users. Former Republican representative from California, Chris Cox. And when he learned about this, he was like, this is not the way
Starting point is 00:22:07 we want the internet to be regulated. Because of the obvious consequences, the rate of increase in users of the internet was exponential, and it was clear that this new means of communication was gonna be of vital importance, either for good or for ill. And he worried that this precedent set by these two cases, like, reward the Wild West,
Starting point is 00:22:29 punish the family-friendly sites, that that could be disastrous. And one of the great things about being a member of Congress is that when you pick up the newspaper and you see something that needs to be fixed, you say, there ought to be a law. And then your next thought is... Who can do this for me? Yeah, I could do that. But you need a partner. So...
Starting point is 00:22:51 I made a beeline to my friend Ron. Ron Wyden, one of Oregon's United States senators. Democrat, little buds, they get ice cream together. Chocolate chip for me. I'm chocolate, although when I'm being very extravagant, I have one scoop of vanilla and one scoop of chocolate. That's living the life. And he says, Ron, like, I think it's really, really important that we do something about this, explains these two cases,
Starting point is 00:23:16 and how... You know, online platforms were offered the choice. You could either police your website and be liable for everything, even if something slipped through, or you could turn a blind eye and not police anything. And Chris and I said, maybe we can come up with something that really promotes the best internet. An internet where sites could take down what they wanted without getting in trouble.
Starting point is 00:23:40 And the point was to keep it really simple. So Chris and I went to the House dining room, sat by ourselves, and we put this together. A couple days later. We're done. It wasn't perfect by any means. And do either of you know it by heart? I'm sure you do, because you talk about this all the time. But could you just say it, just so we have it on the recording?
Starting point is 00:24:00 Yes, sure. What it says is that no provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider. In other words, these internet companies could control the things that got posted on their sites as they saw fit, without the threat of being sued. And so right now, we're going to take you over to the Library of Congress for this signing ceremony. Mr. Clinton, you're the same.
Starting point is 00:24:32 February 8, 1996. Today, our world is being remade yet again by an information revolution. In a wood-paneled hall, President Clinton signed these 26 words into law. This historic legislation recognizes that with freedom comes responsibility. This bill protects consumers against monopolies. It guarantees the diversity of voices our democracy depends upon. Perhaps most of all, it enhances the common good. Thank you very much.
Starting point is 00:25:01 I mean, just as one example, if it hadn't passed and sites remained liable for every little thing that we posted, you couldn't imagine a project like Wikipedia taking off as it has. Or the MeToo movement. Or that ice bucket challenge that raised millions of dollars for ALS research, or Black Lives Matter. They absolutely could not exist without Section 230.
Starting point is 00:25:34 I mean, Section 230 let websites moderate as best as they could without the threat of constantly being dragged to court. Which allowed space for this massive online wave that we're all still surfing today. But of course, waves can be dangerous. And now more than ever, it's starting to feel like we could use some more lifeguards. Because you know, these wonky little bulletin boards that sparked all of this, they evolved into comment sections, which evolved into social media platforms like Facebook and Twitter
Starting point is 00:26:24 and Instagram, and then into dating apps like Tinder and Grindr. And before we knew it, billions of people were on these things. And while these sites have enabled lots of good things to happen, they've also given us this whole new universe of ways to be cruel to one another. Cyberbullying, doxing, revenge porn, deepfake porn. And even though the platforms make some efforts to weed out the bad stuff, so much of it gets through. My picture is on it, my stuff, my name and everything.
Starting point is 00:27:01 Report the account, please. And when someone comes to them and says, please make it stop. Like Matthew, or the Grindr guy, or countless other people. This is what it looks like to see yourself naked, against your will, being spread all over the internet. This is what it looks like. They say it's not our problem. Section 230.
Starting point is 00:27:21 I really don't think I'll ever get these images down from the internet. And I just, I'm sorry to my husband and I'm sorry to my children. Again, Section 230, while critical to how the internet was made, critical to how it functions, is old. Once again, justice correspondent Elie Mystal. It contemplates a late-90s internet world that simply no longer exists. Yeah.
Starting point is 00:27:59 So yes, there's a sense of like, our laws should be updated to reflect how the internet works today, not how it worked in 1996. And so there is a coalition amongst hardcore conservatives and some progressives to do something about Section 230 and take it away. It seems not unlike 1996, when Section 230 passed. Like, there's this open question again of what is the sort of internet that we want.
Starting point is 00:28:30 However, the other side of it is also, you know, we're kind of backing into the actual point. So do you mind? I want to start the point differently, right? Yeah, do it. Because here's the thing. One of the reasons why we don't know what's going to happen with Section 230... sorry, I'm trying to think of the best way of saying it.
Starting point is 00:28:53 You're your own producer. I'm gonna get there. I'm gonna get there. Thank you, Elie. The bottom line is that we don't know what's going to happen to the internet if we take away Section 230. One way it could go is for everybody to go back to a wild, wild west scenario where there is no moderation anywhere at all, right?
Starting point is 00:29:19 However, the other way it could go would be to have extreme moderation. Nobody has open comment threads. Nobody has a forum where they can say whatever they want. Everything is either completely closed off or highly monitored by an AI, by the algorithm that is just, without pride or prejudice, just running around and smacking people based on keywords. Doesn't matter the context, right?
Starting point is 00:29:48 Which, you know, it would take out racial slurs, problematic stuff, but it also might weed out these kernels of ideas that led to the Arab Spring and Black Lives Matter and MeToo. So only the most kind of anodyne, Disney-fied, I like soup kind of stuff. Yeah. What happens?
Starting point is 00:30:05 Are those options like both equally likely if Section 230 were to go away? Well, are you conservative or are you liberal? Because what you think is more likely really tracks with your politics right now. Liberals, at least the ones who think Section 230 should be taken away, think these social media platforms will go full-on aggressive, stopping hateful comments.
Starting point is 00:30:35 However, conservatives like Josh Hawley really think that it's going to go the other way. That in a post-Section 230 world, because of the threat of liability, these companies, they would go in a wild west format and just let everything ride, so nobody gets in trouble. But the problem there, Elie says, is... You've gotta be able to turn the internet upside down and shake money out of it, right? Like, none of this happens if somebody can't make money off of it, right?
Starting point is 00:30:58 Meaning, in most cases... Sometimes I just wanna rent a car. Go. You know? I do know. And I think I can help you with that. Really?
Starting point is 00:31:11 Advertisers, yeah. I love words. Oh, yes. Love words. And what the advertisers want is for there to be moderation. Because they make more money when things are, for lack of a better word, nice. So it's highly likely that the advertisers simply will not stand for a wild, wild west era where, like, when you click on the page, all the comments are like, F you, dumb
Starting point is 00:31:39 n-word. And if that happens, you're basically calcifying the internet as we have it today. Like, these small companies, these startups, these homespun sites, they will not have the resources to moderate. If you put these moderation controls on them, the next Twitter, the next Facebook, the next TikTok, there will be no way for them to compete. So what we will have is basically just the Titans that we have today.
Starting point is 00:32:09 So we are stuck between, like, a rock, a hard place, and an effing dagger right in front of our face. Like, there's no, it feels like there's like no clear way to tackle 230 without then destroying the internet as we know it. It wouldn't be, it wouldn't be a complicated issue if it wasn't a complicated issue. Once more, Matthew Herrick from the beginning of this episode, whose life got literally destroyed by Section 230, but who still thinks we shouldn't get rid of it.
Starting point is 00:32:41 I'm so surprised that you don't want to just get rid of it altogether. I don't know, it's like a freaking shark came and bit your arm and you're like, well, the shark has done some good for the ocean, you know? Well, because I understand how complicated it is. I mean, I don't want to sound, you know. I mean, obviously I'm fucking pissed. But like, I'm launching a coalition with, you know, a non-profit organization to help survivors. I'm trying to like
Starting point is 00:33:09 seek out what I can utilize through that experience to create positive in the world. And I think that's all I can do. But I'd be bullshitting you if I said that I had the right answer. I just know all the wrong answers. And he's not alone. Like no one's quite sure how to fix this thing. So the decision for now just seems to be to just leave it. And the Supreme Court said so. So this past term, the Supreme Court heard two cases about Section 230.
Starting point is 00:34:05 Google v. Gonzalez and Twitter v. Taamneh. And during one of these trials, from the bench, Elena Kagan says... Why is it that the tech industry gets a pass? A little bit unclear. On the other hand, we're a court. We really don't know about these things. These are not like the nine greatest experts
Starting point is 00:34:28 on the internet. And boy, there is a lot of uncertainty. And they decided to leave it in place. You know, there is a reason why a law from 1996 is still the law today, and it's not because it works, but because it is the least bad option. This story was reported by Rachael Cusick and produced by Rachael and myself, with mixing from Arianne Wack. Special thanks this week to James Grimmelmann, Eric Goldman, Naomi Leeds, and an extra, extra big thank you to Jeff Kosseff. Alright, that's about it for here.
Starting point is 00:35:39 I'm Simon Adler, this is Radiolab. Thanks for listening. Hi, this is Mr. Feeleer's fifth grade class calling in from Monona, Wisconsin. Radiolab was created by Jad Abumrad and is edited by Soren Wheeler. Lulu Miller and Latif Nasser are our co-hosts. Dylan Keefe is our director of sound design. Our staff includes Simon Adler, Jeremy Bloom, Becca Bressler, Rachael Cusick, Ekedi Fausther-Keeys, W. Harry Fortuna, David Gebel, Maria Paz Gutiérrez, Sindhu Gnanasambandan, Matt Kielty, Annie McEwen, Alex Neason, Sarah Qari, Anna Rascouët-Paz, Sarah Sandbach, Arianne Wack, Pat Walters, and Molly Webster. With help from Sachi Kitajima Mulkey. Our fact-checkers are Diane Kelly, Emily Krieger, and Natalie Middleton. Hi, this is Jeremiah Marba, and I'm calling from San Francisco, California.
Starting point is 00:36:52 Leadership support for Radiolab's science programming is provided by the Gordon and Betty Moore Foundation, Science Sandbox, a Simons Foundation Initiative, and the John Templeton Foundation. Foundational support for Radiolab was provided by the Alfred P. Sloan Foundation.
