It Could Happen Here - EARN IT & the Death of Online Privacy

Episode Date: February 28, 2022

Garrison and Mia discuss the EARN IT Act and the broad, sweeping negative effects it would have on internet freedom, privacy, and encryption if passed. https://linktr.ee/StopEarnIT

Learn more about ...your ad-choices at https://www.iheartpodcastnetwork.com. See omnystudio.com/listener for privacy information.

Transcript
You should probably keep your lights on for Nocturnal: Tales from the Shadows. Join me, Danny Trejo, and step into the flames of fright, an anthology podcast of modern-day horror stories inspired by the most terrifying legends and lore of Latin America. Listen to Nocturnal on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.

Alright, I will find some clever way to introduce us.
Hi, Daniel. Huh, you know what? I might just go with that. "I will find some clever way to introduce us," and that is the introduction now, because I said it. Hello! Welcome to It Could Happen Here. Today we are going to be talking about internet privacy and some new bills that could undermine it. Today I have with me Christopher. Hello, Christopher.

Hello, I am here. I don't know if "excited" is the right word, but no one should ever be; people are rarely excited to come on the show. Yeah, no, this is more of a mild dread about the future of the internet.

I try, because things don't need to always be horrible and grim, even when you're talking about things that aren't great. But yeah, today we'll be talking about some interesting things which, as per the title of this episode, will probably be related to the proposed EARN IT Act. I'll explain what it is and the different implications it could have for how everyone uses the internet, and how it affects a few specific types of people in particular. As part of this whole thing, though, we're going to start off by talking about something a little bit different and then segue into the EARN IT Act.
So last year, Apple, the company, announced a controversial plan to install photo-scanning software onto every device. Apple has long been seen as a pro-privacy company; in the past, they have refused FBI demands to help investigators bypass locked phones. So this plan to create a backdoor into the iPhone's storage system to scan for photos was kind of a big deal coming from Apple, because they were, at least in the past, known as the company that, out of all of them, was generally the better one if you're dealing with sensitive matters. That has become less the case in the past few years, but it definitely used to be true. So when this idea was announced, a decently sized global coalition formed to push back on it, and the company did pause the plan.

Now, this came at a time when a lot of companies were also pushing back against not-safe-for-work material, specifically around the transaction of money and banks. This was around the time OnlyFans was flip-flopping on whether it would actually allow not-safe-for-work material, as part of a growing trend of worrying about what's now termed child sexual abuse material (more traditionally called child pornography). It's part of this overall extra focus tech companies have: they're worried that if someone who is underage is doing that, or someone underage is being exploited, it could financially hurt the company. So lots of companies have been trying to prevent that legal and financial issue from happening. Of course, all this really ends up doing is negatively affecting sex workers.
But that's kind of a topic for a different episode, because we're talking about the EARN IT Act specifically, not OnlyFans. This was Apple's plan to scan all these photos to make sure there were no naked photos of children. Now, there are a whole bunch of other privacy issues around that, because obviously teens do take nudes and send them to each other, and there is really no stopping that. So the idea that all these photos would get scanned and seen, and that parents would be automatically alerted if something was found on the phone, raises a whole bunch of other issues for me. That's a whole other level of fucked up, especially for queer kids. But again, that's mostly a whole other discussion that I'm not going to get into right now, because I want to focus this more on the EARN IT Act.

So this plan was paused. But now that may not actually matter, because Congress kind of wants to force Apple's hand, along with essentially every other company that allows users to store or share messages or really any content. Some senators have a bill that would essentially mandate photo scanning, using specific scanning technology approved by the government. So while Apple's plan would have put privacy and security at risk for all of its users, the EARN IT Act compromises security and free speech for basically everyone who uses the internet.
The bill would create serious legal risks for businesses that host content such as messages or photos stored in the cloud, online backups, and potentially any kind of cloud hosting, such as Amazon Web Services, which means basically most of the internet. All of these services and companies would be at serious legal risk unless they used government-approved scanning tools.

A version of this bill was first introduced two years ago, sponsored by Senator Lindsey Graham, a Republican from South Carolina, and Senator Richard Blumenthal, a Democrat from Connecticut. Like a lot of these other efforts, it is allegedly aimed at tackling so-called child sexual abuse material online. Which is a problem: kids definitely do get groomed and exploited, and exploitative photos of children do get shared online. That actually is a real issue. But a lot of the ways these tools get implemented don't actually address that issue, and they don't necessarily deal with the people who exploit kids either. That's just what they wrap this idea in.

The original bill introduced two years ago threatened encryption and privacy features in ways that would have actually put Americans' privacy, particularly the privacy of children, at risk. It also gutted Section 230 in ways that caused over fifty civil rights groups to pen a letter describing the potential consequences: censorship, clamping down on free speech, and basically the destruction of encryption. So when that legislation failed to advance two years ago, digital liberty advocates, sex workers, and civil rights organizations all breathed a sigh of relief. But this past month, as I record this in February 2022, a group of lawmakers, again led by Senator Richard Blumenthal and Senator Lindsey Graham, reintroduced a slightly modified version of the EARN IT Act, and on the 10th of February, the Senate Judiciary Committee voted to advance the dangerous bill.
So yeah, it is chugging along a bit further than it did last time. The EARN IT Act aims to tackle the horrific criminal activity around child sexual abuse material by making Section 230 protections contingent on the prevention of and response to such material online. Section 230 shields online services, like commonly used social media, from liability for most user-generated content. Under EARN IT, Section 230 would be amended to enable civil claims and state criminal prosecutions against platforms related to child abuse material online. Already, this can happen federally to some degree, depending on how the company responds, but this would open up a whole new wave of civil claims and state claims that could be filed against companies if material like this is found hosted on their site. That would even include, say, someone underage operating a not-safe-for-work Twitter account they probably should not be operating; that alone could put the company in trouble under state or civil claims. As a result, online services could be subject to endless litigation under fifty different state legal systems over child sexual abuse material found online.

The bill's proponents claim this isn't a problem for any service as long as it is scanning files and reporting child sexual abuse material to law enforcement. Internet companies are already required to report suspected material if they come across it.
And they do report material on a massive scale, which often comes with a lot of mistakes. Facebook is often held up as a positive example by lawmakers and law enforcement for how much material it reports. But while their new scanning techniques have produced many millions of reports, most of them are inaccurate; most of them actually aren't of minors. None of these scanning tools are very good. In a lot of cases people well into their thirties get flagged, and even non-humans get flagged, like pictures of fruit.

Yeah, and I think if you've never had to work with a machine learning algorithm before, it's difficult to understand how unbelievably bad these things are. The incomprehensible horror of trying to get a machine learning algorithm to do the thing you want it to do, and not do the things you don't want it to do: to be able to tell the difference between, say, a particularly smooth and round peach and child sexual abuse material. You, a human being, can do this; the machine cannot. It is horrifically inaccurate, and you have to hack all kinds of stuff together just to get it to work. It's a fiasco.

A good example of this that I've heard before, and I'm probably going to butcher the explanation: you can take a few photos of a wolf, maybe even just three, and say, "Here, these are photos of wolves. Now, in these other photos, find which ones are wolves." Some of them have wolves, some of them don't. And it finds one picture and says, "Based on the three photos you've given me, this photo is a wolf." Except the photo is not a wolf; it's a tree. And when you ask why it decided that was a wolf, the answer is, well, look, all of the backgrounds are the same. The algorithm doesn't have what humans have; it's just trying to learn and replicate patterns, and it is never perfect.
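For anyone who wants to see that failure mode in code, here is a tiny, purely illustrative Python sketch using synthetic data and scikit-learn; it has nothing to do with any real scanning product. The "images" are just arrays in which only the background brightness differs between the two classes, so the background is the only thing the model can learn.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def fake_image(bright_background: bool) -> np.ndarray:
    """A flattened 8x8 'photo': the small subject patch looks the same in every
    image; only the background brightness differs between the two classes."""
    img = (0.9 if bright_background else 0.2) + 0.05 * rng.standard_normal(64)
    img[27:29] = 0.5  # the "subject" pixels, identical for wolves and non-wolves
    return img

# Training data: every "wolf" photo happens to have a bright, snowy background.
X = np.array([fake_image(True) for _ in range(50)] + [fake_image(False) for _ in range(50)])
y = np.array([1] * 50 + [0] * 50)  # 1 = "wolf", 0 = "not a wolf"

clf = LogisticRegression().fit(X, y)

# A tree on a snowy hill: no wolf in it, but a bright background.
tree_in_snow = fake_image(True).reshape(1, -1)
print(clf.predict(tree_in_snow))        # -> [1], confidently labeled "wolf"
print(clf.predict_proba(tree_in_snow))  # high probability, for the wrong reason
```

Because every training "wolf" happened to sit on a bright background, a bright photo of anything gets confidently labeled a wolf. That is the same failure mode as a smooth, round peach getting flagged.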
So the big thing people often overlook, specifically with Facebook's scanning tool and the millions of reports it makes, is that federal law enforcement will frequently use that massive number of reports to suggest there has been a giant recent uptick in child sexual abuse material. But that's not because there actually has been; it's because the scanning some companies are doing is just so bad, so inaccurate, that it flags an enormous number of things.

So in practice, the new EARN IT Act would just pave the way for a massive new surveillance system, run by private companies, that would roll back some of the most important privacy and security features in technology used by people around the globe. The idea is to compel private companies to scan every message sent online and report violations to law enforcement. And it may not stop there: the EARN IT Act could ensure that anything hosted online, including backups, websites, cloud photos, and more, gets scanned.

Now, of course, you can say there is no true privacy online anyway, right? The NSA sees everything. And that's basically true. But local police departments and the FBI do not have constant access to what the NSA has; legally, it does actually take some time for that to happen. If all these private companies do the scanning for them, and encryption actually gets broken, the FBI and local law enforcement have a much easier time accessing what we do on the internet.
Because yes, the NSA kind of always sees everything, but this is quite different in terms of how accessible that information is.

And I think, to go back to one of the classic encryption arguments: once you put a backdoor into encryption, once you have your encryption system but there's now a way in, because, oh well, we need to be able to decrypt this to check whether there's child pornography on it, then once that backdoor exists, anyone who finds it can use it for anything they want.

And it's not even just that. We'll get into some other things around encryption, but yeah, continue.

Yeah. And this is something the people who are only thinking about this in terms of child pornography don't think about: these kinds of backdoors, other people can find them. Once you've put a backdoor in all of your encryption, you are going to get people killed. You're going to get people killed because there are people doing things under governments, people in Myanmar, people in Egypt, people in Syria, and these private companies are going to sell the backdoors to those regimes, and the regimes are going to use them to hunt down, torture, and kill people.

So yeah, there are a lot of problems with it, especially how it addresses encryption.
Because the bill does try to include some encryption protections, but the way it goes about them is not adequate, and in some ways it even fosters its own negation, if you read the entire bill. I'll get more into encryption in a second, because there are other technical issues with the way this bill is designed and how it would be enacted.

[Ad break]

There is this sort of benefit to having illegal material that exploits minors primarily hosted on big tech platforms, because these platforms are used so much and are mostly unrestricted.
So it makes catching this stuff and reporting it actually much easier. If it's hosted on these mainstream services, seeing it and reporting it is less difficult. So not only would this bill make tech companies more likely to just ban all not-safe-for-work material in general, which would be horrible for sex workers and a bad precedent, because if companies are forced to scan and are going to be filing so many reports, a lot of them will just say no nude photos at all, completely gone, out of fear of legal repercussions and because of how much over-scanning there would be. It would also force people who distribute child porn onto sketchier sites, sites that might just refuse to scan content at all because they're temporary hosting. The bill could scare these bad people off of mainstream platforms and make them voluntarily migrate to more niche and hard-to-find corners of the internet, making illegal content harder to catch and take down. There will always be weird temporary sites to host this type of thing; these people will find a way, and it is always going to be a problem. So in a way, it's better to have these things on mainstream platforms, because reporting them and taking them down can be much easier.

It's like when people advocate for platforms like Telegram to shut down all fascist channels. The thing is, there are a lot of benefits to having those chat rooms on Telegram, because it makes them really easy to monitor and really easy to infiltrate. There are a lot worse places for fascists to organize; if they're doing it on Telegram, it's actually really easy to watch. So it's this weird give and take in terms of where these things happen, because they are going to happen somewhere.
So now I want to talk about how specifically this bill threatens online encryption services. The bill would strip critical legal protections from websites and apps, specifically Section 230. If passed, it would empower many different levels of government to make sweeping new internet regulations. Individual states would be able to pass laws holding private companies liable, as long as they somehow relate their new rules to child abuse material; they would be able to impose a whole bunch of new internet regulations if they can sift them through this lens. The goal is to get states to pass laws that punish companies when they deploy end-to-end encryption or offer other encryption services. That includes messaging systems like WhatsApp, Signal, and iMessage, as well as web hosting like Amazon Web Services.

EARN IT aims to spread the use of tools that scan all online content directly against law enforcement databases. In a myths-and-facts document distributed by the bill's proponents, it even names a government-approved software program they could mandate: PhotoDNA, a program Microsoft made that reports directly to law enforcement databases.
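To give a rough sense of what "scanning content against a database" means in practice, here is a minimal Python sketch built on the classic "average hash" technique. This is illustrative only: PhotoDNA's actual algorithm is proprietary and far more sophisticated, and the hash database below is an empty placeholder. But hashing an image and matching it by similarity against a list of known hashes is the general shape of these systems, and it is also where false positives like the flagged fruit photos come from.

```python
from PIL import Image  # pip install pillow

def average_hash(path: str, size: int = 8) -> int:
    """Shrink the image to size x size grayscale, then set one bit per pixel
    depending on whether that pixel is brighter than the image's mean."""
    pixels = list(Image.open(path).convert("L").resize((size, size)).getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of bits that differ between two hashes."""
    return bin(a ^ b).count("1")

# Hypothetical, empty placeholder for a database of hashes of known images.
known_hashes: set[int] = set()

def should_report(path: str, threshold: int = 10) -> bool:
    """Flag a 'match' if the hash is within `threshold` bits of any entry.
    Matching on similarity rather than exact bytes is what catches re-encoded
    copies, and also what lets visually unrelated images collide."""
    h = average_hash(path)
    return any(hamming(h, k) <= threshold for k in known_hashes)
```

Anything within a few bits of a known hash gets reported, which is how a re-encoded copy of a known image gets caught, and also how an unrelated image can trip the filter.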
So EARN IT doesn't specifically attack encryption per se, but that's only because it doesn't need to, because of the way the bill is designed. How it approaches encryption is actually a little more insidious: it allows the fact that encryption exists on the platform at all to be used as evidence against a company in order to find it liable for hosting child sexual abuse material. They can use the mere existence of encryption as evidence, which is wild.

This is a thing the CCP does a lot. They'll use the fact that someone is using a VPN, for example, as evidence that they're a terrorist. This happens constantly, and it makes a lot of encryption tools incredibly unsafe, because you show up with your phone, you have Signal on it, and the CCP says, well, this is proof, we're just going to lock you up and throw away the key.

Yeah, and it's extremely bad. So the result is that these laws would make companies liable if they don't scan and report user content for child sexual abuse material, which they can't do unless they break encryption. Big companies like Apple are going to protect themselves. So EARN IT coerces these sites, platforms, and services into doing this sort of scanning, and not just on messages, but on all online content, encrypted or not. Companies that handle online content would have to weigh the benefit of letting their users securely encrypt their data against the legal risk of doing so, and encryption becomes much harder to justify when it puts the company's bottom line at risk.

And end-to-end encryption isn't just for messages. It's not just Signal. It secures most of the internet, or at least a lot of it, keeping what you do online private and safe, at least allegedly. You can't have a secure internet where all of the content is also being screened, because you can't have encryption alongside mass scanning requirements. So this isn't just an attack on encryption; it's an attack on the fundamental security of the internet.
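As a small illustration of why mass scanning and end-to-end encryption don't coexist, here is a minimal sketch using the PyNaCl library. It is a toy with made-up participants, not any real messenger's protocol, but it shows the core property being discussed: the service in the middle only ever holds ciphertext, so there is nothing readable for it to scan unless the encryption is weakened or the scanning happens on the device before encryption, which is the backdoor problem described above.

```python
from nacl.public import PrivateKey, Box  # pip install pynacl

# Each user generates a keypair on their own device; private keys never leave it.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts directly to Bob's public key before anything touches the network.
ciphertext = Box(alice_key, bob_key.public_key).encrypt(b"meet at 6")

# The service in the middle only ever stores or forwards this opaque blob:
# there is no plaintext on the server for any scanner to look at.
what_the_server_sees = bytes(ciphertext)

# Only Bob, holding his private key, can decrypt it.
assert Box(bob_key, alice_key.public_key).decrypt(ciphertext) == b"meet at 6"
```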
And there are lots of extremely technical reasons why this is such a bad thing. You think malware is bad now? You think people are stealing apes now? Look at the things that will happen if you have to deal with an internet that's unencrypted. It's an absolute horror show. This is bad enough that I do not have the words to express how catastrophic it would be, because it goes at the fundamental structure of the internet.

It really is not just about messages. The EARN IT myths-and-facts document also specifically attacks Amazon for not scanning enough of its content. And since Amazon is the home of Amazon Web Services, which hosts a huge number of websites, that implies the bill's aim is to ensure that anything hosted online also gets scanned. Everything. Online service providers, even the smallest ones, would be compelled to scan user content with government-approved software like PhotoDNA. And if EARN IT's supporters succeed in getting large platforms like Cloudflare and Amazon Web Services to scan, they may not even need to compel smaller websites, because the government will already have access to the data through the cloud platforms. As long as they get the big hosting platforms, they won't even need to bother with a lot of the smaller sites.
I think there's another thing that's probably worth mentioning here. We don't really have the time to fully go into it in this episode, but a lot of this sort of stuff is being pushed by these incredibly right-wing, evangelical, anti-porn groups.

Yes.

And their goal is just to eliminate from the internet anything that is not part of their sort of fundamentalist Christianity. That's particularly relevant here, because those people are going to find ways to bring lawsuits against these companies specifically so they can do this. What you've done is you've just handed them a gun. Porn sites will all be taken down, because they'll be facing endless lawsuits.

Yeah. OnlyFans will no longer host ethical porn. None of this will survive; all of it will be taken down. This will attack sex workers to such an absurd degree. It'll make a lot of, if not most, online sex work just impossible, because there will be so many lawsuits constantly happening that companies will simply ban it outright, since they can't risk dealing with all those legal fees.
Yeah. And even the fact that state prosecutors and private attorneys would be able to drag an online service provider into court over accusations that its users committed crimes, and then use the fact that the service chose to offer encryption at all as evidence against it, the fact that that strategy is specifically allowed under EARN IT, makes the possibilities for this kind of thing just endless. Imagine how easily they'll be able to go after Signal. It's wild. If they can find one instance of an abuser using Signal, or having used Signal, then basically all of Signal's encryption will be severely threatened because of the legal fines that could be forced onto the company. And it applies to any service operating in the States.

It's really frustrating, because people, including senators, who are pro-EARN IT say the new tools are necessary to tackle the issue of online child abuse and the distribution of illegal material online. But obviously, possessing, viewing, or distributing child pornography or child sexual abuse material is already written into law as an extremely serious crime.

It's about the most illegal thing you can do.

Yes, and there's a broad framework of existing laws seeking to eradicate it. Companies can already get in federal trouble if material is found and they continue to host it, or if their stated purpose is to host it. That's some of the most trouble you can get into, at least on the books.
I say "on the books" because you can look at how many cops are involved with this type of thing as evidence that it may not always get acted on. There was a horrible story recently, and this is going to be quite graphic, so skip ahead a minute or two if you don't want to hear it, of a teacher who fed students food containing her husband's semen. Her husband was a cop, the leader of a SWAT team, and he had raped multiple children and had pictures of children, and both of them were doing this together. That's the leader of a SWAT team. If you look at the people often doing this type of stuff, it's cops a lot of the time. Cops rape so many of the kids they arrest and detain that it is a shock. You can Google this every week and you'll find new reports of it. It is horrific.

And, you know, online service providers that have actual knowledge of an apparent or imminent violation of current laws around child sexual abuse material are required to report it, or they will face legal trouble. You can kill people and get in less trouble with the law than you will if you intentionally do this stuff. There are scenarios where you can kill people and not get in trouble with the law; there is no scenario where you intentionally do this and don't, unless you're a cop with the protection of other cops who won't rat you out, or you're very, very rich. Unless you have extra legal protection, you are fucking going to vanish.
So yeah, we already have a lot of law to deal with this, and the methods proposed by EARN IT would not only chip away at the last semblance of privacy online, they would arguably make actually combating real instances of online child abuse a lot more difficult. It would pressure distributors and abusers into harder-to-find corners of the internet that don't fall under the big tech companies. Plus, the massive increase in content scanning would produce so many false flags that it would clog up any effort to find actual material, because so much stuff is going to get flagged. You're going to get a wave of so many images that you have to sort through and figure out whether the people in them actually are underage, because a lot of people who are 30 can also look underage sometimes, with the right lighting and effects. It's going to be such a task. And we can already see this in effect with the new scanning techniques used by Facebook, which have produced millions of reports to law enforcement, most of them inaccurate. And of course, federal law enforcement uses this massive number of reports produced by low-quality scanning software to suggest there's a huge uptick in these images. Thus, armed with misleading statistics, the same law enforcement groups make new demands to break encryption or, with EARN IT, to hold companies liable if they don't scan user content.

[Ad break]

Those kinds of algorithms, right? This is the same stuff as, you know how there are those trending topics on Twitter, and they'll show you a tweet, and the tweet will be someone talking about a Subway sandwich, and it'll show up under "trains," because it's the subway? Exactly. Those are the algorithms that they want to fucking run the entire internet through.

I have seen some very erotic bell peppers.
These things aren't going to be good. And independent child protection experts are not asking for systems that read everyone's private messages. Rather, they recognize that children, particularly children who might be abused or exploited, actually need encrypted and private messaging just as much as, if not more than, the rest of us. No one, including the most vulnerable among us, can have privacy or security online without strong encryption.

And the EARN IT Act doesn't really just target big tech. What it does is target every individual internet user, treating all of us as potential criminals who deserve to have every single message, photograph, or document scanned and compared against a government database, reported directly to law enforcement. And since direct government surveillance would be blatantly unconstitutional and provoke public outrage, EARN IT uses these tech companies, from the largest ones to the smallest ones, as its tools to bypass that constitutional barrier. Because if you hit the tech companies where it hurts, they will not allow this type of stuff at all. And you cannot deny that this is also just part of a larger push to remove porn, and any not-safe-for-work material, from being hosted online.

So the strategy is to get private companies to do the dirty work of mass surveillance. It's the same tactic governments tried to use this year when they tried to convince Apple to subvert its own encryption and scan all of its users' photos. It's the same strategy UK law enforcement is using to convince the British public to give up their privacy, having spent public money on a laughable publicity campaign that demonizes companies that use encryption.
So that's really how it's operating. I do want to shout out the EFF, the Electronic Frontier Foundation, for providing a lot of the research I used when compiling this episode. Thank you, EFF; you often do good work. They focus a lot on internet privacy issues.

I also want to point people to a Linktree: it's linktr.ee/StopEarnIT. You can find different ways to help there. If you're the type of person who enjoys calling representatives, it has links for that kind of thing. It has links to send automated messages to your representatives asking them to vote no on the EARN IT Act. It has stuff if you're the type of person who enjoys signing petitions. It has more info on what EARN IT is and what it does, and a whole bunch of other stuff around organizing to help stop this bill. There are Discord channels where people are organizing against it, with links there, and info on actions you can take. So if you're interested, look there for the different ways you can maybe contribute.
No single person can make an impact alone, but enough people can. So again, that's linktr.ee/StopEarnIT. And another shout-out to the EFF.

Yeah, I want to say two closing things before we close this out. One: if you think that once you're handing the entire contents of the internet over to the government to run through scanning algorithms, the only thing they're ever going to scan for is child pornography, I have an NFT to sell you. It is a picture of a bridge. Once you buy this NFT of the bridge, you will own the Brooklyn Bridge. Contact me for more details.

The second thing is that when we talk about anti-porn stuff, when we talk about how companies, for whatever reason, and this is true even of companies just trying to comply with something like the App Store, whenever you get things that target not-safe-for-work content, they inevitably, without fail, target queer content. Queer content that has literally nothing at all to do with sexuality. Because accusing queer people of being child predators has always been the attack line against queer people. Queer people are always on the front line of all of this stuff.

Yeah, they will always be the first people impacted, the first people demonized, even if it's not not-safe-for-work material, even if it has nothing to do with it. It will still always be impacted more than basically anyone else. And we've been seeing this on YouTube constantly: lots of people who just make trans content, queer channels, are always being banned or demonetized, marked as adult content. It's horrifying.

And if you want an internet that not only has sex on it, but has queer people on it, expressing themselves in any way that isn't literally just being a straight person, if that's a thing you think is valuable, and if you think it's important for queer people to be able to express themselves for their own health and safety, you have to oppose this.
Yes, absolutely. I guess one final thing I'll add, because I know someone will probably message me about it. There is a Slate opinion piece arguing that this bill would actually let child abusers walk free, because the bill essentially compels companies to search, which could make the evidence collected to prosecute abusers invalid in court. I do not agree with this take. I don't think that's how it would work out at all, because you can make the same argument about political organizing and a lot of other cases, and it never works out that way, because the government does not care about that sort of thing. That's not how it works. Again, things get violated in theory all the time; they illegally seized Ted Kaczynski's evidence, and no, it didn't matter. That's not going to matter here either, because otherwise the bill would be framed as a good thing, since none of the evidence gathered would ever be admissible in court, and they would never design a bill like that. That's not the case. I disagree with this take, so do not send me that article saying, actually, this is what's going to happen, because I do not believe it. It assumes the government operates coherently, and no, the government just does not care. Again, the First Amendment is superseded by traffic law. No, this isn't secretly going to let abusers go free; this is not secretly a good thing because it'll make all evidence inadmissible in court. Bullshit.

Anyway, I'll give a final shout-out to the Linktree: linktr.ee/StopEarnIT. That's L-I-N-K-T-R dot E-E slash Stop EARN IT, if you're the type of person who likes doing those types of things. It also has links to Discord channels for other types of organizing beyond petitions and calling senators and sending messages and so on. Anyway, that's the episode. Thank you for listening.
I just thought this was important enough, and I have not seen enough people talking about the EARN IT Act and the way it seriously threatens digital privacy. Because it was approved by the Senate Judiciary Committee to be pushed forward, it is actually chugging along through the slow legislative process; it's gotten further than it got in 2020. So I thought it was worth talking about: the privacy issues, how it affects queer people, how it affects sex workers, all of that. And I also want to say, it's very easy to feel hopeless with this kind of stuff, but we've beaten legislation like this before. One of my formative childhood experiences was when we beat SOPA and PIPA.

Yep, we can beat them. It takes a lot of mobilization, but we can beat them. I know we can beat this, because we've beaten things like it before.

Agreed. All right, that does it for us today. If you want to find us on a currently-more-secure-than-it-could-be internet, you can follow us on Twitter at CoolZoneMedia and HappenHerePod, and I think apparently Instagram too. So that's cool if you're an Instagram person; good for you, because Twitter is bad. You can find me on Twitter at HungryBowTie.

Yeah, and you can find me at GetMeCHR3.

You can indeed. That does it for us.

Encryption!

Encryption!

...out on the iHeartRadio app, Apple Podcasts, or wherever you listen to podcasts. You can find sources for It Could Happen Here, updated monthly, at coolzonemedia.com/sources. Thanks for listening.
