The Changelog: Software Development, Open Source - How we got here (Interview)

Episode Date: September 23, 2016

Cory is a science fiction author, activist, journalist, co-editor of Boing Boing, and the author of many books. We talked to Cory about open source, the open web, internet freedom, his involvement with the EFF, where he began his career, the details he'll be covering in his keynote at OSCON, and his thoughts on open source today and where developers should be focusing their efforts.

Transcript
Starting point is 00:00:00 I'm Cory Doctorow, and you're listening to The Changelog. Welcome back, everyone. This is The Changelog, and I'm your host, Adam Stacoviak. This is episode 221, and today, Jerod and I are talking to Cory Doctorow, a science fiction author, activist, journalist, co-editor of Boing Boing, and author of many books. We're producing this show in partnership with O'Reilly Media. We'll be at OSCON next month in London. Use the code CHANGELOG30 if you want to get 30% off your registration, and head to oscon.com slash UK to learn more and register.
Starting point is 00:00:40 We talked to Cory today about his involvement with the EFF and where he began his career. He shared some details he'll be covering in his keynote at OSCON about open source licenses and the potentially dark side of open source software if we don't do it right. We have three sponsors today: Code School, Toptal, and Linode, our cloud server of choice. Our first sponsor of the show is our friends at Code School. If you want to learn something new, a proven method is to learn by doing, and that's exactly the way Code School works. You learn to program by doing, with hands-on courses. Code School's courses are organized into paths based on technologies like HTML and CSS, JavaScript, hot topics like React and Angular, Ruby, Python, .NET, iOS, Git, databases, and
Starting point is 00:01:28 even electives that take you off the beaten path. Let's see you want to learn React. You can start level one of CodeSchool's React course, which begins with a quick video lesson on React components. After the video, you get hands-on practice building with components using in-browser coding challenges. There's no hassle, no setup, just learning. There's a path for everyone at CodeSchool. Head to codeschool.com to get started
Starting point is 00:01:50 and learn by doing. And now onto the show. All right, we're back. We've got a fun show. And Jerod, this is another show we're doing in partnership with O'Reilly as part of OSCON London. Big show today. The keynote speaker is here, Cory Doctorow. He doesn't need much of an introduction, but if we were giving him one, Jerod, what would it be? Well, many people know him as the science fiction author. He's an activist with the EFF, journalist, many things, blogger. Cory, how about, first of all, welcome to the show, and then maybe give how you like to introduce yourself to people. You know, I usually say I'm a science fiction writer and activist, and then if pressed, I say a few things more. When I lived in the UK before I became a citizen,
Starting point is 00:02:35 it was always a problem because when you'd land there, if you're not a citizen, you get these landing cards, and they've got this like three centimeter long blank in which you're supposed to write occupation. And I always wanted to say, you know, see attached and then like have my Wikipedia entry. There you go. Well, we'll just say see attached and we'll, we'll include your Wikipedia entry in our show notes. Surely our audience is probably well-versed with you and your work over the years, especially as co-editor of Boing Boing and the many books you wrote. In fact, Adam, I believe you first found out, Corey, about a book that he wrote.
Starting point is 00:03:12 I'm glad you brought that up because, Corey, I don't know if you know this, but I'm sure that you get this often is that people know you by what you've written before, right? I mean, that's one of the ways. Yeah. And for me, it goes back to an early thing you've done, which was a commission you'd actually written, a science fiction story about Google. Oh, yeah. The day they became evil. And so I found, Googled, read it end to end, which was a short read
Starting point is 00:03:34 anyways, but it changed my life. And that was like the earmark of my life of knowing Cory Doctorow. So that was a cool moment for me. You know, there's a super cool, going back to me you know there's a super cool going back to audiobooks there's a super cool reading of that that will wheaton did uh in my short story collection with a little help that's a free mp3 download or pay what you like mp3 download huh we'll definitely share that because i like reading it but if uh if it's read like audiobooks are read not just and here's the story it's actually got some reading behind it. I love it.
Starting point is 00:04:06 Will Wheaton would definitely do it. Will is a killer reader. So for those who aren't familiar with that story, though, it's the day Google became evil. You wrote this short story. I guess I learned today as part of, you know, the research of doing this call with you, that it was actually a contract writing for you.
Starting point is 00:04:22 So somebody commissioned you to write that for them. But it was basically about, you know, Google becoming evil, INS, somebody goes out of country, gets locked out, passport this and that, you know, your story. But to the listeners, can I give a quick, brief synopsis of that story just so they know? Sure. Well, you know, I am always reluctant to think of science fiction as a prescient literature, particularly. But the one thing that that book did or that story did that I think really paid off is it's about it's about Google making a compromise on its metadata. And so, you know, Google's always had this position that they're not going to let Uncle Sam spy on its users. But they say, I'll tell you what, U.S. Customs Authority, we will show you the same ads that this user sees when they look at their email. And through that, you can make up your
Starting point is 00:05:11 mind about whether or not you trust them when they enter the country. And we've actually had proposals like that since. I mean, obviously, the thing we learned from Snowden is that, you know, their metadata is considered fair game and has been spied on at great length and in great depth, but also that since then, the U.S. Customs and Integration Service has proposed or had a notice of proposed rulemaking to see whether or not they should be allowed to tell people who enter the country that they're required to give their social media handles and then allow them to do data mining of their social media presence as a condition of entry. That's that's a motif that we see in lots of other places. It's a thing that landlords are now doing in a lot of places. Employers have periodically made the news for requiring that their employees give them not
Starting point is 00:06:01 just their logins, but also their passwords. So their employees can log into Facebook and see what's posted in private, that sort of thing. It's so funny that you wrote that so long ago. And later on, we're asking about things you've written as a science fiction author and how they've either inventions or have become real. But basically, you kind of teed up this idea of this world we could live in. And some of it, not the exact truth of what you wrote but some of it is actually playing out where you know like jared he just made me think about that recent show we did with ben bixby tensorflow
Starting point is 00:06:36 and like just passing somebody's social handle into like a deep learning algorithm and seeing what comes out of it, you know? Absolutely. Yeah. I mean, I think that what science fiction does is it's like what a doctor does when you go and say that you've got a sore throat and she'll like swab the back of your throat and rub it on a Petri dish and leave it for the weekend and then come back and look through a microscope to see what's going on. She's not making an accurate model of your body. She's making a usefully inaccurate model of your body.
Starting point is 00:07:05 And a science fiction writer like reaches in and plucks out a single technology or idea and builds a world around it. That's not meant to be a model of the world as it is or can be, but a model that's useful because it's not like the world is going to be. It's an exaggeration or it's, you know, it's like that one fact becomes reified as the most important fact in that world. It's also can be somewhat, you know, self-fulfilling prophecy as well as you're often influencing or inspiring the engineers and the scientists and the people who are creating the things of the future in either direct or indirect ways. Yeah. I mean,
Starting point is 00:07:41 that's actually I call that the opposite of a prophecy. I think that's inspiration. Yeah, in prophecy, you know, the whole problem with the idea of predicting the future and with with fortune telling is that it's this like intrinsically fatalistic idea that, you know, the future is only reason to care about it is that we can change it and so what science fiction can do at its very best is it can make different futures than the ones that we're headed towards manifest which is you know a very exciting idea well it's like uh it's it's funny because i was on netflix last night and i was like what star trek episode number one is on here my dad would love it like he's not alive but he'd be like, that's awesome. Let me go back and watch this. But I was thinking like, Oh yeah. iPhone is, is kind of reminiscence of beaming up Scotty,
Starting point is 00:08:30 that whole device. They had to do different things in this magical handheld device. And while it may not have been prescriptive, it was sort of like a hinting at what might come. And then you, you know, life sort of evolves from the art and art evolved from the life. Well, and even more than that, those Motorola flip evolves from the art and art evolved from life.
Starting point is 00:08:55 Well, and even more than that, those Motorola flip phones that looked exactly like Star Trek communicators didn't look exactly like Star Trek communicators because Gene Roddenberry had a crystal ball that showed him what would happen in Motorola. Right. It was like something way more awesome than that. Right. Motorola engineers grew up wanting to be live in Gene Roddenberry's future. Like, give me the choice between those two, like knowing what's coming and changing what's coming and i'll take changing any day yeah so let's uh let's pause there for a second i know we riffed quite a bit on some some fun stuff but before we go much deeper for those who don't really know cory how do you describe yourself how do you take us back to to kind of introduce us to who you are but maybe even take us back to like where you got your start in like the paths you've taken around activism, around EFF and freedoms
Starting point is 00:09:31 and things like that. Well, I grew up in Toronto and in the mid 90s, I started commuting to Silicon Valley to do systems integration for an IREX shop, Silicon Graphics Unix shop. And through that became more interested in free and open source software and network administration. And then as the web took off, I became a CIO of a web services company and then started a dotcom that did free and open source peer to peer search that we raised some money on and that we had an acquisition offer for
Starting point is 00:10:05 and that made our investors see dollar signs on. And so they took all of the founder's equity. And in the ensuing chaos, the acquisition deal fell through and the company just imploded. And while we were working on that, because it was doing peer-to-peer and file sharing, it was really involved with kind of the legal fights of the day, which revolved around Napster. And our best legal advice came from Electronic Frontier Foundation. A lot of our programmers were also members of the Cult of the Dead Cow, which was this amazing hacker group. And CDC hooked us up with the Electronic Frontier Foundation, and I got more and more involved with them. And as the company started to implode, I quit my job, quit the company I'd started,
Starting point is 00:10:49 and went to work for Electronic Frontier Foundation. And all this time, I'd been writing novels and stories, and they started to sell around this time as well. I took a job with EFF overseas. I went to London to be their European director and worked on digital standards and treaties at the United Nations and then in Brussels, mostly on killing DRM. And I was also all this time writing this web blog called Boing Boing that started off as a hobby that some friends and I put together. It was founded by my friend Mark Fraunfelder and his wife, Carla Sinclair. And then it built into a fairly big, significant commercial concern with, you know, 9 million unique readers a month and still going strong. And so I was doing that as well. And that was also a useful platform for talking about the political work I was doing and my writing. When my writing got to the point where it was occupying too much of my time to do a proper
Starting point is 00:11:46 job at eff i i did what i always told them i was going to do i quit to write full time but like literally within seconds of announcing that i was going to do that i got offered a fulbright at the university of southern california to go and teach about drm in la and my wife at the time was working for the b. She's British. And she was able to transfer to BBC America. So we came and lived in LA for a year and moved back to the UK in 2007. With my wife pregnant at the time, she delivered our daughter in our flat in East London. And we lived there ever since. I did some contract work for Disney Imagineering and wrote books and, you know, went and spoke to people and did other bits and pieces. But I became more and more alarmed about
Starting point is 00:12:32 the proliferation of DRM. It started off as like a, you know, harmless folly that was used to lock up game consoles and DVD players. But because it has this law, the DMCA, that says that breaking DRM is illegal, even for legal purposes, companies started adding it to like cars and tractors and insulin pumps and cat litter trays and arranging the DRM so that you had to break the DRM to do things like put your own detergent in your cat litter tray or to use to broadcast your own seed using the soil density data that your john deere tractor had gathered while you drove it around your field and the torque sensors on your wheels gathered centimeter accurate soil density data and because things that are drm are off limits to security researchers because knowing about a defect in a product
Starting point is 00:13:22 helps you break the drM. And because DRM can't be implemented as free and open source software, because like it's kind of obvious on its face that if you're designing a program that treats its user as its adversary, that making that program modifiable by that user is not a good idea, right? From a security model perspective, if like there is a flag in the source that says DRM on equals one, and the user doesn't want the DRM, which I think universally users don't want DRM, no one woke up this morning and said, I wish there was a way I could do less with my music, then, you know, someone is going to just turn that one into a zero and recompile your program. And so free and open source software is antithetical to DRM. So we had this existential threat to floss and this existential threat to security
Starting point is 00:14:14 at the same moment that software was metastasizing and invading the world and moving into our light bulbs and our baby monitors. And I thought that this was a terrible thing. And I came up with an idea for fixing it that was built around first launching a lawsuit to invalidate section 12.1 of the DMCA, which protects DRM, and then going around the world and getting other countries to drop their own versions of the DMCA, and also doing some activism with standards bodies. The W3C, the World Wide Web Consortium,
Starting point is 00:14:49 is adding DRM to the core suite of web standards and trying to get them to abandon that. And so I came up with this plan and I pitched it to EFF. And they said, that is an awesome idea, but we don't think anyone here has the bandwidth for it. But it sure sounds like something you'd be good at, hint, hint, hint. And so I said I was going to take a couple of years off from writing books full time and writing slowly now, a page a day on the third little brother book. And then I would go back and work half time for EFF on this, which I started doing. And the director of the MIT Media Lab, Joey Ito, gave us a grant. He made me his activist in residence. And so that pays my way there. And this is my gig. I'm going to kill all the DRM in the world within a decade.
Starting point is 00:15:27 Wow. That's my project. Awesome. And I continue to write novels, and I continue to do some contract work for Disney, which is always ironic. But, you know, I'm a giant theme park nut, and their engineering organization is amazing.
Starting point is 00:15:39 Yeah, to hear the Disney Imagineering part of your story was surprising, honestly. Well, it's, but, you know, I get get to do super cool work and Disney being who they are. I can't tell you about any of it, which is hilarious. It's like a human side of DRM, right? You can't say something. It's a secret. That's well, it's just confidentiality.
Starting point is 00:15:58 I don't think you can get a privacy advocate to say that confidentiality is bad. Like, I'm okay with that. I just think it's just over the top. They, they, I don't think there's any reason, any rational reason for me not to tell you this. I think that it's just like for them to evaluate, uh, when it, when it is and isn't in their interest to allow people to speak on the record about the work that they do is, uh, probably worth more would, would cost them more than any gains that they would get from allowing the people who aren't going to reveal anything sensitive to talk about it in public. So I think they've just made this like, you know, self-interested, totally rational decision that rather than figuring out when it's OK for people to talk, they're just going to tell everyone they can't talk.
Starting point is 00:16:38 It's just a pain in the ass. You know, it's like Indiana Jones, right? That last scene, you make this amazing thing and then they stick it in a vault and they lock the door, you know, it definitely levels up the level of intrigue though. People, you know, now we're so curious what it is that you imagineered while you were working with them. I did, I did some really cool things that I'm really happy to have done. I'll leave it there. So I did that and I've done lots of other cool stuff. I continue to do other cool stuff with Boing Boing and whatnot. I'm a visiting professor of computer science at the Open University in the UK, and I still do some stuff with them. And I co-founded the Open Rights Group in the UK, which is kind of analog to EFF there. And I sit on their advisory board. It's a lot of work on the freedom front. I mean, to have the ideas you've had and then also be trusted by these people that have, you know, you've friended over the years and like, hey, just go overseas and do this job. Like, maybe you were qualified, maybe you weren't. I'm not. It's hard to say for sure based on your story.
Starting point is 00:17:38 But obviously you work as you've done the job. But like to to get that kind of authority so early and so easily. I mean, not so easily, maybe it wasn't easy. I mean, I'd worked for you for a couple of years at that point. And I've I've been doing that work already in the US and I started going overseas to do that work from but based in America. Right. And that was not super efficient and it was expensive and it was exhausting. And so I started, you know, I moved overseas to and it was exhausting. And so I started, I, you know, I moved overseas to do it full time. And, and so I was in 31 countries in three years representing
Starting point is 00:18:10 EFF and doing its work. So we got, we got these notes here, obviously, and you are keynoting this conference called OzCon. It's a pretty popular conference that people have heard of, I'm sure over in London, back in your in your original stomping grounds, at least, or at least your wife's and you by happenstance. Yeah, I'm going to stay in our apartment there. We rent it out. I'm going to stay in our apartment there. Nice. Saving O'Reilly on the hotel. There you go. So the keynote is titled How You Got Here. And I think just the opening to this show, I really cannot wait to hear this keynote because someone like you sharing about the open source landscape, especially around DRM and that whole philosophy of like DRM being at odds with the ideas of open source, I can only imagine what you're going to cover. So why don't you share a brief bit about that keynote?
Starting point is 00:19:00 Sure. So, you know, as I mentioned, I've got an eight-year-old and she was born in London in our flat in a pool in the living room. And she likes to hear the story of her birth. Right. And so we would tell her the story of her birth. And, you know, my wife told her she came and yanked on my arm and shouted story arm that I would tell her a story. So she, you know, runs up and she whenever she's born and she yanks on my arm and shut story arm. And and being a writer, I like to iterate when I tell stories. So I would tell it a little differently every time I'd start a little further back or go a little further forward. And what I realized was that like the interesting part of her birth story was the stuff
Starting point is 00:19:34 that led up to her birth and not like the, you know, stork stuff. Uh, but like the, the, the stuff about like how my wife and I met and became, you know, best friends and lovers and a couple and and got married and, you know, decided to have a baby like all of that stuff is unique. Right. Because like the stork stuff is the same for everybody. Right. Like what your parents did to make you is almost certainly something I can guess at with a pretty high degree of accuracy. Right. But like how your parents came to make the decision to do that and make you, everyone has a different version of that. And, you know, the open source version of how we got here, we talk about the licenses and we talk about the, you know, the packages and the milestones. But there's like this really strong social component to how we got here, because around
Starting point is 00:20:24 the same time that the open source movement was starting, it was also around the same time that the open web movement was starting, that we were sunsetting these proprietary network architectures. circuit-switched, services-centric network, or the big commercial services like CompuServe and AOL, they both kick off around the same time. And yet the open web has collapsed. The open web is almost dead. We are in a desperate and dire moment for the open web. And the free and open-source software movement has soared. Everything, including the things that are closing down the open web, is built on free and open source software. And so that is an amazing thing. And the story kind of interrogates how or the speech interrogates what the difference is and how one soared and the other sank. And what we can learn from the free and open source movement to keep the web open as we try to open it up again and i think the thing that the free and open source
Starting point is 00:21:30 software movement had going for it is this thing called a ulysses pact um the story of ulysses goes that you know ulysses was going to sail into um siren infested waters and anyone who heard the song of the sirens would be tempted irresistibly to jump into the sea and the sirens would drown them. And so normally when sailors sailed into the sea, into the siren sea, they would fill their ears with wax. But Ulysses was a hacker and he wanted to hear what the siren's song sounded like. So he had his men lash him to the mast so that he could hear it, but he couldn't get loose. So what he used was his strong self, the moment at which he was strong to predict that in a future moment, he would be weak and to take countermeasures to prevent himself from giving into that weakness.
Starting point is 00:22:16 And we use Ulysses Pax all the time. Like if you go on a diet, you should throw away your Oreos on night one, not because you're like incapable of resisting temptation, but because everyone sometimes has moments of weakness. And the strongest thing you can do is to recognize that you will have a moment of weakness in the future and take a countermeasure against it. Right. And so, you know, in the free and open source world, our Ulysses pact is the irrevocable license because the failure mode of free
Starting point is 00:22:43 and open source software, having founded a free and open source software company, I can tell you, is that there are moments in which it feels like your survival turns on being able to close the code that you had opened when you were idealistic, right? There are moments of desperation when that happens. And of course, it's ridiculous
Starting point is 00:23:00 because if you're making anything substantial out of free and open source software, you're building it on other things that other people have opened and can't close. And if they were to close off their code, your project would collapse. So every one of us wants to be the only one who can revoke a free and open source software license while all the plumbing that we built on top of stays open. And because the licenses are irrevocable, because you can't close it once you opened it, you generally don't even get the pressure from your investors or from potential acquisition suitors or from other parties who can otherwise lean on you and put a gun to your head. They don't even bother because there's no point in shouting at you to close the code if they know that it's not a course of action that's even open to you. And so even though
Starting point is 00:23:46 the same desperation that led us to close the web is present for everyone who's ever made an open source project that succeeded, that desperation can't express itself in the same failure mode that the web has had. And so my talk is about how we can build Ulysses' pact for a newly open web around two principles that will keep the web open, even in the desperation of its founders, even when the pirates who found it become admirals. And those two principles, the first one is that anytime a technology or a computer gets an order from its owner that conflicts an order that's been given to it by a remote party, the owner should 100% of the time, without exception, win. The owner always gets to override remote policy. And the second one is that any true fact
Starting point is 00:24:39 about the security of a system that you rely on should always be legal to disclose under every circumstance. And my pitch is that these two principles should be the principles that we become zealots for. That, you know, if they're not calling you an unrealistic idealist about your adherence to these two principles, you're probably not trying hard enough. And so my pitch is that the people who care about building an open web to be the nervous system of the 21st century, to have an internet of things that's not an internet of things on fire that spy on you and ruin your life, is that we need to take these principles and cherish them as much as we cherish the core principles of free and open source software
Starting point is 00:25:22 and weave them into our licenses, into our professional codes of conduct, into our membership agreements, into every single piece of what we do so that there's never any question that this will come about. And so that like, you know, there have been lots of times when governments have tried to pass laws that say in order to make software, it has to be closed. And the fact that there's all of this critical open software has meant that those laws died every time. Because, you know, you're going back to them and saying, well, okay, but what you're talking about is throwing away all the infrastructure on which the digital world is built. Like, what are you
Starting point is 00:25:59 planning to replace it with when you pass your dumb law? And, you know, like reality asserts itself. And so if we can create a reality on the ground to assert itself when governments contemplate stupid laws that say that remote parties can override local parties, whether those are crypto backdoors or DRM or lawful interception overrides or any of the other things that are have, you know, been the parade of horribles the 21st century, then we can make a difference. Fascinating. I think we need to drill down on these principles a little bit more. I also want to ask you about licenses, specifically copyright versus copyleft,
Starting point is 00:26:37 going to have the more liberal licenses, the more the GPL-style licensing. We are heading up against our first break. This is going to be a great keynote. I'm fascinated already. Adam, you're going to have to wait until they put the video online. I know. I'm not going. I'm so sad. Because I will be there with Corey at OSCON London. You can be there too. We have a discount code in the show notes. Check that out for 20% off. See that. See Eli Bixby talk about TensorFlow. Come hang out with at least one half of the changelog.
Starting point is 00:27:06 We're going to talk to Corey Moore on the other side of this short break, and we'll be right back. This message is for all those team leaders out there that are looking to easily add new developers and new designers to their team, easily scale up when you need to. You got a big push coming. You got a new area of the product you've got to go into. You've got more need
Starting point is 00:27:27 than you thought you could. You've got to go through all this hassle of putting a job out there and hiring people to find the right people. Well, that's a bunch of hard stuff that you don't need to even deal with.
Starting point is 00:27:38 Call upon my friends at TopTal. That's T-O-P-T-A-L dot com. The cool thing about TopTal is you can hire the top 3% of freelance software developers and designers. And what that means is they've got a rigorous screening process to identify the best. So when you call upon them to help you place the right kind of people into your team, then you know you're calling upon the best people out there. Once again, go to TopTal.com. That's T-O-P-T-A-L.com.
Starting point is 00:28:07 Or if you'd like a personal introduction, email me, adam at changelove.com. And now back to the show. All right, we are back with Corey Doctro talking about saving the open web, the future. And this proposal he has, he'll be giving a keynote at OSCON London upcoming in October, all about how this Ulysses pact that we have with open source licensing and software has really saved us in times of weaknesses and how the web is in this time of weakness or in dire straits, we need to save it. Corey has two principles he's trying to impart as ways that we can protect ourselves against failure. Corey, give us a little bit of a rehash. We have these two principles.
Starting point is 00:28:55 The first one is that the local party or the first party should always be able to override remote parties. If you own a device, you should always be able to tell it what to do, even if someone else who's not you gives it an order. And second one, any true facts should be legal to disclose. Any true fact about the security of a computer that someone relies on should always be legal to disclose. Okay, very good. So my first question, and maybe some of this is the work that you're doing with the EFF and is just can only be enforced. You know, we have to write laws for these things. But how do you convince everybody else that these two principles are the way to go?
Starting point is 00:29:34 Well, there's a bunch of different things. So like starting with this question of whether or not people who own computers should be able to decide how they act. There's there's different appeals. So like there's a there's a pure property appeal here, which is like when you own stuff, it should do what you tell it to do. Right. That's what that's kind of what ownership means. Right. If it's your insulin pump or your car or whatever, I mean, you know, cars can be steered into people and we make it illegal to do that. But we don't try and design cars that are incapable of being steered into people or incapable of being driven over the speed limit or anything else.
Starting point is 00:30:11 And a car is just a computer you put your body into and then, you know, pray that its software is accurate while it hurdles down the road at 100 kilometers an hour. Yeah. I mean, you know, think about that Jeep hack last summer where it turned out one point four million Jeep Cherokees could be driven over the Internet, you know, steering brakes, everything. And so, you know, that the most it's not that that car is is just a computer, but the single most salient fact about that car is that it's a computer. Right. Take the software out of that car and it ceases to work just as thoroughly as if you take the, you know, gasoline or the engine out of the car. You know, a voting machine is a computer we put a democracy inside of. And so the idea that that you should be able to tell the stuff you own to do what is in your interest, that that is a no brainer. Then there's the like, well, if you don't accept that, let's think what the consequences might be. Let's assume for the moment there are times when it's legitimate
Starting point is 00:31:08 to let other people give your device orders that you disagree with and have the device choose them instead of you. So first of all, we have an authentication problem because anything that the manufacturer can order your device to do or that law enforcement can order your device to do is also a thing that anyone who can steal the credentials of or successfully impersonate the manufacturer or law enforcement. It's also a thing that manufacturers and law enforcement who are operating in territories that we don't think of as being in accord with the rule of law get to do. So like we talk a lot about self-driving cars and whether or not the police will have a way to just send an instruction to a self-driving car
Starting point is 00:31:50 causing it to like pull over, you know, whether you could ever have an OJ Simpson car chase in the era of self-driving cars or whether the cops would just, you know, email your car and tell it to pull over and just disobey you. And the thing is that like, there's a probably a one in five chance that in the next 15 years, ISIS will form the government of a country in the territory currently occupied by Syria and Iraq. Right. And so there'll be a government. So they will have a credential that allows them to lawfully intercept cars. So any power that we create in a technology that is correlated around the world, because it's not like we have Syrian cars
Starting point is 00:32:30 and American cars, we just have cars, right? And they have firmware loads. And so if you're gonna create something in a firmware load in an American vehicle, you should expect it to show up in Russian, Syrian, and potentially caliphate vehicles. And so that's another problem. Same with, you know, there's a similar problem with manufacturers where, Syrian, and potentially caliphate vehicles. And so that's another problem. Same with, you know, there's a similar problem with manufacturers where, well, you may trust
Starting point is 00:32:49 a manufacturer today, but what happens tomorrow? You know, I was once very fond of a little company called Flickr. In fact, Flickr started because Stuart Butterfield and the woman who's now my wife and I all met at a conference. The woman who's now my wife and I fell in love and carried on a long distance relationship. And Stuart at the time was making a game called Game Never Ending that had that we were both alpha testing. And he asked me how it was going with Alice. And I said, it's great, but we have a hard time sharing photos. And he said, oh, well, we've got a photo sharing thing for Game Never Ending. I'll just bring it up in the product roadmap. And they launched it the next week and it was so successful. They shut the game down and re-titled the company Flickr and sold it to Yahoo for $30 million. Right. And, and even then it wasn't so terrible
Starting point is 00:33:35 because Stuart was working at Flickr and so was Katarina, his, his wife at the time. And it was all great. And then, you know, now like Yahoo's a dumpster fire and Flickr is terrible. And so, you know, even if you trust the manufacturer to the point where you are personal friends with the people who founded it and it was founded for your benefit, it's still not a good idea to let the manufacturer decide what you're going to do with your product. Steve Wozniak got locked out of his iPhone, right? I mean, these are, these are like real, no fooling problems that people have Bob, Bob Frankston, uh, who created, uh, the first commercial spreadsheet, um, not Lotus, uh, uh, VisiCalc, uh, they put DRM in VisiCalc. Uh, it was a defect in the floppy disk, uh, that if,
Starting point is 00:34:20 if it wasn't present, the thing wouldn't run. So they introduced a defect, a physical defect and manufacture time. And so then he wanted to extract his old spreadsheets from an emulated 8086 running VisiCalc, except it wouldn't run VisiCalc because he couldn't emulate the physical defect in the floppy disk in the emulated floppy drive. So even if you founded the company,
Starting point is 00:34:43 you may not be able to trust the company in the future. You and the company may not have the same interests in the future. So you should always be able to override. So this is the second argument is that even if you think that today there is a government or a company that you trust, if you think that Tim Cook would never block an app unreasonably, you are creating and arming a weapon that you're handing to all of Tim Cook's successors. And as a guy who was a CIO of a tech shop buying Apple hardware from John Scully, I'm here to tell you that Apple is perfectly capable of hiring some absolute clowns to run that company. And, you know, someday you may be trying to reconcile your trust in Tim Cook with a future John Scully, right? When Martin Shkreli becomes CEO of Apple in 2072 and is in charge of the app store,
Starting point is 00:35:34 you're going to have to live with the decision you made when you trusted Tim Cook. So that's another argument. Isn't that like just nature of the beast, though? We can never prevent ourselves from having future versions of somebody that we may or may not trust. I mean, that's a human issue we've had all along. That's why we shouldn't design things that require us to trust people in the future. Absolutely. Right. Right.
Starting point is 00:35:56 So it's, you know, it's never been the case that if you bought a GM car and then you didn't like GM in the future, that GM could reach into your car and brick it. But that is the case with your Nintendo 3DS. Every time you fire one of those up, it checks to see if there's any new firmware updates. And if it finds one, it installs it without user intervention. The first thing it does when it launches is checks the checksum on the previous firmware load. And if it detects any tampering, it permanently bricks the 3DS. Good times. Right?
Starting point is 00:36:24 So it's never been the case that if KitchenAid detected in the future that you were using your blender to mix paint, that they could brick your blender. But now, if you do anything that the manufacturer doesn't like, the manufacturer can reach in and brick your device. So that's the other thing, right? We don't have ownership of anything these days. We have licenses
Starting point is 00:36:40 to use. Yeah, it's a kind of feudalism. Except, you know, when you actually look at the terms of those licenses, they're rubbish. They're not really copyright licenses. Yeah, it's a kind of feudalism, except, you know, when you actually look at the terms of those licenses, they're rubbish. They're not really copyright licenses. Like if you look at, you know, for example, in music, right, if you buy a downloadable song, that song is characterized as being a license to you, which is why they can place restrictions that they wouldn't be able to place in a sale of a copyrighted work. But the way that the standard record deal goes is that people, if you have a record deal and your song is licensed to someone, you get 50% of the revenue. If your song is sold to someone, you get 7%.
Starting point is 00:37:19 So you get seven times more money if your song is licensed than if it's sold. But iTunes revenue and MP3 store revenue is characterized by the labels as a sale in its bookkeeping with artists. And it's only with you that it's characterized as a license. So it's just bullshit, right? I mean, it's like, you know, I know you said to keep it clean. It's rubbish. You know, this is it's it's these copyright licenses are not particularly well crafted and they're they're silly. And they if they're enforceable, they're not enforceable in in total. And they're they're,
Starting point is 00:38:00 you know, not in keeping with the constitutionality of copyright. So one of the things that our lawsuit to invalidate Section 12 of the DMCA turns on is that the U.S. Supreme Court handed down two rulings in the last 10 years about copyright and fair use. These were in the cases Golan and Eldred. And they said that a copyright law is only constitutional if it respects the traditional contours of copyright. So copyright historically only applies to creative works and not to functional things, and only if it allows for fair use, which is use without permission for critical purposes and other reasons. And otherwise, the copyright law is not valid. It doesn't pass constitutional muster. It conflicts with the First Amendment. Well, you know, copyright does not, the traditional
Starting point is 00:38:45 contours of copyright do not afford for copyrightable dishwashers and copyrightable doorknobs and copyrightable light sockets. The fact that the software in the light socket is copyrighted and that you can then use copyright law to tell the person whose plates they can put in the dishwasher or whose light sockets they can use with a light bulb violates this traditional contours test that the Supreme Court set out. So, you know, while there's this fiction that we can't own anything because copyright law, the legal reality, which is yet to be litigated, but which EFF has begun the long, arduous process of litigating out, is that that's not how copyright law works.
Starting point is 00:39:28 I mean, I'm definitely with you on all this. But what my question is with regards to this, it seems like the focus is on devices and almost like all things. How, I guess, let me state it this way. How does this specifically apply to the web as we know it today? Not the devices using the web, but the DNS, the HTTP, the markup. And it seems like in that case, the website owner is the local party and we're all the remote parties just visiting this other person's website. How does this help us in those circumstances? There's two parts to that. The first one is a very practical thing that's going on right now. So as apps gained ascendancy, browsers lost some of their power. They became
Starting point is 00:40:13 less significant to technology ecosystems. And this made browser vendors and the World Wide Web Consortium, which standardizes browsers, pretty desperate. And so in 2013, the W3C decided to add DRM to the core set of HTML standards for HTML5 and something called encrypted media extensions. And what these mean is that for the first time ever, the person who runs a website will be allowed to tell your user agent, your browser, how it must perform, how it must render content and whether it can content. That's never been the case, right? Like this is why ad blockers and pop-up blockers work, but also why like I have bad low contrast vision. And when I get to gray on white type, I turn on a thing that turns the gray type black.
Starting point is 00:41:00 All of that stuff is only exists because users are able to configure their user agents to display the web in the way that's convenient to them. And so what this is doing is it's setting up this regime where it's a felony to change the way your browser is configured if that conflicts with the interests of the people who serve the content to your browser. And so we have proposed to the W3C that it should take its existing policies and extend them to cover DRM. Right now at the W3C, if you join, you have to promise not to use your software patents to attack people who implement W3C standards. The W3C's position on patents is that standards are more open if you don't need to license a patent to implement them. And so they have this policy. And we're saying, OK, well, there's this new right, the right to tell people how their
Starting point is 00:41:50 browsers must work that you're creating by doing this DRM standardization. And surely a standard is more open if you don't need someone's permission to implement it because of DRM, just as surely as it is with patents. You should have the same policy for patents. And so that's coming up for a vote very shortly. And we've been joined by some pretty significant parties there. The browser vendor Brave is in with us, but so is the Royal National Institute for the Blind and Oxford University. They've all signed on, along with most of the cryptocurrency and blockchain companies that are W3C members, because as cryptographers,
Starting point is 00:42:23 they're like, yeah, of course people should be allowed to report vulnerabilities in browsers. It's terrible policy to say that companies get to decide who can report vulns. And so that is a thing that's live and underway. And if you work for a W3C member, you should talk to your rep about supporting us there, because that vote is coming up any day now. What's been the stance of the other browser vendors, notably Google, Apple, Microsoft? And Mozilla is the other big one there. Mozilla. They're all backing it.
Starting point is 00:42:55 I think Mozilla believes that DRM is a foregone conclusion that they're going to have to put DRM into the web and that if they negotiate at the W3C, they'll be able to negotiate the deal that Google gets. And if they negotiate away from the W3C, they'll be the smallest of the major browser vendors negotiating on their own. And so even though they're champions of the open web, they're not doing the right thing on this. You know, it's true that companies will try and make DRM without the W3C, whether or not the W3C standardizes it. But there's no way that they could collaborate to the extent that they're collaborating at the W3C without doing so at the W3C, because the antitrust implications of all the major
Starting point is 00:43:36 players in an industry gathering in a closed room to decide what features their products will and won't have, that's totally illegal, right? So, you know, the only way they can do this is with the W3C abetting them. So who's actually pushing for this? Like, who's pushing for DRM in the open web? Oh, well, do you remember we started off talking about Audible, and you're like, damn, I feel so bad for supporting Audible? Well, the listening audience didn't get to hear that part.
Starting point is 00:44:02 Okay. That happens. We had a pre-call, talked about Audible, DRM. Corey cory went on a rant we loved it but it didn't make it into the show we might actually release it as a teaser but feel free to share what you want cory well and then afterwards you started talking about like watching things on netflix and i was like do i tell them because netflix are the major advocates of this netflix comcastcast, Cable Labs, the MPAA, the RIAA. Those are the major people pushing for it, as well as Microsoft, Apple, Google, Firefox, Mozilla. Those are the proponents. What's their motivation? I mean, we know who they are, but what's their, I mean,
Starting point is 00:44:38 is it greed? Is it control? Netflix wants to be able to assure the parties that it licenses from that they can exert controls beyond that which copyright allows them. So, for example, in 1984, the Supreme Court ruled that you're allowed to record TV shows for personal use, right? That was the Betamax rule. You can take a VCR and record, you know, whatever you want. These are our lives. Yeah.
Starting point is 00:45:03 So in theory, there's no reason you can't record a stream video from Netflix, except that there's DRM and it's against the law to break the DRM. But that's hard to make work in browsers because, you know, browsers are under user control to an extent that apps are not. It's a much more open platform. So what Netflix gets by adding DRM that's supported across all the browsers is they are able to go back to the people they license from and say, this commercial preference that you have, that people not
Starting point is 00:45:29 be able to record their shows, we've just converted that at the stroke of a pen into a legal obligation. We made up a private law, and without ever having to ask Congress, that law was enacted for the web. And so that's what they get out of it. And Google and Apple and Microsoft, they get to tick a box that makes Netflix happy. And their browser divisions, who are worried that they're going to lose ground to their app divisions, get to assure themselves that
Starting point is 00:45:55 Netflix isn't going to boycott the web, which is a thing that they're all super worried about, because Netflix has more or less said that they would boycott the web if they didn't get DRM. What? Yeah. If this is where it begins, then where does it end up? Like if this is the, you know, the breaking ground of this issue, where do we dystopia? Yeah. I mean, I'll tell you where it ends up. So the W3C is currently enacting a merger with an ebook standardization group that says that they want to make DRM for all formatted text. And so if that was built into browsers, wouldn't the New York Times and Washington Post and everyone else who has a paywall start to put the so-called premium web behind a DRM wall as well so that you couldn't save and print?
Starting point is 00:46:38 And then what will we do the next time there was a Gulf War and we wanted to prove that the New York Times had lied in making the case for the Gulf War, we're not allowed to save the text that's in the news. And so it's and what do we do as more and more of the Web disappears into silos that are off limits to free and open source software? What's the future of desktop Linux? What's the future of free and open browsers? Where do we end up if if user modifiability is antithetical to using the network yeah this is a big deal it's a it's i think it's an existential threat to the future of the human race i think that the idea that we are going to move the control surface because remember html5
Starting point is 00:47:20 is the control surface for the future of the internet of things right the idea is we'll get rid of apps and we'll use in browser appsowser apps, native apps to control your pacemaker, your car, your thermostat. When we take those and we make them off limits to security research, and we invite the worst, most monopolistic practices with no check against them in competition, when we make competing with them a felony, then we would be insanely naive to expect anything but the worst kind of abuse, right? Entertainment technology has the potential to usher in a future of absolute censorship and control. I call it being Huxley'd into the full Orwell, you know, and it is an absolute disaster. It terrifies the hell out of me.
Starting point is 00:48:02 So we've talked about DRM. We've talked about, you know, the owner of the thing should be able to override its manufacturer's point of view, so to speak. So no updating firmware without me approving it, those kinds of things. But with regards to security, point number two you mentioned was any true fact with regards to security should be legal to disclose. Well, how does this play out? Can you give us some examples of that? So there's a couple of these things. So one is back to DRM, that disclosing defects and products that have DRM has led to security researchers in one case going to jail. The Copyright Office has heard testimony from security researchers, some of the most famous, best respected in the world, including Ed Felton, who's now deputy CTO of
Starting point is 00:48:45 the White House, who said that they found defects in things like voting machines and medical implants and that they weren't able to come forward with them because they felt that they would face too much liability under the DMCA. So that's part of it. The other part, though, is the Computer Fraud and Abuse Act. So in the 1980s, we didn't have any specific anti-hacking statutes. And it was kind of a problem because people would break into computers and raid their databases. And they'd have to be charged with like the theft of one microwatt of electricity. It was kind of embarrassing.
Starting point is 00:49:16 And it was not a sustainable thing. And so Congress decided to make an anti-hacking law. But it's hard to make a really effective anti-hacking law because hacking changes over time. You know, technology is a fast moving target. So rather than spelling out a set of things that you were and were not allowed to do, they said that anytime you exceeded your authorization on a computer that didn't belong to you, that you were committing a felony and so this has been a real problem because it's allowed companies to spell out your authorization by creating these
Starting point is 00:49:52 ridiculous terms of service these these long you know a thousand words a boilerplate and then anytime someone does something they don't like they can they can threaten them or actually sue them or have them arrested for violating the Computer Fraud and Abuse Act. And this has also been really problematic for security researchers. I mean, and other kinds of researchers, too. Your listeners will probably know about Aaron Swartz, who was this amazing open source and freedom activist who was allowed to download scientific articles using MIT's network. But the terms of service said that using a script to do it was not allowed.
Starting point is 00:50:29 And because he wrote a Python script to access files that he was allowed to access, he was charged with 35 felonies and was or 13 felonies and facing 35 years in prison. And he hanged himself. But, you know, other researchers have fallen afoul of the Computer Fraud and Abuse Act. One researcher was looking at his AT&T customer record, which had all of his financial details, and he altered the URL. He changed the number at the end of the URL. He incremented it by one and found himself looking at someone else's financial details. And all told, he was able to look at hundreds of thousands of people's financial details,
Starting point is 00:51:07 which he then went public with. He didn't publish their financial details, but he went public with AT&T's sloppiness. And AT&T had him thrown in jail for changing the URL in his browser because their user terms said that you couldn't do that. And so right now, the American Civil Liberties Union is actually suing on behalf of a bunch of different kinds of researchers and news gatherers to invalidate the Computer Fraud and Abuse Act to address this question, to make sure that these true facts about the security of computers that we rely on are legal to discover and disclose, because companies are very poor trustees of their own embarrassing truths. They are not, they can't be relied on to tell you when something that potentially could
Starting point is 00:51:53 cost them a lot of money and face is true. Here's kind of a silly question, but you tell me if it's silly or not. What if none of us ever agree to those things? Like, aren't the end user license agreements, like, aren't they as much rubbish? And aren't they so much rubbish that the reason why they can actually do that is because the violation by doing the act. But what if, but you've agreed to use the thing based on the terms, what if none of us ever agreed, we just use the products without, would that be a legal loophole or it would be great it would be great to have some limits on those terms of service you know there was an effort at one point to pass a law called usita that would have um uh limited what could go in terms of service we haven't had a lot of luck with that we haven't a lot of luck with courts limiting what
Starting point is 00:52:42 they can do the but courts have held so far that like just being in the vicinity of the terms of service, you know, clicking a Web page that at the bottom of it says you agree. Your usage is like is your agreement, basically. It's like an implicit agreement, even if you didn't explicitly agree to it. You know, by running away, shouting, no, no, no, I don't agree. You agree, you know. And so that's you know, that's another problem. And, you know, frankly, like it's getting hard to, to function in society without Facebook. I'm a Facebook vegan. And I'm here to tell you that there's a lot of stuff I don't get to do because I don't use Facebook. And the only reason I can be a Facebook vegan is because I'm not like
Starting point is 00:53:21 relatively well-off, well-known white privileged English speaking dude. And there are a lot of people who don't have that option. Right. And so, uh, I think that like, that's that, that not agreeing is not enough. I mean, there's a hilarious photo in my Flickr stream of just my, my newborn daughter's hand pressing the a button with the Nintendo Wii agreement in the background. So we used her one day old hand to agree to the terms of service because she couldn't form a contract. But, you know, the lawyers I know are like, yeah, that doesn't work. Right. You pushed her. That's what I was thinking. Like, what if we just like hire some guy in Zimbabwe and like he clicks the button for all
Starting point is 00:53:59 of us, you know, like none of us ever agree. He agrees for all of them. He then acts as your agent. Right. You know, I'm not a lawyer, but but yeah, trust me, this is a thing that people have thought of and it doesn't, it doesn't work. We need, we need other things, right? We need structural, like, so we need, we need Larry Lessig talks about the four ways we can fix these problems, right? One is with code. So we can like make things that don't have these agreements.
Starting point is 00:54:23 One is with norms. We can make companies that force these agreements down your throat into social pariahs and characterize them as having done something profoundly immoral for having done this. One is with law. So we can use law to limit what those agreements can do. And one is with markets. So we can buy things that respect our freedom. But no one of those is enough. And all four of those work together, right? The things that are technologically possible are things that you can create markets for. You can't create markets for things that can't be done. And so all of these things together work well. Sounds like, I mean, just thinking about those four things, and I agree that it's,
Starting point is 00:55:00 you gotta, you gotta kind of have them all. But if we look at the way humans are going, like even this conversation about Netflix and Audible, I'd be hard pressed to believe that Adam, even though you're probably outraged at this point, Adam, like you get back to your regular life and it's hard. Like people aren't as resolute as you are, Corey, or, you know, as we don't, we don't stick to our convictions. Like it becomes like, even you said, Facebook itself is not all that attractive to you. I'm also a Facebook vegan, I guess you call it. But that's because I don't-
Starting point is 00:55:29 Me as well. I don't care that much. But like if you took my Netflix away, like that would actually hurt me in my everyday life. And so I guess my question is, it seems like the legal front is probably the best one if we had to put our efforts behind one thing, because the social norms thing doesn't seem like it's working out that well.
Starting point is 00:55:49 I don't know the social norms to me, Jared seems like when Corey was saying that reminded me of this idea of free speech, right? Everybody has free speech, but you say something that's free for you to say, but the society at large doesn't agree with it. They're going to come down.
Starting point is 00:56:03 Sure. So you may have the freedom to say it, but not agree to do it. So to me, I feel like today's society and the way internet rage sort of comes up very easily, it'd be pretty easy if we could band together and it does happen. It is happening more and more networks that in one more communities to where
Starting point is 00:56:21 if someone doesn't play by the rules of society sets or society norms, then they get ousted in some way, shape or form. I think there's something to that, you know, and I want to caution against like the paralysis of purity. Right. Like I try really hard to, you know, spend money with companies that are trying to make the future that I want to live and not destroy the future I want to live in. But I'm not purely successful. I mean, at the end of the day, it's pretty hard not to like buy your phone service from a company that's, you know, monopolist and waiting that wants to destroy network neutrality. Right. And like, even though I buy laptops, throw the hard drive away and put a new one in and install Ubuntu on them, I'm still buying those laptops from Lenovo that have like shipped
Starting point is 00:57:04 four models in a row with spyware on them out of the box, right? So like, you know, every vegetarian eventually meets a vegan, right? And if your test for whether or not you can do anything is whether you can be as pure as the purest person you can think of, then you will do nothing. So there's another way to do this, right? And I got this from Denise Cooper, who's one of the great doyens of the free and open source movement. And she says that every month she adds up how much money she has given to companies that are working to destroy the future she wants to live in. And she gives that much money to organizations
Starting point is 00:57:39 that are working to save it so that, you know, she's at least, you know, carbon offsetting the harm that she does, you know, in, in, in things. And, you know, I like, obviously I brief for EFF because they're, they're an organization that I work for and love, and I've seen how effective they can be. And I've never seen an organization be more effective with less, but they're not the only ones, right? I mean, you know, obviously there's a free software foundation, but there's also the Software Freedom Law Center and there's Creative Commons and there's, you know, so many other organizations fight for the future and defend progress and so many different organizations that will take the money that you give them and try and fix the structural problem that has trapped you into subsidizing a future that you're horrified to be approaching and try and fix it from the other edge of things. And so that's, that's another thing you can do. And, you know, you can not fall prey to this, this argument that goes well, like how can you be in favor of doing something about climate change? When one time you got on
Starting point is 00:58:43 an airplane, those, those things, you know know people get on airplanes and they can care about climate change and uh if you say to people you're not allowed to care about climate change if you fly the world will go up in flames that's a good place to pause here we got one more break before we tell the show so let's pause here and court on the side we're going to talk a bit about the future we figured that with the mind you have as a science fiction writer who to me you dream up some really cool stuff right you must have really interesting ideas or at least
Starting point is 00:59:14 science fiction ideas about the future. So when we come back, we'll talk about the great or bleak future we might have. We'll be right back. Linode, our cloud server of choice: use the code CHANGELOG20 for a $20 credit, two months free. One of the fastest, most efficient SSD cloud servers is what we're building our new CMS on. We love Linode. We think you'll love them too. Again, use the code CHANGELOG20 for a $20 credit. Head to linode.com/changelog to get started. All right, we're back with Cory Doctorow, and it's definitely been a good conversation.
Starting point is 01:00:05 Cory, you think about, and you're so passionate about, things I never even knew I should care so much about. And I feel like the general public has some blinders on, basically, you know, and you've got a lot of ideas and a lot of passion around internet freedom and other such things. But coming back to our grassroots: developers who listen to this show, people who really are passionate about open source. They're getting involved with communities, they're going to conferences, they're giving talks, they're leading the way in all shapes and ways. What advice do you give to people like that?
Starting point is 01:00:40 People who build software every single day, people who care about the future of software, more importantly, open source software, what kind of advice and whatnot can you give to those kinds of developers where they should be focusing their efforts to not so much just subscribe to this potentially bleak future that we're driving towards, but ways that they can shape the future of the open world? Right. Well, you know, I'm a science fiction writer, so I know exactly how badly qualified I am to predict the future because science fiction writers suck at predicting the future. You know that we've made a lot of predictions and our success rate is very low. We have this
Starting point is 01:01:16 hindsight bias where, like, we trumpet our successes. But if you take our overall hit rate, it's pretty poor. And besides that, like I said earlier, knowing what the future is going to be is pretty depressing, because it suggests that the future can't be changed. And so rather than briefing for optimism or pessimism, I'm a great fan of hope. And hope is, like, why you tread water when your ship sinks. Even though you know that in most cases you have no chance of being picked up, everyone who has ever been picked up tread water until rescue arrived. And so it's this necessary but insufficient precondition for a better future.
Starting point is 01:01:58 And hope doesn't require that you know how you get from A to Z. Hope only requires that you know what your next step is. The first casualty of any battle is the plan of attack. So if you think you've got a plot that you can take from here right to a kind of free and open source utopia, the hours that you spend on that critical path are going to be completely wasted when the first exogenous shock comes along and blows you off of the path. And so instead, I'm a great believer in kind of iterative hill climbing. You know, you check to see whether there's a course of action that takes you closer to the future that you want to live in, and you take that one incremental step, because as you ascend the problem landscape, you get a view into new parts of the territory that were off limits to you before, because you were too low down. And yeah, you can reach a local maximum, which is
Starting point is 01:02:43 why sometimes you've got to try, you know, veering off into left field and trying something that you've never tried before. But I don't believe in grand plans. I believe in incremental, iterated, slow, steady, continuous progress. And so if you think that you can do a thing, one thing, it doesn't matter what,
Starting point is 01:03:02 a single thing to make things better, go do that thing. Well, what is that thing for you and the EFF? Getting back off of the developers for a second, specifically, what are you up to? I'm going to kill all the DRM in the world in a decade. That sounds like a big plan. It is a big plan, but that's where I want to be.
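For developers, Cory's "iterative hill climbing" and "local maximum" language maps directly onto the optimization algorithm of the same name. A minimal sketch, with an invented toy landscape purely for illustration, shows both the greedy climb and the "veering off into left field" escape via random restarts:

```python
import random

# A toy "problem landscape": each index is a position, each value is how good it is.
LANDSCAPE = [1, 3, 5, 4, 2, 6, 9, 7]

def hill_climb(start):
    """Greedy ascent: move to a better neighbor until none exists (a local maximum)."""
    pos = start
    while True:
        nbrs = [p for p in (pos - 1, pos + 1) if 0 <= p < len(LANDSCAPE)]
        best = max(nbrs, key=lambda p: LANDSCAPE[p])
        if LANDSCAPE[best] <= LANDSCAPE[pos]:
            return pos  # no neighbor improves: stuck on a (possibly local) peak
        pos = best

def restart_hill_climb(tries=20, seed=0):
    """Veering into left field: restart from random positions, keep the best peak found."""
    rng = random.Random(seed)
    peaks = [hill_climb(rng.randrange(len(LANDSCAPE))) for _ in range(tries)]
    return max(peaks, key=lambda p: LANDSCAPE[p])

print(hill_climb(0))         # climbs to index 2 (value 5), a local maximum
print(restart_hill_climb())  # restarts discover index 6 (value 9), the higher peak
```

Greedy ascent from the left gets stuck on the small peak; restarting from somewhere new is what lets the climber discover the higher one, which is exactly the tactic Cory describes.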
Starting point is 01:03:18 And my next step to do it is, I've got all these little projects. So we're getting the W3C to protect security researchers and innovators and accessibility in web standards. We're suing the U.S. government to get rid of Section 1201 of the DMCA. We're coordinating with activist groups around the world to launch their own campaigns on the basis that this is going forward. I'm talking to investors and entrepreneur groups about the economic opportunities for breaking DRM. We've just petitioned the Federal Trade Commission to require electronics retailers to notify people when they have products that have DRM on them.
Starting point is 01:04:00 And we asked the FCC to do this for set-top boxes, to say that for the new set-top boxes in their unlock-the-box order, all the manufacturers that are approved should promise never to invoke the DMCA against people who unlock those boxes for a legal reason. And so these are all projects that take us a little further up the hill. I've got some stuff on the drawing board that I'm talking about with my colleagues right now. I want to do a one-stop shop where you can go and complain about DRM in a product and have that complaint sent to the FTC, your state attorney general, your congressman, and the Better Business Bureau, so that they all get a complaint every time someone buys a thing that has DRM in it and then it bites them in the ass. And so to start building the evidentiary record and making this normative shift as well. So those are all the little pieces that I'm doing that take me one step up the hill. And then I know what's at the top of the hill, which is killing all the DRM in the
Starting point is 01:05:00 world. Right. That's your end goal. Makes sense. I like the idea of the disclosure, right? I like the label. Because how many things can you imagine that are right around you, in your office and Jared's office and my office, that have DRM that you're just not even aware of, or where it was never disclosed that they had DRM? Sure. I mean, it's not enough, but it's good.
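Mechanically, the one-stop complaint shop Cory describes is a fan-out dispatcher: one submission copied to every recipient. A hedged sketch, where the recipient list and complaint fields are hypothetical stand-ins rather than the real tool:

```python
from dataclasses import dataclass

# Hypothetical recipient list; a real tool would route to each body's own intake form or API.
RECIPIENTS = ["FTC", "state attorney general", "congressman", "Better Business Bureau"]

@dataclass
class Complaint:
    product: str
    harm: str

def fan_out(complaint, recipients=RECIPIENTS):
    """Copy one complaint to every recipient, building the evidentiary record in each venue."""
    return [{"to": r, "product": complaint.product, "harm": complaint.harm}
            for r in recipients]

filings = fan_out(Complaint("set-top box", "DRM blocked a lawful recording"))
print(len(filings))  # 4: the user files once, and every agency hears about it
```

The point of the design is asymmetry: the complainant does one unit of work, while the record accumulates in four places at once.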
Starting point is 01:05:22 You know, like, so the Amazon self-publication market, the 99-cent short self-published novels. The people who buy those books are very prolific readers. They tend to be sort of book-a-day readers. And so they're very familiar with Amazon's very cryptic interface, and so they're able to figure out which books do and don't have DRM, because the DRM-free books on Amazon don't say DRM-free. Usually they say things like, can be used on unlimited devices. And so in those marketplaces where you have very knowledgeable consumers, the DRM-free products outsell the DRM ones two to one. And so that's pretty cool. Like, I think in some marketplaces it will make a marketplace difference. It's not enough, but if we're hoping that people will differentiate themselves from the competition by being DRM-free, there has to be a way to tell which things have
Starting point is 01:06:13 DRM and which things don't in the marketplaces where they compete. So I like that you have a big master plan, and you have a bunch of small tactical moves, you know, slowly up the hill, or up the mountain. And what works, you follow up on; what doesn't work, maybe you try something new. Yeah, I'm like a one-man scrum. Yes. Well, we're very familiar with scrum, many of us. And I think you have the ear of an audience who's very open to your cause.
Starting point is 01:06:44 And so, you know, speaking personally, I very much believe in many of the things that you're saying right now. And so I'm starting to think, like, beyond limiting my Netflix and Audible usage, what would be a tactical next step for me, a guy who has different skills than you, a software developer who's, you know, day-to-day writing code and working for people and building websites and so on? What are ways that we can get involved, and what are some small tactical things that we could do to try to help push the same things forward that you're pushing forward? Well, EFF has a ton of projects on GitHub where we have open issues. So you could always address some of that and do a pull request.
Starting point is 01:07:25 We've got Privacy Badger and we've got CertBot, which is part of Let's Encrypt. So we've given away a million certificates this year. That number is probably now 2 million or more. We gave away a million certificates in like the first 90 days of CertBot running. And all of those have open bugs against them and they could all use your contributions.
Starting point is 01:07:44 Joining EFF and giving EFF money actually does something. And, you know, again, I know this sounds very self-interested. I'll point out that at the very least, EFF doesn't give me any money; I get my money from MIT for being activist in residence at the Media Lab. So it's not like I pay my rent if you give EFF money. But EFF is an amazing organization, and that's a thing you can do right now. Even just joining EFF mailing lists. I know that it feels useless to send a petition to your congressman or whatever, and there've been lots of times when it was useless. But, you know, the way that we killed SOPA was by 8 million people putting
Starting point is 01:08:19 phone calls through to Congress in 72 hours. And the reason we were able to do that is because there were so many people who joined these mailing lists, and we were able to coordinate them in a big consolidated effort that did something that politically no one thought was possible, and the reverberations from that are still being felt in DC. So, joining those mailing lists. If you are a security researcher and you want to join my petition to the W3C to protect security researchers in DRM, send me an email. My email is C-O-R-Y, cory at EFF.org, Electronic Frontier Foundation. cory@eff.org. I need to know what institutional affiliation you'd like listed, if any, and also what country you're in. We're trying to give them a sense of how diverse this is. So send me that, cory@eff.org: your name,
Starting point is 01:09:10 your institutional affiliation, your country, if you're a security researcher. If you work for a W3C member company, you really can make a difference by going to your boss and saying, there's this thing brewing at the W3C that has the potential to make free and open source software off limits for large parts of the web. We need to do something about it, and we can. There's this EFF initiative coming up. Can we ask our rep to contact Cory, cory@eff.org? And I'd be happy to take it from there.
Starting point is 01:09:38 So those are all things that you can do. I wish that there was more. I wish that there was something like Wikipedia, where it's like, just go find an entry that you're interested in; if anything seems wrong, fix it. We haven't gotten there yet, but we're going to get there. We are finding new, what Tim O'Reilly calls architectures of participation, for this all the time, and we're trying to work them out. You know, when we have this tool up for reporting DRM to the FTC and your congressman and so on, we're going to need people to contact their friends and go tell them about it. And then there's one last thing that everybody can do.
Starting point is 01:10:13 And it's to explain this stuff to other nerds. Because, like you said, there are lots and lots of people who are really deep, technologically savvy nerds for whom this stuff just doesn't really cross their radar in any meaningful way. They work with technology all day, and they would get it faster than anyone else you could possibly explain it to. Right. There's tens of thousands of EFF members. There's millions of Hacker News readers. And so if you and everyone listening to this were to go and explain this to two nerds that they know, people who are not technologically naive, people who are savvy, who do this all day long, and say, sit down, I need to explain this to you.
Starting point is 01:10:57 We need to build a future with these two principles. Computers should obey their owners. You should always be able to tell people about defects in the products that they rely on. We're going to build that future with EFF. Here's the podcast to listen to or a video of a speech or EFF's homepage. And then go back to them in a week and say, like, that conversation we had last week, I want to follow up with you and see if you did it and whether you'll go tell two other people. Now, that's like that is a big ask.
Starting point is 01:11:29 Going and talking to two people is a huge ask, but if you want something that every technologically savvy person can do to make a difference, if we could go from tens of thousands to hundreds of thousands of people who are involved in every one of these campaigns, that could be the critical mass that takes us to a better future. And so that's my other final big ask. Two people, one week follow-up, ask them to contact two people. That's a nice list, if I do say so myself. Very well played. That's a good social proof thing, too, because everybody knows at least two people, at least most people.
Starting point is 01:11:58 And that's like the MLM way of doing things, right? The only way you grow is by telling two friends. Yeah, yeah. Hate to pigeonhole the EFF with MLMs, sorry about that, huh? Well, you know, they do it because it works. I once had this hilarious lunch. Do you know a book called Getting Things Done? Of course. Oh yeah, it's an amazing book. It totally revolutionized my life. So I had this lunch with the guy who wrote it once, because he wanted to get advice on what kind of web stuff he could do. And I said, I have to ask you, where did you get the cool stuff that you put into Getting Things Done? And he said, oh, well, like, I just stole all the good stuff from Dianetics.
Starting point is 01:12:38 Because, like, if you're going to convince people to join your weird cult, you need to give them something that works at first. And so he went and stole it. Like, you know, just because a bad person has done it doesn't make it a bad thing, right? You want to steal the best tactics regardless of where you can find them. Totally agree with that. So, switching gears just a tiny little bit, it kind of goes back to the original mention of how I came to know you through your writing, which was Scroogled. You know, I'm sure you came up in the blogging era, the weblog era, right? Boing Boing and all that. And, you know, pre-Flickr, which was the game.
Starting point is 01:13:16 So you come from an era of the internet which most don't touch, like country before country was cool. Right. Exactly. And a lot of people are doing personal blogging on networks that are not self-owned anymore, right? We used to blog on our own WordPress installation, an open source installation, our own whatever. And now most people do it on Facebook, Medium, Twitter, if they're, like, tweet-ranting or whatever. But someone like you who's outspoken about DRM, intellectual property, privacy, security, all these fun things, you must have some pretty deep feelings about how
Starting point is 01:13:52 the collective conversations are taking place on networks, not owned by ourselves, basically, and how that impacts our privacy. And I'm just, it's a little bit out of left field, but it kind of goes back to the beginning, which is just curious what your thoughts are around that, around this proliferation of writing on networks that aren't owned by us. Well, you know, it's this question of like, how do you re-decentralize? So normally in the history of the web, when there's been a lot of centralization, when there's been a kind of big winner, what's happened is that the people whose content that was or whose social graph it was were able to use a rival's tool to bring whatever it was they were getting from service A into service B.
Starting point is 01:14:33 That's why Web 2.0 was so exciting. It was these mashups, where a company that had achieved success could be commodified by another company that did something even cooler. And so you had this anti-lock-in effect, through both the technological underpinnings, the code, but also through the norms. You know, why would you use a service that didn't let you mash it up with other services? The services were better together. And the silos, the walled gardens, are where the problem is. Now, you know, I can easily see a technological way to fix Facebook. I don't know if it would work, but at least I can come up with a plausible one, which is that you have a bot that logs into Facebook for you and scrapes all the stuff about Facebook that you
Starting point is 01:15:14 value every day and puts it in another context that belongs to you. And that can also merge with LinkedIn and Twitter and wherever else your friends are, and so you're looking at your dashboard. I'd sign up for that service. And then when you replied, it would put the reply in the right context for other people. You know, basically use that style of federation for Facebook. And there's no reason that you couldn't write a specialist browser that logged into Facebook as a person, that ran on that person's computer, or ran in a cloud instance that was tasked to that person or to multiple people, and that did that for them. And that would be completely awesome. It would be pro-competitive. It would be pro-market.
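At its core, the bot Cory imagines is a merge of per-service timelines into one user-owned dashboard. A minimal sketch with stubbed fetchers standing in for the real scrapers; actually logging into Facebook, and the enforceability of its terms of service, is exactly the hard part he goes on to identify:

```python
# Stub fetchers stand in for per-service scrapers; each returns (timestamp, service, text) tuples.
def fetch_facebook():
    return [(300, "facebook", "Photo from Alice"),
            (100, "facebook", "Event invite")]

def fetch_linkedin():
    return [(200, "linkedin", "Bob changed jobs")]

def dashboard(*fetchers):
    """Merge every service's items into one timeline the user controls, newest first."""
    items = [item for fetch in fetchers for item in fetch()]
    return sorted(items, key=lambda item: item[0], reverse=True)

for ts, service, text in dashboard(fetch_facebook, fetch_linkedin):
    print(service, text)
# facebook Photo from Alice
# linkedin Bob changed jobs
# facebook Event invite
```

Replying would invert the flow: route the user's reply back to the service its item came from, which is the federation step that terms-of-service enforcement currently makes legally risky.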
Starting point is 01:15:55 It would let you be in control of your data and your social services. It would make it hard for surveillance mechanisms and Facebook to be so effective. And the only thing stopping us is the enforceability of Facebook's terms of service. And so what we need to do is challenge that enforceability. So that's one of the reasons I'm very excited about the ACLU's lawsuit, which is opening the door to invalidating terms of service as enforceable legal contracts and saying that users have the right to take their own data and their social graph and their interactions and use them in ways that are best suited to them, not to a giant corporation. And if we can do that, then I think Facebook's
Starting point is 01:16:38 days are numbered. And so I think that this is an area in which we could harness code and law, technology and norms, to make a better world. And, you know, when you're thinking about what organizations you're going to tithe to, to hedge against the fact that you're giving money to corporations that are destroying the future, the ACLU should be one of them. Not just for that. It's also an election season, and there's no one who's doing better work on ending voter suppression than the ACLU. But also for that. One way to close this show, Cory, is to offer it back to you. Obviously we've enjoyed this conversation with you. We think you have a unique perspective: one, as a writer; two, as a father; and then just as someone who cares about the future of where we're all trying to go. You have some really interesting perspectives, obviously. But is there anything else that we haven't covered well enough? Something that, if you were in front of the room of hackers, as you will be soon at OSCON, giving the keynote, this is a chance to maybe share something we
Starting point is 01:17:33 didn't ask you directly. What do you want to share? What would you like to close with, in terms of the hackers, the open source people out there doing all the awesome stuff they're doing on GitHub and Bitbucket and everywhere else to move open source forward? Well, I guess the last thing I'd like to say is that the issue here is not, like, whether information wants to be free, or whether the internet should or shouldn't be free, or whether that's the most important issue. I know for sure that there are things that are way more important than any of those things, right? There's fundamental issues of economic justice, there's climate change, there's questions of race and gender and gender orientation that are a lot more
Starting point is 01:18:14 urgent than the future of the internet. But the thing is that every one of those fights is going to be won or lost on the internet, right? If you think that we can fight climate change without having a networked public that coordinates its efforts, that holds companies and governments to account, that does citizen science, then you're nuts. And so the reason I fight to keep the internet free and open is not because information wants to be free. Information doesn't want anything. It's an abstraction. But because people want to be free. And the internet is the nervous system of the 21st century. And the way that you make people free in the 21st century is by seizing the means of information, by having a
Starting point is 01:18:54 free, fair, and open information infrastructure, the battleground on which all those other fights can be won or lost. Well said. Well, Cory, as I mentioned, I found you through Scroogled, but have loved your books between now and then. Obviously I'm going to miss part of your keynote, face to face at least, but I'll hit it in the playback, so I can't wait to hear that. Certainly appreciated you sharing your time here today. As we mentioned before, to the listeners: you can go to OSCON too. We have a code, PCCL20, which will get you 20% off registration. Go to oscon.com/uk.
Starting point is 01:19:30 We did this show in partnership with O'Reilly. So thanks to O'Reilly for working with us and getting people like Cory on this show, and Eli Bixby. As I said earlier, I said Ben Bixby because I know Ben Bixby and that's the name that popped in my head. Didn't mean it.
Starting point is 01:19:43 Sorry, Eli. Love the conversation on TensorFlow, but we will be at OSCON London. Actually, when I say we, I mean Jared because I don't cross the ocean like that so easily. I'm not going, long story short, but meet Jared. Go on
Starting point is 01:19:57 there. Make sure you say hi. But fellas, that's it for this show today, so let's call it done and say goodbye. Alright. Goodbye. Thanks again, Cory. All right. Thanks. We'll see you next time.
