The Joe Rogan Experience - #1572 - Moxie Marlinspike

Episode Date: December 1, 2020

Computer security researcher Moxie Marlinspike is the creator of the encrypted messenger service Signal, and co-founder of the Signal Foundation: a nonprofit dedicated to global freedom of speech through the development of open-source privacy technology.

Transcript
Starting point is 00:00:00 The Joe Rogan Experience. Train by day, Joe Rogan podcast by night, all day. So, like, we're gonna just sit here and talk for a long time, huh? Yeah. We're started right now. It has begun. Yes. What was your question, though? I was gonna ask, you know, like, what if something comes up? Like, what if you need to pee or something? Oh, you can totally do that. Yeah, we'll just pause and you can run out. That happens, don't sweat it. All right, I want you to be comfortable. Have you ever done a podcast before? First time. Really? First time. Um, so tell me where Signal came from. What was the impetus? How did it get started?
Starting point is 00:00:48 It's a long story. We have plenty of time. Okay, well, you know, I think ultimately what we're trying to do with Signal is stop mass surveillance, to bring some normality to the internet, and to explore a different way of developing technology that might ultimately serve all of us better. We should tell people, maybe people just tuning in: Signal is an app that is... explain how it works and what it does.
Starting point is 00:01:17 I use it. It's a messaging app. It's a messaging app, yeah. Fundamentally, it's just a messaging app. From lofty aspirations. Yeah. Yeah, it's a messaging app, but it's somewhat different. When you send somebody a message, I think most people's expectation is that when they write a message and they press send, the people who can see that message are the person who wrote the message and the intended recipient. But that's not actually the case.
Starting point is 00:01:56 There's, you know, tons of people who are in between who are monitoring these things, who are collecting data and information. And Signal is different because we've designed it so that we don't have access to that information. So when you send an SMS, that is the least secure of all messages. So if you have an Android phone and you use a standard messaging app and you send a message to one of your friends, that is the least secure of all when it comes to security, right? Yeah, it's a low bar. That's the low bar, yeah. And then iPhone? What does it use? Yeah, so iPhones use iMessage, which is slightly more secure, but it gets uploaded to the cloud, and it's a part of their iCloud service. So it goes to some servers and then goes to the other person.
Starting point is 00:02:49 It's encrypted along the way, but it can be intercepted. Yeah, I mean, okay. Like Jeff Bezos' situation. Yeah, like Jeff Bezos' situation, exactly. Fundamentally, there's two ways to think about security. One is computer security, this idea that we'll somehow make computers secure. We'll put information on the computers, and then we'll prevent other people from accessing those computers. And that is like a losing strategy that people have been losing for 30 years.
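The alternative Moxie goes on to describe, securing the information itself so that the computers in the middle only ever relay ciphertext, can be sketched as a toy in Python. This is a deliberately simplified illustration under stated assumptions (a one-time pad plus an HMAC, with keys assumed to be shared out of band), not Signal's actual protocol; Signal uses X3DH key agreement and the Double Ratchet, which this does not attempt to reproduce.

```python
# Toy sketch of the "information security" model: the relay server only
# ever sees ciphertext, so compromising the server reveals nothing.
# One-time pad + HMAC for illustration only; NOT Signal's real protocol.
import hashlib
import hmac
import secrets

def encrypt(pad: bytes, mac_key: bytes, plaintext: bytes) -> bytes:
    assert len(pad) >= len(plaintext), "one-time pad must cover the message"
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, pad))
    tag = hmac.new(mac_key, ciphertext, hashlib.sha256).digest()
    return ciphertext + tag

def decrypt(pad: bytes, mac_key: bytes, blob: bytes) -> bytes:
    ciphertext, tag = blob[:-32], blob[-32:]
    expected = hmac.new(mac_key, ciphertext, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("message was tampered with in transit")
    return bytes(c ^ k for c, k in zip(ciphertext, pad))

# Alice and Bob share keys out of band; the server never sees them.
pad, mac_key = secrets.token_bytes(64), secrets.token_bytes(32)

server_mailbox = []  # the "cloud": it only ever holds ciphertext
server_mailbox.append(encrypt(pad, mac_key, b"meet at noon"))

# An attacker who compromises the server sees only random-looking bytes...
assert b"meet at noon" not in server_mailbox[0]
# ...but Bob, holding the keys, recovers the message.
assert decrypt(pad, mac_key, server_mailbox[0]) == b"meet at noon"
```

The point is the final two assertions: the relay can be fully compromised without exposing the message, which is why the security of the computers in between stops mattering.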
Starting point is 00:03:18 Information ends up on a computer somewhere, and it ends up compromised in the end. The other way to think about security is information security, where you secure the information itself, so that you don't have to worry about the security of the computers. You could have some computers in the cloud somewhere, information's flowing through them, and people can compromise those things and it doesn't really matter, because the information itself is encrypted. And so things like SMS, the iMessage cloud backups, most other messengers, Facebook Messenger, all this stuff, they're relying on this computer security model, and that ends up disappointing
Starting point is 00:03:53 people in the end. So why did you guys create it? What was unsatisfactory about the other options that were available? Well, because the way the internet works today is, like, insane. You know, fundamentally, I feel like private communication is important because I think that change happens in private. Everything that is fundamentally decent today started out as something that was a socially unacceptable idea at the time. You look at things like the abolition of slavery, legalization of marijuana, legalization of same-sex marriage, even constructing the Declaration of Independence. Those are all things that required a space for people to process ideas outside the context of everyday life. And those spaces don't exist on the internet today. And I think it's kind of crazy the way the internet works today. You know, that like,
Starting point is 00:04:53 if you imagined, you know, every moment that you were talking to somebody in real life, there was somebody there just with a clipboard, a stranger, taking notes about what you said, in real life, there was somebody there just with a clipboard, a stranger, taking notes about what you said. That would change the character of your conversations. And I think that in some ways, like, we're living through a shortage of brave or bold or courageous ideas, in part because people don't have the space to process what's happening in their lives outside of the context of everyday interactions. That's a really good way to put it, because you've got to give people a chance to think things through. But if you do that publicly, they're not going to. They're going to sort of, like basically what you see on Twitter.
Starting point is 00:05:40 If you stray from what is considered to be the acceptable norm or the current ideology or whatever opinions you're supposed to have on a certain subject, you get attacked, ruthlessly so. So you see a lot of self-censorship, and you also see a lot of virtue signaling, where people sort of pretend that they espouse a certain series of ideas because that'll get them some social cred. Yeah, exactly. I think that communication in those environments is performative. You're either performing for an angry mob, you're performing for advertisers, you're performing for the governments that are watching. And I think also the ideas that make it through are kind of tainted as a result. Did you watch any of the online hearing stuff that was happening over COVID? Where, like, city councils and stuff
Starting point is 00:06:38 were having their hearings online? No, I did not. It was kind of interesting to me because it's like, you know, they can't meet in person, so they're doing it online. And that means that the public comment period was also online, you know? And so it used to be that, like, you know, if you go to a city council meeting, they have a period of public comment where, you know, people could just stand up and say what they think, you know? And, like, ordinarily, it's like, oh, you got to go to City Hall.
Starting point is 00:07:00 You got to, like, wait in line. You got to sit there, you know? But then when it's on Zoom, it's just sort of like anyone can just show up on the Zoom thing. You know, they just dial in and they're just like, here's you gotta sit there you know but then when it's on zoom it's just sort of like anyone can just show up and right on the zoom you know they just dial in and they're just like here's what i think you know and uh you know it was kind of interesting because particularly uh when a lot of the police brutality still was happening in los angeles i was i was watching those city council hearings and you know people were just like you know you know they were just calling you know be like fuck, I yield the rest of my time, fuck you, you know. It was just like really brutal and not undeservedly so.
Starting point is 00:07:32 And, you know, what was interesting to me was just watching the politicians basically, you know, who just had to sit there and just, they were just like, take it. And it was just like, you know, you get three minutes and then there's someone else to get you know they're just like okay and now we'll hear from you know like and you know watching that you sort of realize that it's like um to be a politician you have to just sort of fundamentally not really care what people think of you you know uh you have to fundamentally uh just be comfortable sitting you sitting and having people yell at you in three-minute increments for an hour or whatever. And so it seems like what we've sort of done is bred these people who are willing to do that. And in some ways, that's a useful characteristic.
Starting point is 00:08:19 But in other ways, that's the characteristic of a psychopath. Yes. Yes. characteristic of a psychopath you know yes yes and i think you know what we're seeing is that that also extends outside of those environments that like to do anything ambitious today requires that you just are comfortable with that kind of um feedback like trump's tweets if you watch you know if you look at twitter and look at any of Trump's tweets, when he tweets, watch what people say. It's ruthless. They go crazy. They go so hard at him.
Starting point is 00:08:50 So I'm assuming he doesn't read them. I'm assuming he just, or maybe he does and just doesn't say anything, but he doesn't go back and forth with people at least. No, but, and I'm, I think, you know, Trump is perfectly capable of just not caring. You know, just like people, like, you know, grazing is just like, yeah, whatever, you know, I'm the best. They don't, you know, Trump is perfectly capable of just not caring. You know, there's like people like, you know, Grayson is just like, yeah, whatever. You know, I'm the best. They don't, you know. And like that's, you know, that's politics. But I think, you know, the danger is when that, you know, to do anything ambitious, you know, outside of politics or whatever, you know, requires that you're capable of just not caring, you know, what people think or whatever because everything is happening in public. I think you made a really good point
Starting point is 00:09:26 in that change comes from people discussing things privately because you have to be able to take a chance. You have to be daring, and you have to be able to confide in people, and you have to be able to say, hey, this is not right, and we're going to do something about it if you do that publicly the powers that be that do not want change in any way shape or form
Starting point is 00:09:52 they'll they'll come down on you i mean this is essentially what edward snowden was warning everyone about when he decided to go public with all this nsa. He was saying, look, this is not what we signed up for. Like, someone's constantly monitoring your emails, constantly listening to phone calls. Like, this is not this mass surveillance thing. It's very bad for just the culture of free expression, just our ability to have ideas and to be able to share them back and forth and vet them out.
Starting point is 00:10:24 It's very bad. Yeah. Yeah. Yeah. I mean, I think when you look at the history of that kind of surveillance, there are a few interesting inflection points, you know, like at the beginning of, you know, the internet as we know it in like the early to mid 90s, there were these like DOD efforts to do mass surveillance, you know, and they were sort of open about what they were doing. And, you know, one of them was this program
Starting point is 00:10:49 called Total Information Awareness. And it was, they were trying to start this office, I think called the Total Awareness Office or something within the DoD. And the idea was like, they're just gonna like collect information on all Americans and everyone's communication and just stockpile it into these databases and then they would use that to you know
Starting point is 00:11:07 mind those things for information it was sort of like you know they're their effort to get in on this at the beginning of the information age and you know it was ridiculous you know it's like they called it total information awareness they had a logo that was like you know, the pyramid with the eye on top of it. Oh, yeah. This is their logo. Oh, God. The pyramid with the eye, like, casting a beam on the earth.
Starting point is 00:11:34 That bit of Latin there means knowledge is power. Oh, wow. And interesting, this program was actually started by John Poindexter, of all people, who was involved in the Iran-Contra stuff, I think. Really? Yeah, yeah. And he, like, went to jail for a second and then was pardoned or something. So, anyway, you know. It's just so fucked up that these people are in charge of anything.
Starting point is 00:11:54 I know, but what's also just kind of comical is that, like, they were like, this is what we're going to do. Look at how crazy this is. This is our plan, you know. And people were like, I don't think so. What year was this? This was like early, mid-90s. Look at this.
Starting point is 00:12:08 Authentication, biometric data, face, fingerprints, gate, iris. Your gate. So they're going to identify people based on the way they walk? I guess your gate is that specific. Then automated virtual data repositories, privacy and security. This is fascinating. Because if you look at, I mean, obviously no one thought of cell phones back then. Exactly, right.
Starting point is 00:12:34 So this is like kind of amateurish, right? So they're like, this is what we're going to do, you know? And people are like, I don't think so. Even like Congress is like, guys, I don't think we can approve this. Like you need a better logo. Yeah, for sure. But it's just this whole flowchart. Is that what this would be?
Starting point is 00:12:53 What do you call something like this? What are these called? Flowchart, I guess, sort of. Designed to dazzle you into approving their funding. It's like baffling to figure out what it is. First of all, what are all those little color tubes? Those little ones, those little cylinders. Those are data silos. That's the universal.
Starting point is 00:13:10 They're all different colors. There's purple ones. What's in the purple data? Well, gate. That's Prince. That's where gate lives, yeah. It's all Prince's information. Okay, so this stuff all sort of got shut down, right?
Starting point is 00:13:21 Yeah. They're like, okay, we can't do this. And then instead, what ended up happening was, like, data naturally accumulated in different places, you know, that, you know, like back then, if you had been, you know, what they were trying to do is be like, our proposal is that everyone carry a government-mandated tracking device at all times.
Starting point is 00:13:40 Like, what do you guys think? You know, it'll make us safer, you know? And people were like, no, I don't think so, you know? But instead, everyone ended up just carrying cell phones at all times we are tracking your location and reporting them into centralized repositories that government has access to you know and so you know this this sort of like oblique surveillance infrastructure ended up emerging and that was what you know people sort of knew about but you know
Starting point is 00:14:02 didn't really know and that's what Snowden revealed. It was like, we don't have this. Instead, it's like all of those things are happening naturally. Gate detection, fingerprint, all this stuff is happening naturally. It's ending up in these places. And then governments are just going to those places and getting the information. And then I think the next inflection point was really Cambridge Analytica.
Starting point is 00:14:29 That was a moment where I think people were like... Explain that to people, please. Cambridge Analytica was a firm that was doing big data, that was doing like big data, using big data in order to forecast and manipulate people's opinions. And in particular, they were involved in the 2016 election. And it was sort of, you know, so it's like, you know, what's done revealed was Prism,
Starting point is 00:15:02 which was the cooperation between the government and these places where data was naturally accumulating like facebook google etc you know and the phone company and uh cambridge analytica i think was the moment that people were like oh there's like also sort of like a private version of prism you know that's like not just governments but like the data is out there and other people who are motivated are using that against us you know know? And I, so I think, you know, in the beginning it was sort of like, oh, this could be scary. And then it was like, oh, but you know, we're just using these services. And then people were like, oh wait, the government is, you know, using the data that we're, you know, sending to these services. And then people were like, oh wait, like anybody can use the data
Starting point is 00:15:42 against us. And they were like, oh shit. You, it's like I think things went from like, I don't really have anything to hide to like, wait a second. These people can predict and influence how I'm going to vote based on what kind of jeans I buy, you know. And, you know, and then, you know, sort of where we are today, where it's like, I think people are also beginning to realize that the companies themselves that are doing this kind of data collection are also not necessarily acting in our best interest. Yeah, for sure. There's also this weird thing that's happening with these companies that are gathering the data, whether it's Facebook or Google. I don't think they ever set out to be what they are.
Starting point is 00:16:24 They started out, like Facebook like facebook for example we were talking about it before it was really just sort of like a social networking thing and this was in the early days it was a business i don't think anybody ever thought it was going to be something that influences world elections in a staggering way like especially in other parts of the world, you know where Facebook becomes the sort of de facto messaging app on your phone when you get it I mean it has had massive impact on on politics on shaping culture on Mean even the genocide has been connected to Facebook in certain countries. You know, it's weird that this thing that is in, I don't know, how many different languages Facebook,
Starting point is 00:17:13 how many different languages does Facebook operate under? All of them. Yeah. I mean, this was just a social app. It was from Harvard, right? They were just connecting students together. Wasn't that initially what the first iteration of it was from harvard right they were just connecting students together wasn't that initially what the the first iteration of it was yeah okay i mean i think you can say like no one anticipated that these things would be this significant um but i also think that there's
Starting point is 00:17:37 you know i think ultimately like what we end up seeing again and again is that like bad business models produce bad technology you know that like the point you know what we end up seeing again and again is that like bad business models produce bad technology, you know, that like the point, you know, what we were talking about before, like the point, you know, Mark Zuckerberg did not create Facebook because of his deep love of like social interactions. Like he did not have some like deep sense of like wanting to connect people and connect the world. That's not his passion. You know, Jeff Bezos did not start Amazon because of his deep love of books. You know, these companies are oriented around profit. You know, Jeff Bezos did not start Amazon because of his deep love of books. You know, these companies are oriented around profit. You know, they're trying to make money. And they, you know, they're subject to external demands as a result.
Starting point is 00:18:17 They have to, like, grow infinitely, you know, which is insane. But that's the expectation. And, you know, so what we end up seeing is that the technology is not necessarily in our best interest because that's not what it was designed for to begin with that is insane that companies are expected to grow infinitely yeah I mean they're literally expect what is your expectation to take over everything to have all the money and then one day and then more you know if yeah if we extrapolate we we anticipate we will have all the money there will be no other money if you keep going that's what has to happen how can you just grow infinitely that's bizarre
Starting point is 00:18:55 yeah and that's why i mean you know the i think you know the silicon valley obsession with china is yeah you know a big part of that where people they they're just like, well, that's a lot of people there. Yes. That's a lot of people there. You can just keep growing. Yeah. There was a fantastic thing that I was reading this morning. God, I wish I could remember what the source of it was. They were essentially talking about how strange it is that there are so many people that are so anti-human trafficking, they're so pro-human rights, they're so anti-slavery, they're so... All the powerful values that we ascribe,
Starting point is 00:19:38 that we think of when we think of Western civilization, we think of all these beautiful values but then almost all of them rely on some form of slavery to get their electronics oh yeah and it was just uh eight grams of cobalt in your pocket over there yeah mined by actual child slaves someone had to stick us like literally they're they're getting it out of the ground digging into the dirt to get it out of the ground we were talking about on the podcast earlier they were like is there a way that this could that is there a future that you could foresee where you could buy a phone that is guilt-free like if i buy a pair of shoes like i bought a pair of boots from my friend Jocko's company. He's got a company called Origin. They make handmade boots. And like it's made in a factory in Maine.
Starting point is 00:20:30 You can see a tour of the factory. These guys are stitching these things together. And it's a real quality boot. And I'm like, I like that I could buy this. I know where it came from. I could see a video of the guys making it. Like this is a thing that I can feel like I am giving them money. They're giving me a product. Like there a there's a nice exchange it feels good i don't feel like that with
Starting point is 00:20:50 a phone with a phone i have this like bizarre disconnect i try to pretend that i'm not buying something that's made in a factory where there's a fucking net around it because so many people jump to their deaths that instead of trying to make things better and say we just we're gonna put nets up catch these fuckers put them back to work yeah is there is it possible we could want that we would all get together and say hey enough of this shit will you make us a goddamn phone that doesn't make me feel like i'm supporting slavery yeah i mean i think you know you're asking maybe you know too much i think you're asking i think that's the same as asking like will civilization ever decide that it wants to that we collectively want to have like a sane and sustainable way of living yeah um sane and sustainable and i hope the answer is. I think a lot of us do.
Starting point is 00:21:45 You do, right? I do. You don't want to buy a slave phone, right? Yeah. I mean, but okay. So, you know, I feel like it's difficult to have this conversation without having a conversation about capitalism, right? Because, like, ultimately, you know, what we're talking about is, like, externalities that the prices of things don't incorporate their true cost uh you know that like you know we're destroying the planet for plastic trinkets and reality television uh you know like we can have we can have the full
Starting point is 00:22:13 conversation if you like but let's let's start with phones though let's start with because when most people know the actual the the the from the origin of the materials, like how they're coming, how they're getting out of the ground, how they're getting into your phone, how they're getting constructed, how they're getting manufactured and assembled by these poor people. When most people hear about it, they don't like it. It makes them very uncomfortable. But they just sort of go, la, la, la. They just plug their ears and keep going and buy the latest iPhone 12
Starting point is 00:22:51 because it's cool. It's new. What would they do instead? Well, if there was an option. So, like, if you have a car that you know is being made by slaves or a car that's being made in Detroit by union workers? Wouldn't you choose the car,
Starting point is 00:23:09 as long as they're both of equal quality? I think a lot of people would feel good about their choice if they could buy something that, well, no, these people are given a very good wage, they have health insurance, and they're taken care of, they have a pension plan. There's all these good things that we would like to have ourselves
Starting point is 00:23:26 that these workers get. So you should probably buy that car. Why isn't there an option like that for a phone? We looked at this thing called a fare phone. We're going over it. You can't even fucking buy it in America. Like, no, America has no options for fare. They only have them in, like, Holland
Starting point is 00:23:43 and a couple other european countries yeah i mean i i think uh yeah maybe it's good to you know start with the question of phones i i think if you really examined like most of the things in your everyday life um there is an apocalyptic aspect to them yes i mean even you know even agriculture you know it's just like you know The sugar you put in your coffee. It's like I've been to the sugar beet harvest. It's apocalyptic. It's like – so I think there's just like an aspect of civilization that we don't usually see or think about. And it's a non-conscious – not non-conscious, but i mean conscious capitalism would be the idea that
Starting point is 00:24:26 you are willing you want to make a profit but you only want to make a profit if everything works like the idea of me buying my shoes from origin like knowing okay these are the guys that make it this is how they make it i got this is a good this makes me feel good i like this if there was that with everything like if you if you buy a home from a a guy who you know built the home like this is the man this is the this is the the the chief construction guy this these are the carpenters this is the architect oh okay i get it this all makes sense yeah i mean and i think that's the image that a lot of companies try to project. You know what I mean? Like, you know, even Apple will say,
Starting point is 00:25:09 you know, it's like designed by Apple in California. Sure, designed. And I think that's the same as like the architect and the builders that you know, but those materials are coming from somewhere. That's true. The wood is coming from somewhere. And, you know, it's not just wood.
Starting point is 00:25:27 There's like petrochemicals. Like, you know, that whole supply chain is apocalyptic and you're never going to meet all of those people. And so, you know, I think, sure, they're, I think it's, you know, it's difficult to be in that market. Like if you want to be in the market of conscious capitalism or whatever, because it's a market for lemons, um,
Starting point is 00:25:46 that it's cause it's so easy to just put a green logo on whatever it is that you're creating. Uh, and no one will ever see the back of the supply chain. That's a sad statement about humans. You know, that we're, that this is how,
Starting point is 00:26:04 I mean, this is how we always do things if you let us if you leave us alone if there's a way you know i mean privacy is so important when it comes to communication with individuals and this is why you created signal but when you can sort of hide all the various elements that are involved in all these different process all these different things that we buy and use and then you know as you said they're apocalyptic which is a great way of describing it if you're i mean you're at the ground watching these kids pull coltan out of the ground in africa i mean you'd probably feel really sick about your cell phone yeah and um yeah but i don't think i think it's a little more complicated than to say that just like
Starting point is 00:26:54 humans um are terrible or whatever no i don't think humans are terrible i think humans are great but i think if you put humans together and you give them this diffusion of responsibility that comes from a corporation and then you give them a mandate you have to make as much money as possible every single year and then you have shareholders and you have all all these different factors that will allow them to say well i just work for the company you know i'm not i'm not it's not my call you know i'm i just you know you got the guy carving up a stake saying, listen, I'm so sorry that we have to use slaves. But look, Apple's worth $5 trillion. We've done a great job for our shareholders.
Starting point is 00:27:32 At the end of the line, follow it all the way down to the beginning and you literally have slaves. Yeah, I fundamentally agree. You know, that's anytime you end up in a situation where, like, most people do not have the agency they would need in order to direct their life that they would the way that they would want, you know, direct their life so that we're living in a sane and sustainable way. That, yeah, I think is a problem. And I think that's the situation we're in now you know and honestly i feel like you know the stuff that we were talking about before of you know people um you know sort of being mean online um is a reflection of that you know that um that's the power that that's the only power that people have you know it's like you know like you know the only thing that if the only thing you, that's the only power that people have. You know, it's like, you know,
Starting point is 00:28:25 like, you know, the only thing that, if the only thing you can do is call someone a name, you're going to call them a name, you know? Right. And I think that it's like,
Starting point is 00:28:34 yeah, it's unfortunate. Um, but I think it is also, um, unfortunate that most people have so little agency and control over the way that the world works, that that's all they have to do. And I guess you would say also that the people that do have power that are running these corporations don't take into account what it would be like to be the
Starting point is 00:29:00 person at the bottom of the line, to be person that is there's no there's no discussion there's no like board meetings like hey guys what are we doing about slavery well no i don't i i'm sure that i'm sure that they do talk about that uh honestly but they've done nothing uh they've probably done something or they've probably done what they think is something but i think it's um even you know even the ceo of a company is someone who's just doing their job at the end of the day. You know, that they can't, they don't have ultimate control and agency over how it is that a company performs because they are accountable to their shareholders, they're accountable to the board. Right. You know, that it's like, I think there is a tendency for people to look at what's happening, particularly with technology today, and think that it's the fault of the people you know like the the leaders of these companies you know
Starting point is 00:29:49 and you know like I think it goes both ways you know Slavoj Zizek always talks about when you look at the old political speeches you know if you look at the fascist leaders you know they would give a speech, and when there was a moment of applause, they would just sort of stand there and accept the applause, because in their ideology, they were responsible for the thing that people were applauding, you know. And if you watch the old communist leaders, you know, like when Stalin would give a speech, and he would say something, and there would be a moment of applause, he would also applaud. Because in their ideology of historical materialism,
Starting point is 00:30:27 they were just agents of history. They were just the tools of the inevitable. It wasn't them. You know, they had just sort of been chosen as the agents of this thing that was an inevitable process. And so they were applauding history, you know. And sometimes when I see, like, the CEOs of tech companies give speeches and people applaud, I kind of feel like they should also be applauding.
Starting point is 00:30:46 That it's not them. Right. Technology has its own agency, its own force that they're the tools of in a way. That's a very interesting way of looking at it. Yeah, they are the tools of it. And at this point, if we look at where we are in 2020, it seems inevitable. It seems like there's just this unstoppable amount of momentum behind innovation and behind just the process of creating newer, better technology
Starting point is 00:31:20 and constantly putting it out and then dealing with the demand for that newer, better technology, and then competing with all the other people that are also putting out newer, better technology. Yeah. Look what we're doing. We are helping the demise of human beings. Because I feel, and I've said this multiple times and I'm going to say it again, I think that we are the electronic caterpillar that will give way to the butterfly. We don't know what we're doing. We are putting together something that's going to take over. We're putting together some ultimate being, some symbiotic connection between humans and technology,
Starting point is 00:32:03 or literally an artificial version of life. Not even artificial, a version of life constructed with silicon and wires and things that we're making. If we keep going the way we're going, we're going to come up with a technology that is going to be Ex Machina. It's going to pass the Turing test,
Starting point is 00:32:23 and it's going to literally be something that's better than what we are, a better version of a human being. I think we're a ways away. Yeah, we're a ways away, but how many ways? 50 years? The moment that I can put my hand under the automatic sink thing and have the soap come out without waving around, then I'll be worried.
Starting point is 00:32:45 That's simplistic, sir. How dare you? Well, here's a good example. You know the Turing test. The Turing test is if someone sat down with, like in Ex Machina. Remember that was one of my all-time favorite movies where the coder is brought in to talk to the woman. He falls in love with a robot lady,
Starting point is 00:33:04 and she passes the Turing test, because he's in love with her. I mean, he really can't differentiate. In his mind, that is a woman. That's not a robot. Was it Alan Turing? What was the gentleman's name? Alan Turing.
Starting point is 00:33:19 Alan Turing, who came up with the Turing test. You know, he was a gay man in England in the 1950s, when it was illegal to be gay, and they chemically castrated him because of that, and he wound up killing himself. That's only 70 years ago. Oh yeah. Yeah. It's fucking insane. I mean, just think that this man, back then, was thinking there is going to be a time where we will have some kind of a creation where we imitate life, the current life that we're aware of. We're going to make a version of it that's going to be indistinguishable from the versions that are biological. And that very guy, by whatever twisted ideas of what human beings should or shouldn't do, whatever expectations of culture at the time, is forced to be chemically castrated and winds up committing suicide
Starting point is 00:34:20 just by the hand of humans. Fucking strange, man. Like, really strange. I mean, worse than strange. Oh yes, horrible. Yeah. But so bizarre that this is the guy that comes up with the test of, how do we know? When you have an artificial person that passes for a person, then what kind of rights do we give this person? What is it like if it has emotions? What if it cries? Are you allowed to kick it? You know, what do you do? But I made it. I turned it on. I could fucking torture it. But you can't. It's screaming. It's in agony. Don't do that.
Starting point is 00:35:07 Yeah. I mean, you know, I don't think about this stuff that often, but it is, you know, it's an empirical test, right? So it's like, it's a way to avoid having to define what consciousness is, right? Which is kind of strange. We're conscious beings and we don't actually really even know what that means. And so instead we have this empirical test where it's just sort of like, well, if you can't tell the difference,
Starting point is 00:35:26 um, you know, without being able to see it, then, uh, then we'll just call that. I think that is really a lot closer than we think. I think that's,
Starting point is 00:35:38 I think that's 50 years. I think that if everything goes well, I think I'm going to be a 103 year old man on my dying bed being taken care of by robots. And I'm going to feel real fucked up about that. I'm going to be like, oh, my God. I can't believe this. I'm going to leave. And then all the people that I knew that are alive, they're the last of the people.
Starting point is 00:35:59 This is it. The robots are going to take over. They're not even going to be robots. They're going to come up with some cool name for them. robots are going to take over. They're not even going to be robots. They're going to come up with some cool name for them. Yeah.
Starting point is 00:36:06 I mean, I think that most of what I see in, like, the artificial intelligence world right now is not really intelligence. You know, it's just matching. It's like, you show a model an image of 10 million cats, and then you can show it an image, and it will be like, I predict that this is a cat. And then you can show it an image of a truck, and it'll be like, I predict that this is not a cat.
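The "matching" being described here can be sketched in a few lines. This toy nearest-centroid classifier over invented two-number "features" is purely illustrative; real image models learn from millions of labeled pixels, not hand-picked numbers:

```python
# Toy illustration of "matching"-style classification: label a new example
# by whichever class centroid its feature vector is closest to. The
# "features" here are invented numbers, not real image data.

def centroid(vectors):
    """Average a list of equal-length feature vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def distance(a, b):
    """Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def predict(example, centroids):
    """Return the label of the nearest class centroid."""
    return min(centroids, key=lambda label: distance(example, centroids[label]))

# Pretend features (ear pointiness, fur texture) -- entirely made up.
cats = [[0.9, 0.8], [0.8, 0.9], [0.95, 0.85]]
trucks = [[0.1, 0.05], [0.05, 0.1], [0.0, 0.2]]
centroids = {"cat": centroid(cats), "not a cat": centroid(trucks)}

print(predict([0.85, 0.9], centroids))  # a cat-like example
print(predict([0.1, 0.1], centroids))   # a truck-like example
```

The point of the toy: nothing here "understands" cats; it only measures closeness to examples it has already seen, which is the distinction being drawn in the conversation.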
Starting point is 00:36:29 And I don't... I think there's one way of looking at it that's like, well, you just do that with enough things enough times and that's what intelligence is. But I kind of hope not. And the way that it's being approached right now, I think, is also dangerous in a lot of ways, because what we're doing is just feeding information about the world into these models, and that just encodes the existing biases and problems with the world into the things that we're creating. And that, I think, has negative results. But I mean, yes, it's true, this ecosystem is moving, and it's advancing, whatever. And the thing that I think is unfortunate is that right now that ecosystem, this sort of really capital-driven investment startup ecosystem, has a
Starting point is 00:37:17 monopoly on groups of young people trying to do something ambitious together in the world, in the same way that I think it's unfortunate that grad school has a monopoly on groups of people learning things together, you know? And so part of what we're trying to do different with Signal is it's a nonprofit, because we want to be for something other than profit. And so we're trying to explore a different way of groups of people doing something mildly ambitious. Has anyone come along and gone,
Starting point is 00:37:46 I know it's a nonprofit, but would you like to sell? Well, you can't do that. It's like the structure, there's nothing to sell. Right. It's kind of amazing though,
Starting point is 00:37:56 that you guys have figured out a way to create like basically a better version of iMessage that you could use on Android. Because one of the big complaints about Android is the lack of any encrypted messaging services. Or just good messaging services. Yeah, they've just recently come out with their own version of iMessage, but it kind of sucks. You can't do group chats. There's a lot of things you can't do with it, and it's encrypted.
Starting point is 00:38:22 But is the new – and I don't think it's rolled out everywhere too, right? It's not everywhere and i don't think it's rolled out everywhere too right it's not everywhere i don't think it's rolled out at all actually oh you could get a beta is that what it is yeah i don't i don't know what the you're right so it's like um that you know android so google for android makes an app called messages which is just the standard sms texting app and they put that on the phones that they make, like the Pixel and stuff like that, you know. And then there's the rest of the ecosystem, you know, there's like, you know, Samsung devices, Huawei devices, you know, all this stuff. And it's sort of, it depends, you know, what's on those things. And so they've been trying to move from this very old standard called SMS
Starting point is 00:39:02 that you mentioned before to this newer thing called RCS, which actually, I don't know what that stands for. In my mind I was thinking of it as standing for "too little, too late." But they're trying to move to that. And so they're doing that on the part of the ecosystem that they control, which is the devices that they make and sell, and they're trying to get other people on board as well. And originally RCS didn't have any facility for end-to-end encryption, and they're actually using our stuff, the Signal Protocol, in the new version of RCS that they're shipping. So I think they've announced that, but I don't know if it's on or not. I have two bones to pick with you guys. Yeah. Two things that I don't necessarily like.
Starting point is 00:39:46 One, when I downloaded Signal and I joined, basically everyone that I'm friends with who was also on Signal got a message that I'm on Signal. So you ratted me out. You ratted me out to all these people that are in my contact list. Why do you want it to be difficult for people to communicate with you privately? Well, me personally, because there's a lot of people that have my phone number that I wish didn't have my phone number.
Starting point is 00:40:10 And now all of a sudden they got a message from me that I'm on Signal. And then they send me a message. Hey, I'd like this from you. I want you to do that for me. How about call me about this? I got a project. So I just wish you didn't like rat me out. I wish there was like a way that you could say,
Starting point is 00:40:26 do you want everyone to know that you just joined Signal? Yes or no? I'd say no. Another one. Those little dot, dot, dots, the ellipsis. Yeah. Yeah. Can you shut that off?
Starting point is 00:40:36 Because I don't want anybody to know that I'm responding to a text. You can turn it off. Can you turn that off? Oh, okay. So it's in the settings? Yeah, privacy settings. You can turn off typing indicators.
Starting point is 00:40:44 You can turn it off. That's a big problem with's in the settings? Yeah, privacy settings. Typing indicators you can turn off. Read receipts you can turn off. That's a big problem with iMessage. People get mad at you. Like if you see the dot dot dots and then there's no message. Like, hey, you were gonna respond and then you didn't. Like, why don't you just relax? Just go about your life and pretend that I didn't text
Starting point is 00:41:00 you back yet. Because I will. But with the dot dot dots, people are going, oh, it's coming, here comes the message, and then there's no message. Yeah, you can turn that off. You can also turn off read receipts, so people don't even know if you've read their message. Yes, that's good too. Yeah. My friend Saga has it set up so that if he texts you, you have 30 minutes, bitch, and then they all disappear. All the messages disappear. Yeah, that's kind of a sweet move. I like that. Yeah. With the discovery question, of, like, can
Starting point is 00:41:32 you don't want people to know that you're on Signal. So we're working on it, but it's a more difficult problem than you might imagine, because you want some people to know that you're on it, you know? So you want nobody to know? Well, me personally, I have a unique set of problems that comes with anything that I do, like with messaging and stuff. There's too many people. I've changed my number once a year, and I have multiple phone numbers. Yeah, I got a lot of problems. Yeah. But this is just a unique problem with me, that all of a sudden I'm getting, like, how the fuck does he know? And then I had to ask someone, and they go, oh no, when you sign up it sends
Starting point is 00:42:16 everybody on your contact list that's on Signal a message that says you're on Signal. Yeah. Oh. Well, we don't send that, actually. Just as a... I know you don't care, but we don't actually know who your contacts are. Right, Signal does, though. The app does. The app on your phone does. And it doesn't even send a message to those people. It's just that those people know your phone number, and the app now knows that that phone number is on Signal. And did you do that just to get more people to use Signal? Was it the reason why, when you sign up for Signal, does it send all the other people in your contact list on Signal a message?
Starting point is 00:42:54 A lot of people like it. A lot of people like knowing who they can communicate with. And the other thing is, we try to square the actual technology with the way that it appears to work to people. So right now, with most technology, it seems like you send a message and the person who can see it is the person you sent the message to, the intended recipient. And that's not how it actually works. So a lot of what we're trying to do is just square the way the technology actually works with what it is that people perceive. And so, fundamentally, right now, Signal is based on phone numbers. If you register with your phone number,
Starting point is 00:43:30 some people are going to know that they can contact you on Signal. It's very difficult to make it so that they can't. If we didn't do that, they could hit the compose button and see just that they could send you a message. They would just see you in the list of contacts that they can send messages to. And then if we didn't display that, they could just try and send you a message and see whether the message goes through. It's always possible to detect whether you're on Signal, the way that things are currently designed. It's interesting also how it works so much differently with Android than it does with iMessage. With Android, it will also send an SMS.
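The registration-detection problem described above, where anyone who has your number can learn it's on the service, can be illustrated with a toy hash-based lookup. To be clear, this is not Signal's actual contact-discovery protocol (Signal has written about why naively hashing phone numbers is weak, since the number space is small enough to brute-force, and has used more sophisticated approaches); the phone numbers are made up, and the sketch only shows the shape of the problem:

```python
import hashlib

# Toy sketch of hash-based contact discovery. NOT Signal's actual protocol;
# it only shows why "being on the service" is hard to hide from anyone
# who already has your phone number.

def h(phone: str) -> str:
    """Hash a phone number so the raw number isn't sent over the wire."""
    return hashlib.sha256(phone.encode()).hexdigest()

# Server side: hashes of registered users' numbers (made-up numbers).
registered = {h("+15551230001"), h("+15551230002")}

def discover(address_book):
    """Client learns which of its contacts are registered."""
    return [c for c in address_book if h(c) in registered]

print(discover(["+15551230001", "+15559999999"]))  # only the registered number
```

Note that nothing stops a client from calling `discover` with guessed numbers, which is exactly the "just try and send a message and see whether it goes through" detection mentioned in the conversation.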
Starting point is 00:44:08 I noticed that I can use Signal as my main messaging app on Android. And it'll send SMS, or it'll send a Signal message. It doesn't do that with iPhones. Yeah, Apple doesn't let you. Yeah, I found that pretty interesting, because I tried to send people messages. I thought it would just send it as an SMS, and it didn't. We would if we could, but Apple doesn't allow it.
Starting point is 00:44:31 It doesn't allow it. Interesting. Because Apple's scared of you. Say it. Say it. They're fucking scared. No, I mean, I... They should be.
Starting point is 00:44:40 Apple is... It's a better version of what they've got. How about that? I agree But yeah I mean they have a much more complicated answer But maybe you can distill it down to that You guys need to just develop your own version of AirDrop
Starting point is 00:44:53 And then no one will need Apple ever again That's what's holding people back Like a universal AirDrop AirDrop keeps a lot of fucking people on Apple You think? Oh it's the best You make a video Like a long video Like a couple minutes long And you can just AirDrop keeps a lot of fucking people on Apple. You think? All right. Oh, it's the best. You make a video, like a long video, like a couple minutes long, and you can just AirDrop it to me.
Starting point is 00:45:09 Whereas if you text it to me, especially if I have an Android phone, oh, it becomes this disgusting version. Don't downsample it. It looks terrible. Yeah. No, that's true. That's true. Yeah.
Starting point is 00:45:20 Photographs are not too bad. I think that it does downsample photographs as well, but not too bad. It's like, you could look at it, it looks like a good photograph, but video is just god-awful. It's embarrassing when someone sends you a video and you have it on an Android phone.
Starting point is 00:45:35 You're like, what the fuck did you send me? This is terrible. What did you take this with? A flip phone from the 90s? It's so bad. But I mean, a lot of that... I think the reason why it is that way is kind of interesting to me, which is, you know, these are protocols. When you're just using a normal SMS message on Android, that was, like, this agreement that phone carriers made with each other in, like, 2002. Before that, really. You know, '96. Yeah, exactly. And then they've
Starting point is 00:46:13 been unable to change the way that it works since then because um you have to get everyone to agree right and is apple holding back some sort of a universal standard? Because if they did have a universal standard, then everyone would have this option to use. You could use a Samsung phone or a Google phone. You could use anything, and everybody would be able to message you clearly without a problem. One of the things that holds people back is if you switch from an iPhone to an Android phone,
Starting point is 00:46:40 you lose all those iMessages. Sure, sure, sure. Yeah, they're probably doing that intentionally, because they... Fucking weasels. They want people to continue to use iPhones. Don't they have enough money? Like, Jesus Christ.
Starting point is 00:46:51 There's never enough. That's the problem. That is the problem, right? Yeah. And I think, I mean, it's like, I think the thing that everyone's worried about right now with Apple is like, you know, Apple, you know, what I said before,
Starting point is 00:47:03 like bad business models produce bad technology. Thus far, Apple's business model is much better than, you know, Google or Facebook or Amazon. Their business is predicated on selling phones, selling hardware, and that means that they can think a little bit more thoughtfully about the way that their software works than other people. And I think what people are concerned about is that that business model is going to change. You know, that they're approaching an asymptote of how many phones they can sell. And so now they're looking at software. They're like, what if we had our own search engine?
Starting point is 00:47:43 What if we had, you know, our own thing? What if we had, you know they're like what if we had our own search engine what if we had you know our own thing what if we you know and the moment that that starts to happen then they're sort of you know moving in the direction of the rest of big tech uh which you know who knows how they do it but that that's what i think people are concerned they've done a better job at protecting your privacy though in terms of like particularly apple apple maps like the map app is far superior in terms of sharing your information than say like the google maps but the argument you could make is that google maps is a superior product because they share that information like google maps is also ways now right like they bought ways which is fantastic lets you know where the cops are you know there's an accident up ahead all kinds of shit right but apple maps was not
Starting point is 00:48:32 that good it's i use it because i like the ethic behind it i like their idea behind it they they delete all the information after you make you know if, if you go to a destination, it's not saving it, sending it to a server and making sure it knows, like, you know, what was there and what wasn't there and how well you traveled and sharing information. They're not doing that. They're not sharing your information. Right? We don't know. You know, it's like I'm sure that they have a policy. I haven't read the policy. and maybe the policy says that um but supposedly so you know but and you're also sort of you're
Starting point is 00:49:14 still in the in the world of like um trying to make computers secure right you know it's like there's probably data the data is probably accumulating somewhere and maybe people can compromise those places you know we don't know but I and and for sure the the intent behind the software that they have constructed I think has been much better than a lot of the other players in big tech I think the concern is just that as that software becomes a larger part of their bottom line that that might change I wonder if they could figure out a way to have an i don't give a fuck phone or i care phone like you want to have an i don't give a fuck phone this phone is like who knows who's making it but look it's really good it's got a 100 megapixel
Starting point is 00:49:59 camera and all this jazz and a 5 000 milliampamp battery. And then you've got an iCareFone. And the iCareFone, it's like an iPhone 10. But what's different about the iCareFone? The iCareFone, you get like a clear line of distinction. You get a real clear path. This is where we got our materials. These are the people that are making it. This is how much they're getting paid.
Starting point is 00:50:27 Everyone is unionized. They're all getting healthcare. They all have 401k plans. Yeah, it costs a little bit more. It's not as good. If you truly encapsulated all of the social costs with producing that phone, I think it would cost more than a little bit more.
Starting point is 00:50:44 How much more do you think it would cost? I think some astronomical number. I'm sure Apple would prefer not to have child slaves mining cobalt for the batteries that are in their phone. Is that a thing you can say when a company is worth as much as most countries?
Starting point is 00:51:01 They have so much cash. Can you really say that they they would rather not use slaves no i'm sure imagine i'm sure that i don't want to go broke i have only have 14 trillion what am i gonna do what am i gonna do i need slaves i need someone to dig the the cold tan what would i do if i was them well first of all it could never be them it would never work uh but if I was I would say hey um why don't we open up a factory in America and why don't you gotta mind the cobalt isn't in America why don't right why don't we get all of our cobalt from recycled phones is that possible because that if it was going to recycle
Starting point is 00:51:43 them that's a good question. I think that's what the Fairphone is trying to do, right? Aren't they using all recycled materials? Yeah. I mean, I don't... Any image I've seen of electronic recycling is equally apocalyptic. There's just piles of shit like in some next to a lake in China
Starting point is 00:52:00 where people are... You're bumming me out, man. How about... But I think if you were the ceo of apple and you were like this is a priority we're going to spend you know however many trillions of dollars it takes to do this um your shareholders go hey fuck face you're fired yeah out yeah exactly yeah yeah you would have to be the grand poobah of apple you'd have to be the ultimate ruler but it's not like um even then if you were just like
Starting point is 00:52:25 you know i'm willing to take the hit you know uh i'm gonna do no one can oust me or whatever i'm the grand poobah you know yeah then it's like your share price plummets which means that um your employee retention plummets because those people are also working for the equity right you know so it's like now they're working lessons and then they get poached away by these other companies you know it's like it's dirty companies come and steal your clean employees this is what apple's website says now it says they're committed to one day sourcing 100 look at that it's completely recycled every bit as advanced one committed to one day sourcing one day we're planning on the year 30,000.
Starting point is 00:53:06 I mean, you know, it's like, I don't, they're not, they're not like sitting around twirling their mustaches. You know what I mean? It's just like everyone likes good things and not bad things. Maybe they are. Let me read that again, Jamie. It says 100% recyclable and renewable materials across all of our products and packaging
Starting point is 00:53:22 because making doesn't have to mean taking from the planet. Oh, come on, you guys. It's like Nike. It's the same thing too, right? They're all committed to Black Lives Matter and all these social justice causes, and they're using slave labor too. Aren't they in China?
Starting point is 00:53:38 They're using slave labor to make Nikes. Probably. So go back to that thing. What are they trying to do what's they have like this row i remember seeing a robot they have that can do like lighter on the planet right out of the box the pieces out of it at a very fast rate than like probably human hands can oh okay so that's what i was trying to dig through here but i found that well that would be good i think that's the robot that's the the peace-taking robot daisy this is daisy don't name her name her you got a problem
Starting point is 00:54:07 there you go 2030 entirely clean energy which isn't quite as it's you know 2030 means transitioning hundreds of our manufacturing suppliers to 100 renewable sources of electricity. Well, that's interesting. If they can actually do that, 100% resource, use all the, if they can figure out a way to do that and to have recyclable materials and have all renewable electricity, whether it's wind or solar,
Starting point is 00:54:42 if they can really figure out how to do that, I think that would be pretty amazing. But who's going to put it together? Are they going to still use slaves to put it together? I mean, I guess the people that are working at Foxconn are technically slaves, but would you want your child to work there? You know? Yeah. I mean, I think you can say that about a lot of the aspects of our economy, though. You know, who would willingly go into a coal mine? Yes, right. Yeah. You know, there's some element of coercion to a lot of what keeps the world spinning. Right. And that's the, when you get into these insidious arguments about, or conversations about conspiracies, like conspiracies to keep people impoverished.
Starting point is 00:55:24 They're like, well, why would you want to keep people impoverished they're like well why would you want to keep people impoverished well who who's going to work in the coal mines you're not going to get wealthy highly educated people to work in the coal mines you need someone to work in the coal mines so what do you do what you do is you don't you don't help anybody get out of these situations so you'll always have the ability to draw from these impoverished communities these poor people that live in appalachia or wherever their coal miners are coming from like there's not a whole lot of ways out like i have a friend who came from kentucky and he's like the way he described it to me goes man you've never seen poverty like that like people don't want to
Starting point is 00:56:03 concentrate on those people because it's not as glamorous as some other forms of poverty goes but those communities are so poor like yeah 40 million americans right yeah americans are living in poverty yeah um i mean i don't know if that conspiracy is accurate but that's the one that people always want to draw from right they always want to i mean i don't think you need a conspiracy. You know, you just, you have. You have poor people. Structural forces.
Starting point is 00:56:28 Yeah. Yeah. Yeah. That's why it's rare that a company comes along and has a business plan like Signal where they're like, we're going to be nonprofit. We're going to create something that we think is of extreme value to human beings uh just to civilization in general the ability to
Starting point is 00:56:54 communicate anonymously, or at least privately. It's a very rare thing that you guys have done. We decided to do this and to do it in a nonprofit way. Like, what was the decision that led up to that? And how many people were involved? Now? Yeah. Okay, well, now there's, you know, 20-something people. Do you think that's a lot or a little? I think that's a little. Okay. Yeah. I think it's always interesting talking to people. A lot of times I'll meet somebody, and they're like, oh yeah, you're the person who did, like, Signal or something. Like, oh yeah. They're like, okay, cool, what are you doing now? I'm like, oh, I'm still working on Signal. They're like, oh, what, is there like another Signal that you're gonna do? You're gonna do, like, Signal Two? You know, I think it's hard for people to understand that software is never finished.
Starting point is 00:57:50 You know, there's this which is something that I really envy about, like the kind of creative work that someone like you does. You know that like I envy artists, musicians, writers, poets, painters, you know, people who can create something and be done you know that like you can record an album today and 20 years later you can listen to that album and yeah it'll be just good you know it's like uh software is never finished and if you stop it'll just like float away like dandelions what happens if you stop because software is not um it's very hard to explain this. It's not, it doesn't exist in isolation. It's a part of the ecosystem of like all software.
Starting point is 00:58:33 And that ecosystem is moving and it's moving really fast. You know, there's a lot of money behind it, a lot of energy in it. And if you aren't moving with it, it will just, you know, stop working. And also it's like, you know, a project like this is not just the software that runs on your phone but the service of like you know moving the messages around on the internet and that requires a little bit of care and attention and if you're not doing that then it will dissipate and if you're doing something non-profit the way you're doing it how do you pay everybody like how does it work yeah well okay so you know the history of this was um i think before the internet really took over our lives in the way that it has
Starting point is 00:59:11 there were the kind of um social spaces for people to experiment with different ideas outside of the context of their everyday lives you know, like art projects, punk rendezvous, experimental gatherings, the embers of art movements, you know, that these spaces existed and were things that I found myself in and a part of, and they were, like, important to me in my life. You look like a dude who'd go to Burning Man. I am not a dude that goes to burning maybe you're missing it maybe i've been once i went in yeah 2000 i think yeah uh and uh early adopter i can well it's funny because at the time
Starting point is 00:59:58 that I went, people were like, oh man, it's not like it used to be. And now people are like, have you been? I was like, I went once, in 2000. Like, wow, that's when it was like the real deal. I don't think so. It's one of those things where, you know, there's, like, day one, and then on day two they're like, ah, it's not like day one, right? Like, of course, it just gets worse. But, yeah, I don't know. Those spaces were important to me, an important part of my life, and as more of our life started to be taken over by technology, you know, me and my friends felt like those spaces were missing online. And so we wanted to demonstrate that it was possible to create spaces like that. And there had been a history of people thinking about cryptography in particular, which is kind of funny
Starting point is 01:00:49 in hindsight, right. So in the, like, 80s... So the history of cryptography is actually not long, at least outside of the military, you know. It really starts in the 70s. And there were some really important things that happened then. And in the 80s, there was this person who was just sort of this lone maniac who was, like, writing a bunch of papers about cryptography during a time when it wasn't actually that relevant, because there was no internet. You know, the applications for these things were harder to imagine.
Starting point is 01:01:23 And then in the late 80s, there was this guy who was a retired engineer who discovered the papers that this maniac David Chaum had been writing, and was really fascinated. Was he doing this in isolation, or was he a part of a project or anything? No, I think David Chaum was... I think he's an academic. I'm embarrassed that I don't know. But he did a lot of the notable work on using the primitives that had already been developed. And he had a lot of interesting ideas. And there's this guy who was a retired engineer, his name was Tim May, who was kind of a weird character. And he found these papers by David Chaum, was really enchanted by what they could represent for a future. And he wanted to write, like, a sci-fi novel
Starting point is 01:02:14 that was sort of predicated on a world where cryptography existed and there was a future where the internet was developed. And so he wrote some notes about this novel. And he titled the notes The Crypto Anarchist Manifesto. And he published the notes online, and people got really into the notes. And then he started a mailing list in the early 90s called the Cypherpunks mailing list. And all these people, you know, joined the mailing list, and they started communicating about, you know, what the future was going to be like, and how they needed to develop cryptography to live their crypto anarchy future.
Starting point is 01:02:49 And at the time, it's strange to think about now, but cryptography was somewhat illegal. It was regulated as a munition. Really? Yeah. So if you wrote a little bit of crypto code and you sent it to your friend in Canada, that was the same as shipping Stinger missiles across the border to Canada. Wow. So did people actually go to jail for cryptography? There were like some high profile legal cases. Nobody, I don't know of any situations where
Starting point is 01:03:17 people were, like, tracked down as, like, munitions dealers or whatever, but it really hampered what people were capable of doing. So people got really creative. There were some people who wrote some crypto software called Pretty Good Privacy, PGP. And they printed it in a book, like an MIT Press book, in a machine-readable font. And then they're like, this is speech. You know, this is a book. It's like, I have my First Amendment right to print this book and to distribute it. And then they shipped the books to Canada and other countries and stuff.
Starting point is 01:03:50 And then people in those places scanned it back in to computers. And they were able to make the case that they were legally allowed to do this because of their First Amendment rights. And other people moved to Anguilla and started, like, writing code in Anguilla and shipping it around the world. There were a lot of people who were fervently interested. Why Anguilla? Because it's close to the United States, and there were no laws there about producing cryptography. So I think that was something for people. They have, like, three cases of COVID there, ever. Oh, really?
Starting point is 01:04:26 Yeah, it's a really interesting place. Yeah, I used to work down there. Really? Okay, international traffic and arms regulation. It's a United States regulatory regime to restrict and control the export of defense and military-related technologies to safeguard U.S. national security and further U.S. foreign policy objectives. ITAR. Yeah, they were closed
Starting point is 01:04:48 Anguilla was closed until, like, November. They wouldn't let anybody in. If you want to go there... I was reading all these crazy restrictions. You have to get COVID tested, and you have to apply, and when you get there, they test you
Starting point is 01:05:03 when you get there. They have no deaths. Yeah, keeping it real. Because they have no deaths. Yeah, yeah, yeah. Yeah. That's cool. Yeah, I like New York. It's an interesting place.
Starting point is 01:05:12 Yeah, this is what I was reading. They're inviting companies to come move here. Like, come work here. Oh, interesting. Yeah. Come, we'll test the shit out of you, and you can't go anywhere, but come here. It's beautiful. It is beautiful.
Starting point is 01:05:23 I used to work on boats down there. Yeah? Yeah. What did you do on boats? I was, like... Really, um, I don't know. I, for a while, was, like, really into sailing, and I had a commercial license. I was moving boats around and stuff. My parents lived on a sailboat for a while. Oh, really? Yeah. Yeah. They just decided to just check out. And this was, like, I want to say, like, early 2000s, somewhere around then. They just lived on a sailboat for a few years until my mom got tired of it. They go around the world?
Starting point is 01:05:53 They went. They were in the Bahamas. They were all around that part of the world. They were in California for a little while on their boat. They just decided, like, let's just live on a boat for a while. Yeah, it's pretty crazy. I discovered sailing by accident, where I was, like, um, working on a project with a friend in the early 2000s, and we were looking on Craigslist for something unrelated, and we saw a boat that was for sale
Starting point is 01:06:21 for four thousand dollars. And I thought a boat was, like, a million dollars. I was just like, what? Sailboats are $4,000? And this is just some listing. There's probably even cheaper boats. And so we got really into it. And we discovered that you can go to any marina in North America and get a boat for free. Every marina has a lien sale dock on it, where people have stopped paying their slip fees. And the boats are just derelict and abandoned.
Starting point is 01:06:41 And they put them on these docks. Really? Yeah. You get a boat for free? Yeah. They have an auction. There's usually, like, a minimum bid of, you know, 50 bucks or whatever, and most times it doesn't get bid on, and they chop the boat up and throw it away. Really? And if you show up... A functional boat? Well, functional, that's the problem, right? You know, you gotta maintain the shit out of boats. Yeah. So, you know, if you put some work into it, though, you can get it going. And so we started
Starting point is 01:07:10 doing that. We were, like, you know, getting boats, fixing them up, sailing as far as we could. And then eventually I got a commercial license and started sailing other people's boats. Wow. All this on a whim of, how much does a boat cost? You can get a boat for four grand? Holy shit. Next thing you know, you're working on boats. Yeah, yeah. I mean, it's a really... It's a whole world, you know. It's just like, you know, finding that link on Craigslist was, like, opening a door to another reality, right? Yeah. Because it's pretty amazing, you know. Me and some friends used to sail around the Caribbean, and, um, the feeling of, like, you know, you pull up an anchor, and then you sail, like, 500 miles to some other country or whatever, and you get there, and you drop the anchor, and you're
Starting point is 01:07:55 just like... It was just the wind, the wind that took you, you know. Like, there was no engine, there was no fuel. It's just the wind, you know. And you catch fish, and, you know, it's just like... If you want to go real old school, you got to use one of them, what are those, fucking sextants. Of course. Did you use one of those? No, you didn't. Did you really? I was, like, really into, like, you know, no electronics. Like, it's just complicated, you know, they're expensive or whatever. So we had a taffrail log. You ever see one of those things? Oh, what's that? It's like a little propeller on a string that you connect to a gauge, and as it turns, the gauge keeps track of how far you've traveled. What?
Starting point is 01:08:32 a propeller on a string? so it's just a thing that turns a string at a constant rate depending on how fast you're moving so it can gauge how much distance you've traveled so is the string marked? No, no, no. Like, it's just a constant length that's always spinning, and it's always turning the gauge.
Starting point is 01:08:53 And then it reads a number? So it says how many miles? So there's just, like, a dial and a number of how many nautical miles you've traveled. Wow. And so then you're just like, okay, well, we started here, and then we headed, you know, on this heading, and, you know, we did did that and we traveled 10 months we must be here you know and then you know once a day you can take a site with your sextant and
Starting point is 01:09:11 then you can you know do some dead reckoning with the compass and wow dude you went old school yeah i once had a job actually who did you do this with just friends yeah and you gotta have some fucking committed friends because like the friends had to friends yeah and you gotta have some fucking committed friends because like the friends had to be you know you gotta be all on the same page because they could be like hey man let's get a fucking gps you guys are assholes i don't want to die i'm not gonna get eaten by a shark how much food do we have people die out here man this is the ocean we didn't really have any money so it was like you know it wasn't like uh much of a decision i mean we you know it's like let's put things in perspective like you know we took a trip to
Starting point is 01:09:48 uh, through the Caribbean once, from Florida. The way that we got to Florida was, like, riding freight trains, you know, like, hopped trains to get there. You know, it's like, this was, like, low, low budget traveling. You guys were hobos. No, but, you know, that's a hobo move. It was lowbagger for sure. But, like, yeah, I was also, like, just weirdly ideological about it, where, like, um, I had a job once in the Caribbean that was, like... I was almost like a camp counselor, basically, where there was this camp that was, like, a sailing camp, but it was, like, 13 teenagers, mostly from North America, showed up in Saint Martin and then got on a boat with me and another woman my age, and we were, like, the adults. And then we sailed from Saint Martin to Trinidad over the course of six weeks with these, like, 13 kids on a 50-foot sailboat
Starting point is 01:10:37 Who left their kids with you? That's what I want to know, man. It was like... Is this you? Oh, yeah. So we made a... Me and my friends made a video called Hold Fast that was trying to demystify sailing. Bro, you've been rocking this wacky hair for a long time, dude. I know. You know, pandemic, you know. Wow. Well, you had tornadoes out there. Yeah. And you caught fish. Yeah, yeah. So you lived off the fish that you caught, basically? Yeah. Fish, conch, seaweed. Wow, seaweed. Yeah. So when you prepare seaweed, what do you do, you boil it? You're gonna sharpen your fucking knife, son. I know, that's ridiculous. What are you using, a pencil, to try to kill that poor fish? This whole video is embarrassing, but
Starting point is 01:11:23 so thank you for that because you kind of didn't know what you were doing. And here's you with your, what are you doing here? You're mapping out where you're at? This is dead reckoning, yeah. Dead reckoning. That position was 50 miles off. 50 miles off? So where you thought you were versus where you actually were was 50 miles difference?
Starting point is 01:11:42 Yeah. And you're going how many miles an hour? Very slow. If you're doing really well, you know you're making. And you're going how many miles an hour? Very slow. If you're doing really well, you know you're making five knots, five nautical miles an hour. Five miles an hour. Yeah. Jesus Christ.
Starting point is 01:11:52 So you're walking. You're basically walking on the ocean. Yeah. Yeah. Not walking. It's slow going. But you never stop. That's the thing.
Starting point is 01:11:59 You can sail all night. You can just keep going. You're a light jog. You're jogging on the ocean. Anyway, I was a tyrant with these kids. We had a nice boat, and I disabled all of the electronics. sail all night you can just keep going you're a light jog you're jogging on the ocean anyway i was a tyrant with these kids where like we had like a nice boat and i just disabled all of the electronics and i like disabled the electric uh anchor windlass how long was this boat how long was this boat uh this was 50 feet 50 feet with 14 kids you said i think 13 yeah 50 is a big boat
Starting point is 01:12:20 that's actually a big boat yeah but it doesn't seem like a lot of room for all these kids yeah people are like sleeping on the deck and then the cockpits oh my god that's insane did you feel weird that you mean you're responsible for their food you're responsible for making sure they don't fight with each other yeah i mean i i actually enjoyed it i think it was fun yeah and what seemed like it like it's a yeah you have to make it work there's no other solution you're on this boat with these kids. Yeah, that's true. Do you still keep in touch with those kids?
Starting point is 01:12:48 No. That was sort of like pre-social media. Right. So not really. They're going to reach out to you now. They'll go, man, I remember that. That was fucking crazy. I can't believe my parents left me with you.
Starting point is 01:13:01 I can't believe they did either, man. So did you have to sign any paperwork or anything like how did you take care of these kids i'm sure i had to sign something i don't remember you don't remember yeah wow was there any time where you were like halfway into this trip you're like what have i signed up for oh sure all the time yeah but i was i you know i i'd never really been in a situation like that either. Who has? I don't know. It's like I don't even have siblings, you know. Oh, really?
Starting point is 01:13:28 Yeah. And I was pretty, you know, it was interesting. I feel like I learned a lot. But I was pretty, like, tyrannical in a lot of ways, you know. But in a way that I was trying to, like, encourage. It was fun to see particularly teenagers who had, like, a really sort of North American affect about how to be. Just, like, let all of that go over a few weeks, you know, on the ocean. Where it's just, like, you know, it's just us.
Starting point is 01:13:58 We're here. There's nobody else watching. You know, we're sleeping next to each other. It's like the kids just getting comfortable with, um, themselves, you know. And, you know, I would try and, like... So I was, like, um... I am really into rock paper scissors. How into it are you? I'm undefeated. Is that possible? So whenever they wanted anything, I would be like, all right, rock paper scissors. You know, they were like, can we, like, do this thing? I'd be like, all right, we'll do rock
Starting point is 01:14:29 paper scissors. If you win, you can do this thing. If I win... And then I would, like, pick the thing that was, like, their sort of deepest fear. You know, it's like, the really shy person had to, like, write a haiku about every day and then read it aloud at dinner. You know, like, the person who was, like, really into, um, having, like, a manicure, like, wasn't allowed to shave her legs for the rest of the trip. You know, like, that kind of thing. Wow. And so then by the end of it, it was just like, you know, everyone had lost, so everyone was, like, reading the haiku at dinner. And... How are you so good at rock paper scissors? It's just, you know, skill, muscle, intuition. Intuition? Can we play right now? You want to
Starting point is 01:15:06 play? Yes, but I only play for stakes. Okay, what do you want to play for? Okay, how about, uh, if I win, I do the programming on your show for a week. No. That's worth a lot of money. You can fuck off. What kind of money? I'm not saying the ads or whatever.
Starting point is 01:15:34 You mean programming? Who's going to be on? Yeah, who's going to be on? That's not possible. We're booked up months and months in advance. You were so confident until just now. No, no, no. We can pick a few months off.
Starting point is 01:15:43 That's ridiculous to flip a coin on that. There's no chance. I mean, what would be the equivalent? Because then you'd make me have conversations. Listen, the whole reason why this show works is because I talk to people that I want to talk to. That's why it works. Yeah.
Starting point is 01:15:55 The only way- You've got to do something to play this game. That's not a risk. That's just one week of your life. No, that's abandoning the show. That's one week of your life. No, you could bring some assholes on here that I want to talk to, and then I'm like, what am I doing? No, no, no. Impossible no no impossible all right well do you think that there's something
Starting point is 01:16:08 of equivalent value? No, there's nothing. Nothing that I know. There's nothing that you could give me that would be worth a week of programming on the show. What are you going to give me? To day program, you'd have to give me a spectacular amount of money. But that's the only way. I would be like... The only way, like, if you ever put a monetary equivalent to that, it would have to be a spectacular amount of money for me to let someone else program the show. I've never let anybody do that before, not even for one day. No. That was one of the big things about doing this show on Spotify. They could have no impact at all on who gets on,
Starting point is 01:16:46 no suggestions, no nothing. The only way it works. What was up with that dude in the suit outside with the clipboard that was telling me from Spotify? Oh, he's from the government. He's from the CIA.
Starting point is 01:16:55 There's no one out there. He's joking. But the only way the show works, I think the way it works is I have to be interested in talking to the people. That's it. So it's, it has to be to be i get a i've like all these suggestions for guests i go oh that kind of seems cool oh that might be what if i let me read up on this what if it's like for a week i give you the list of suggestions no no no input no no it's not that. That's a ridiculous. Stand real. Stand real. Okay. Okay. All right. Impossible. In any case.
Starting point is 01:17:25 How about five bucks? No. No? No, it's got to be stakes. Come on, man. It's got to be real stakes. 20 bucks? 20 bucks.
Starting point is 01:17:31 I got 20 bucks in my pocket. Money is off the table. We can't do money. Money's off the table? Forget that. All right. Sounds like someone's scared to lose at rock, paper, scissors. It sounds like someone else is scared to lose at rock, paper, scissors.
Starting point is 01:17:40 No, you're asking me for something that's ridiculous. You don't have anything. You don't have anything that's worth a week of programming on this show. You don't have it. That's rough. It doesn't exist. That's rough. No, it literally doesn't exist. Like, there's nothing that you could offer me that I couldn't buy myself. I'll make it... I'll make... It'll be interesting. No, no, no. It will be interesting. No, no, no. You can't. No. Alright, fine. But that doesn't do anything for me. That does something for you. That does zero for me.
Starting point is 01:18:09 No, no, of course, of course. You would have... If you win, you would name your stake. I don't have a stake. There's nothing I want from you. Alright, well. What you asked of me is a crazy thing. Then we can't play the game? Yeah, we can't play Rock, Paper, Scissors then, huh? Interesting. Anyway, we were talking about something else before all of this. We were talking about the evolution of cryptography. Sailing with children.
Starting point is 01:18:29 Sailing with children. Well, first we were talking about Anguilla. Yes. And the fact that people are moving to Anguilla. Yeah. So how did you learn how to do all this stuff? Was it trial by fire? When you were learning how to use all this, I mean, I don't want to call it ancient equipment,
Starting point is 01:18:42 but mechanical equipment to figure out how to... Oh, sailing stuff? Yeah. Yeah. Yeah. Secret is to begin. To start. Yeah, I mean, so... Like a sextant. Where the fuck does one learn how to operate
Starting point is 01:18:55 a sextant and then navigate in the ocean? Yeah, just... I would... You know, I started... Me and some friends got a boat, and we started fixing it up and making a lot of mistakes, and then, you know, started taking some trips and then getting lost. Yeah, I got lost a bunch. I took a solo trip from San Francisco to Mexico and back on a little 27-foot boat with no engine. How long did that take?
Starting point is 01:19:26 And the way you did it, did you stay close? I can see there's the shore, so if everything fucks up, I can kind of swim. Yeah, well, no, you can't swim. I learned that lesson, too. No? Why? It's, I mean, the closest I ever came to death in my life
Starting point is 01:19:43 was just in the bay. In the San Francisco Bay, I was on a boat that capsized, and I was probably 2,000 yards away from shore, and I almost drowned. And I mean, I didn't make it to shore. And yeah, it's just the water's so cold, you know? You didn't make it to shore? No. Yeah, it's a long story. I was like, a friend of mine was living in San Francisco, and he wanted to learn how to sail. And I was like, you know what you should do is you should get like a little boat, like a little sailing dinghy, you know? And then you can just anchor it like off the shore in this area that no one cares about.
Starting point is 01:20:17 And, you know, you could sort of experiment with this little boat. And so he started looking on Craigslist, and he found this boat that was for sale for 500 bucks up in the North Bay. And every time we called the phone number, we got an answering machine that was like, hello, you've reached Dr. Ken Thompson, honorary. I'm unable to take your call. You know, and we were like, what is that? Like, honorary? Like a fake doctor? Is he, like, a judge? You know, like, what is it? And so finally we got in touch with this guy. We go up there, and this is the kind of situation where, like, we pull up, and there's, like, the trailer that the boat's supposed to go on, and it's just full of scrap metal.
Starting point is 01:20:50 Oh boy. And, you know, and you know, this guy comes out, he's like, oh yeah, this is the trailer. We were going to do a metal run, but if you want the boat, you know, we'll take the metal off, you know? And we're like, okay. You know, and he's like taking us around. He's like, okay, the master's over here.
Starting point is 01:21:03 And it's like under some leaves and, you know, it's like, and then, you know, the, the hole is in the water here. And it's, he has like taking us around he's like okay the master's over here and it's like under some leaves and you know it's like and then you know the the hole is in the water here and and it's he has like a dock behind his house and the tide is all the way out so the boat's just sitting in the mud you know and i'm like well how do we get this out of here he's like oh you'd have to come back at a different time uh you know and then you take it over there and we're like you told us to come now like at this time you know what anyway so we go through all this thing you know and and he and you know my friend who knows nothing about boats is like, all right, Moxie, like, what do you think? You know, should I get this? And I was like, okay.
Starting point is 01:21:28 Oh, and we were like, so what's it, you know, doctor of what? He's like, oh, self-declared, you know, we're like, oh, okay. He's a self-declared doctor? Honorary. Honorary self-declared doctor. You can do that? I guess so. Why not?
Starting point is 01:21:40 It's just an answer. Jamie? Yes. Doctor? Yes. I think we should become doctors. I just an answer. Jamie. Yes. Doctor? Yes. I think we should become doctors. I just became one. I tried that for a while, actually.
Starting point is 01:21:50 Yeah? Did you really? Yeah, I don't know. I mean, I never went to college, so. Did Hunter S. Thompson ever get an honorary degree, or did he just call himself Dr. Hunter S. Thompson? Because he was calling himself Doctor. Same trick.
Starting point is 01:22:01 Same trick. Hunter S. Thompson for a while. Edward Benitez did the same thing. Well, Bill Cosby became a doctor for a little bit they took it back though that's you know you fucked up yeah yeah they take back your fake doctor degree yeah yeah so this guy was like you know my friend's like what do you think maxine i'm like all right dr ken i would have to consider i'm not sure that i would do it but i would consider taking this boat for free i'd have to think about it but i would consider that you know and he's like i might be amenable to that you know so we've gone from like
Starting point is 01:22:30 you know, 500 to free. And so we got this boat, you know, and we had to deal with the metal and all this stuff. We got the boat, and, um, we were just trying to do, like, a little... We're just trying to anchor it. Did you bring life vests? Yeah, I was wearing a PFD, a type 2 PFD. And we took it to this boat ramp, and it was the end of the day, and the wind was blowing kind of hard, and the conditions weren't that good. But I was like, oh, we're just doing this little thing, this little maneuver. And we were in two boats. I built this little wooden rowing boat. And my friend was going to go out in that with one anchor, and I was going to sail out... Built it? Yeah, out of plywood. Stitch and glue. But, yeah...
Starting point is 01:23:15 but, you know, not the sturdiest vessel. And so, you know, he's going to go out in this little rowboat, and I was going to sail out this little catamaran, and we had two anchors, and we're gonna anchor, and then we're gonna get in the rowboat and row back. And it seemed a little windy, and, you know, I got in the boat first, and I got out around this pier and was hit by the full force of the wind and realized that it was blowing, like, 20 knots. It was way, way too much for what we were trying to do. But I had misrigged part of the boat, so it took me a while to get it turned around. And by the time I got it turned around, my friend had rowed out around the pier, and he got hit by the force of the wind and just got blown out into the bay. So he's rowing
Starting point is 01:23:54 directly into the wind and moving backwards oh shit and i was like fuck and i'm on this little hobie cat and it was moving so fast like it was way too windy to be sailing this thing i've got just my clothes on i don't have a wetsuit on or anything like that i have a life jacket and just my clothes and we don't have a radio you know we're unprepared oh it's starting to get dark we don't have a light um and i'm sailing back and forth trying to like help my friend uh and it got to the point where i was like all right i'm just gonna tack over i'm going to sail up to this boat that was called the Sea Louse. Sail up to the Sea Louse. I'm going to get my friend off of it.
Starting point is 01:24:30 We're just going to abandon it. And then we're going to sail this Hobie Cat back if we can. And so I go to turn around. And right as I'm turning around, a gust of wind hit the boat and capsized it before I could even know that it was happening. You know, just it's just like you just it's one moment you're on the boat and the next moment you're in the water, you know? And the water was like 50 degrees, um, like super, uh, it like is a shock, you know, it hits you.
Starting point is 01:24:56 And the boat was a little messed up in a way where it, I couldn't ride it. Uh, it had capsized and then the whole then it capsized all the way and then sank. So it was floating like three feet underwater, basically. And so I'm in the water, but I'm still a little bit out of the water, but in the water. And I had a cell phone that just immediately was busted. And I look at my friend, and he's a ways away now, and he didn't see me.
Starting point is 01:25:27 And I was yelling as loud as I could, but the wind is blowing 20 knots, and it just, you know... You can't hear each other. It just takes your voice away. And he just... I mean, I was screaming, I was waving. He wasn't wearing his glasses, and he just very slowly rowed away. Oh, my God. And so then I was just, like, floating there. It was starting to get dark. He rowed away. Did he notice that your boat had capsized? No, he didn't even see me.
Starting point is 01:25:53 He thought that I had just sailed somewhere else. Because in his mind, he was, I was the person with the experience. Do you still talk to this dude? Yeah, all the time. I'd be like, you motherfucker. I don't blame him. In his mind he was the person that was in trouble right you know and i understand and he thought i just sailed
Starting point is 01:26:09 somewhere else. But that's crazy. Yeah, sailed out of vision. Yeah. And then, you know, it basically got dark. I could see the shore. I wasn't far away. There's nobody on shore, there's nobody around. And the wind was blowing directly offshore, so you have to swim, you know, swim into the wind and into the wind waves and all that stuff. And eventually I tried swimming, and I swam, you know, directly upwind. Because I was like, okay, like, if I get separated from this boat and I don't make it to shore, then I'm definitely dead. You know, like, there's just no saving me. So I was trying to go directly upwind, so that if I felt like I couldn't make it, I would float back downwind and hit the boat again. And so I tried... You know, I swam for probably, like, 20 minutes upwind and made no progress. It didn't feel like any
Starting point is 01:26:53 progress. You know, in 50-degree water, you have 30 to 60 minutes before you black out. My arms were just... You know, it's like, I consider myself a strong swimmer. Like, I free dive, you know, all this stuff. And I just... You know, it's like, you read these stories about how people die, you know, of, just like, they succumb to hypothermia on a local hike, or they drown in the bay, you know?
Starting point is 01:27:14 you know? And the story is always like, well, Timmy was a strong swimmer, but he, and you're like, really, was Timmy really a strong swimmer because he drowned in the bay,
Starting point is 01:27:20 you know? And like floating there, you know, it just all came to me. I'm like, wow, this is how this happens. You know, you just make a series of pretty dumb small decisions until you find yourself like floating in the dark in the bay there's no one around oh shit and it's a really slow process
Starting point is 01:27:35 too, you know. You just come to terms with the idea that you're not going to make it. And it's not sudden. It's not like someone shot you or you got hit by a bus or something like that. It's like this hour-long thing that you're getting dragged through all alone. And you realize, like, no one will ever even know what this was, you know, how this happened. And you think about all the people like Joshua Slocum, Jim Gray, people who are lost at sea. And you realize they all had this thing that they
Starting point is 01:28:03 went through, you know, this hour-long ordeal of just floating alone, and no one will ever even know what that was or what that was like, you know. And eventually, I realized I wasn't going to make it ashore. I looked back. The boat was, like, way far away from me. I started, you know, drifting back towards it.
Starting point is 01:28:18 I was still trying to swim. I realized at some point that I wasn't going to hit it. I wasn't going to hit the boat on the way back downwind. And I had to just give it all that I had to try to connect with the boat, you know, to stop myself from getting blown past it. And in that moment, too, you realize that, like, uncertainty is the most unendurable condition. You imagine yourself making it to shore and relaxing, just knowing that it's resolved, right? And in that moment of, like, I might not make it back to this boat, you're tempted to give up, because it's the same resolution, you know. It's the feeling of just
Starting point is 01:28:55 knowing that the uncertainties have been resolved. And you have to really remind yourself that it's not the same, that you have to give it everything you have in order to survive, and that that feeling that you're sort of longing for is not actually the feeling that you want, you know. And I just barely got the end of a rope that was trailing off the back of the hull, pulled myself back onto it, almost threw up. Then I was just floating there with the hull, you know, three feet underwater. I tied myself to it. I started to get tunnel vision. And really at the last minute, a tugboat
Starting point is 01:29:37 started coming through the area, and it was coming straight at me, actually. And I realized that it probably just wouldn't even see me. It would just run me over and not even know that I had been there. You know, it's totally possible. And I was trying to, like, wave, I could barely lift my arm. I was trying to scream, I could barely make any noise. And somehow they saw me, and it took them like 15 minutes to get a rope around me, and they started pulling me off the side of the boat. And lining every tugboat is, um, tires, usually, as like a fender, you know. And I got wedged in the tires as they were pulling me up, and I knew what was happening, you know, and I was like, all I have to
Starting point is 01:30:14 do is stick my leg out and push against the hull of the boat, you know, to go around the tires. And I couldn't do it. Wow. And I could barely see, and they swung me around and eventually pulled me up. They put me next to the engines in the engine room. I couldn't even feel the heat. And they called the Coast Guard, and the Coast Guard came and got me. It was really embarrassing. And the Coast Guard guy, he's got all these blankets over me, and he's trying to talk to me to keep me alert, you know. And he's like, so is this your first time sailing? And I have a commercial, like a 250-ton master's license.
Starting point is 01:30:56 You need 600 days at sea to get this license. And I was like, no, I have a master's license. And he was like, what? He's like, you're a fucking idiot, man. Everything changed. The tone totally changed. Oh, my god, dude. That's insane.
Starting point is 01:31:11 Yeah. Did that change your appreciation for comfort and safety and just life in general? Yeah, totally. I mean, it changed... Well, you know, for sure the next day I was like... You know, it's just like any near-death experience. I feel like you're just like, what are we doing here? Like, why are we wasting our time with this? At the time I was working at Twitter, and your coworkers are like, oh, we got this problem with the slave lag on the database. And you're just like, what are we doing, man?
Starting point is 01:31:46 Shouldn't we be doing something else? But you can't, I feel like, you can't live like that for long. The what are we doing, man? You know, it's like, it's impossible. The world will, like, suck you back into it. Yeah.
Starting point is 01:32:06 Yeah. Unless you go to Anguilla. I mean, a lot of those early crypto people are actually still in Anguilla. Really? Yeah. That's funny. Yeah. Yeah.
Starting point is 01:32:17 That's what we were talking about, sailing to Anguilla. So those people, the people who moved to Anguilla, you know, were part of this moment of, like... How much did that shift your direction in your life, though? Did it change, like, the way... It seems almost, I mean, I haven't
Starting point is 01:32:31 had a near-death experience, but I've had a lot of psychedelic experiences, and in some ways I think they're kind of similar, in that life shifts to the point where whatever you thought of life before that experience is almost like, oh, come on, that's nonsense. Yeah, I mean, it changes your perspective, or it did for me. And, you know, because also in that moment, I think you go through this sort of embarrassing set of things where you're like, oh, I had these things I was going to do tomorrow, like, I'm not going to be able to do them. And then you're like, wait, why is that the thing that I'm concerned about? It's, like, a sort of trivial thing. Yeah, true. You know, we're just
Starting point is 01:33:11 like, I was going to see that person tomorrow, you know. I remember I was supposed to meet somebody the next day, and I remember being worried that they would think that I, like, stood them up or something like that. You have the awesomest excuse ever. I mean, just tell them that story the way you just told it to me, and they're going to be like, good, dude, we're good. Shit, fuck, glad you're all right. Oh my god. That kind of stuff, you know. And then, yeah, it changes the way you think about things. And certainly, you know, I was working at Twitter at the time, and I think it made me think about how I was spending my life. I mean, I remember the first day that I was at Twitter, at the time the most popular person on Twitter was Justin Bieber.
Starting point is 01:33:56 He had more followers than any other person, you know. Was that when you guys were trying to rig it so that he wasn't trending number one always? Because they did do that, right? I don't remember that. Conveniently. Jamie and I were talking about that one day, because they had to do something. Because if they didn't do something, Justin Bieber would be the number one topic every day, no matter what was happening in the world. I can believe that they wanted to change that. Because the problem was, at the time, Twitter was held together with bubble gum and dental floss.
Starting point is 01:34:41 So it's like, every time Bieber would tweet, the lights would dim and the building would kind of shake a little bit. Here it goes. So they block me from trending. This is 2010. I'm actually honored, not even mad. He's also 12. Then I get on and see yet again,
Starting point is 01:34:58 my fans are unstoppable. Love you. But okay, so, you know, people talk about, like, invisible labor. The invisible labor behind that tweet is just kind of comical, because when he did that... It's like, my first day there, he tweeted something, and the building's kind of shaking, and alarms are going off, and people are scrambling around. And it was just this realization where you're like, never in my life did I think that anything Justin Bieber did would really affect me in any deep way, you know. And then here I am, just scrambling around to facilitate... What are your thoughts on curating what trends and what doesn't trend, and whether or not social media should have any sort of obligation in terms of whether or not people see things, like shadow banning
Starting point is 01:35:48 and things along those lines? Like, I'm very torn on this stuff, because I think that things should just be, and if you have a situation where Justin Bieber is the most popular thing on the internet, that's just what it is. It is what it is. But I also get it. I get how you would say, well, this is going to fuck up our whole program, like, what we're trying to do with this thing. What do you mean, fuck up our whole program? Well, what you're trying to do with Twitter. I mean, I would assume what you're trying to do is give people a place where they could share important information. I mean, Twitter has been used successfully to overturn governments. Twitter has been used to break news on very important
Starting point is 01:36:40 events and alert people to danger. And there's so many positive things about Twitter. And if it's overwhelmed by Justin Bieber and Justin Bieber fan accounts, if it's overwhelmed and the top 10 things that are trending are all nonsense, I could see how someone would think, we're going to do a good thing by suppressing that. Yeah, I see what you're saying. Well, why do you think they did suppress that? What do you think? You worked there. Why do you think they kept him from trending? Well, I mean, I don't know about that specific situation. I mean, I think, you know, looking at the larger picture, right, like,
Starting point is 01:37:31 in a way, you know, it's like, if you think about 20 years ago, whenever anybody talked about society, everyone would always say, like, the problem is the media. It's like, the media, man, if only we could change the media. And a lot of people who were interested in a better and brighter future were really focused on self-publishing. There was a whole conference about it, an underground publishing conference, now the Allied Media Conference. People were writing zines. People were getting their own printing presses. We were convinced that if we made publishing more equitable, if everybody had the equal ability to produce and consume content, that the world would change. And in some ways, what we have today is like the fantasy
Starting point is 01:38:14 of, you know, those dreams from 20 years ago. But in a couple ways... Like, one, it was the dream that if a cop kills some random person in the suburbs of St. Louis, everyone would know about it. And also that anybody could share their weird ideas about the world, you know. And I think in some ways we were wrong. Like, the world we got today is, yeah, if a cop kills somebody in the suburbs of St. Louis, everybody knows about it. I think we overestimated how much that would matter. And I think we also believed that the things that everyone would be sharing were, like, our weird ideas about the world, and instead we got, like, flat earth and anti-vax and all this stuff, right? And so in a sense, I'm glad that those things exist, because
Starting point is 01:39:15 they're sort of what we wanted, you know. But I think what we underestimated is how important the medium is, like, the medium is the message kind of thing. And what we were doing at the time, writing zines and sharing information, I don't think we understood how much that was predicated on actually building community and relationships with each other, and that what we didn't want was just, like, more channels on the television. And that's sort of what we got, you know? And so I think, you know,
Starting point is 01:39:48 it's like, everyone is on YouTube trying to monetize their content, whatever, and it's the same thing. Like, bad business models produce bad technology and bad outcomes. And so I think there's concern about that. But I think now there are these two simultaneous truths that everyone seems to believe that are in contradiction with each other. One is that everything is relative, everyone is entitled to their own opinion, all opinions are equally valid. And two, our democracy is impossible without a shared understanding of what is true and what is false.
Starting point is 01:40:28 The information that we share needs to be verified by our most trusted institutions. People seem to simultaneously believe both of these things, and I think they're in direct contradiction with each other. And so in some ways, I think most of the questions about social media in our time are about trying to resolve those contradictions. But I think it's way more complicated than the way that the social media companies are trying to portray it. Yeah, I think there's simplistic methods that they're using to handle complex realities. Like, for instance, banning QAnon. This is a big one, right?
Starting point is 01:41:13 Because QAnon's got these wacky theories, and then, like, Jesus Christ, they're weaponizing all these nutbags. We're just going to ban QAnon. But then, well, where do you... Because you think what they're saying is not true and not correct. But, like, how far do you go with that?
Starting point is 01:41:30 You've sort of set a precedent. And where does that end? Because, you know, are we going to ban JFK theories? Because JFK murders are probably still relevant today. Some of those people are still alive. Do we ban... there's theories about the Challenger, the Space Shuttle Challenger, there's a lot of wacky conspiracy theories about that. Conspiracy theories about space being fake. Have you ever seen hashtag space is fake? Yeah. Okay, go on there. If you want to really fucking lose all faith in humanity, look up space is fake. But I think... Oh my god, there's so
Starting point is 01:42:08 many people. Yeah, and I think that people get something out of that. Yeah, they do. Well, people get something out of mysteries, and maybe being on the inside and knowing things while the rest of the world is asleep. This is the reason why people love the idea of red pilled, you know. Somebody even suggested I call this room the red pill. My friend Radio Rahim said, call it the red pill. I'm like, ah, there's a lot riding on that term. Too bad, because I'm a giant fan of The Matrix, but that term has been co-opted forever. Yeah. But this idea that you're just going to ban people from discussing stupid ideas, where does that end? Does it end with flat earth? Are you going to ban that?
Starting point is 01:42:53 They're going to go, oh, they're suppressing us. And then they're going to find these – that's the thing about all these weird alternative sources of social media, whether it's Parler or Gab. They become shit fests. If you go to those, especially Gab, it's just like, god damn, what have you guys done? It's not even what have they done, it's what have the people done that have all been kicked out of all these other places
Starting point is 01:43:16 and then if you have a place that says, we're not going to kick you out, then all these fucking cretins come piling into these places. And I'm sure there's a lot of good people on Gab, don't get me wrong. There's a lot of people that just didn't want to be suppressed by social media. Parler doesn't seem to be nearly as bad. I've looked at that as well. It's more like just super right-wing information type stuff, and there's some reasonable people on Parler.
Starting point is 01:43:42 But I think there's a subtle thing there, because I don't know how those things work. But I think part of what... If you set aside all of the takedown stuff, all the deplatforming stuff, if you say, like, okay, Facebook, Twitter, these companies, they don't do that anymore.
Starting point is 01:44:06 They've never done that. They're still moderating content. They have an algorithm that decides what is seen and what isn't. And in a way... But how is that algorithm programmed? For Facebook and for YouTube and a lot of these things, it's done to encourage viewership. It's done to encourage interaction, right? It's done to encourage time spent looking at the screen.
Starting point is 01:44:31 Yeah, so that's how they monetize it. They want more clicks and more ad views and all that jazz. But when it becomes an ideological moderation, that's when things get a little weird, right? But it is by definition an ideological moderation, you know. If you optimize for time spent looking at the screen, you're going to be encouraging certain kinds of content and not others. Okay, but that's not always true. Like, I'll give you an example. For us, we did a podcast
Starting point is 01:44:59 with Kanye West. Kanye West was running, right? And if you were running for president and you were outside the norm... Like, for instance, Twitter banned Bret Weinstein's... Bret Weinstein had a Twitter account that was set up for, it was Unity 2020. And the idea was, instead of looking at this in terms of left versus right, Republican versus Democrat, let's get reasonable people from both sides, like a Tulsi Gabbard and a Dan Crenshaw, bring them together, and perhaps put into people's minds the idea that this concept of it has to be a Republican vice president with a Republican president, maybe that's nonsense, and maybe it would be better if we had reasonable, intelligent people together. What is this? There's their video. Yeah. Well, it's a very rational perspective. It's not conspiracy theory driven. They got banned from Twitter for nothing, just because they were promoting a third
Starting point is 01:46:06 party, because they were trying to come up with some alternative. The idea was this could siphon off votes from Biden. We want Biden to win because Trump is bad. This is the narrative, right? Yeah, I mean, I think there's... man, there's a lot here. But the... well, I was going to say the con... I got sidetracked. I'm sorry, let me finish. Yeah. The Kanye West thing. So we had a podcast with Kanye West.
Starting point is 01:46:30 Um, it got, I don't know how many millions of views, but it was a lot, but it wasn't trending. And so Jamie, you contacted the people at YouTube and asked them why it wasn't trending. What was their answer?
Starting point is 01:46:44 It's not trending. Like, it didn't meet the qualifications they decided for trending, or something like that. No, like, it didn't include everything you would assume, like you just said, all the interactivity, comments. It had more comments than any video we had. That's what I mean.
Starting point is 01:47:01 Massive amounts of comments, massive amounts of views, but yet nowhere to be seen on the trending. But I don't think there was a person involved. Like there was an algorithm involved that was trying to optimize
Starting point is 01:47:12 for certain things. No. This specific case, yeah. There's a team there. There's separate teams at YouTube, to my understanding.
Starting point is 01:47:23 Yeah, and the separate team had made a distinction. And I don't even know if they told the person who told me what it was, so that person may not know either. So they just decided this is not worthy of trending. So you have arbitrary decisions that are being made by people, most likely because they feel that ideologically Kanye West is not aligned with... I mean, he was wearing the MAGA hat for a while. So they just decided this is not trending. But it is trending. It's clearly trending.
Starting point is 01:47:51 It got millions and millions and millions of people watching it. But I think this is the point. Whether it's people, whether it's algorithms, there are forces that are making decisions about what people see and what people don't see, and they're based on certain objectives that I think are most often business objectives. But not in this case. In this case, the business objective was, if they wanted to get more
Starting point is 01:48:15 eyeballs on it, they would want it to be trending, and people say, oh shit, Kanye West is on the JRE. Do people that like Kanye click on ads or not? There's a lot in there that we don't know. Oh, that's horseshit. Come on, bro. I don't know. I mean, maybe. You know, maybe they're making millions and millions. When you have a video that's being viewed by that many people, there's going to be a lot of goddamn people clicking on ads no matter what. The other thing that these platforms want is for the content to
Starting point is 01:48:43 be ad safe, you know. It's like, maybe advertisers don't... you know, I don't know. But I think actually focusing on the outlying cases of, like, this person was deplatformed, this person was intentionally ideologically not promoted or de-emphasized... Shadow banning. Yeah, that kind of stuff. I think that obfuscates, or draws attention away from, the larger thing that's happening, which is that those things are happening just implicitly all the time. And it almost serves to the advantage of these platforms to highlight the times when they remove somebody, because what
Starting point is 01:49:22 they're trying to do is reframe this as, like, okay, well, yeah, we've got these algorithms or whatever, don't talk about that. The problem is there's just these bad people, and there's bad content from bad people, and we have to decide what to do about this bad content and these bad people. And I think that distracts people from the fact that the platforms are at every moment making a decision about what you see and what you don't see that is not as apparent, you know. I see what you're saying. So there's more than one problem. There's a problem of deplatforming, because in many ways these deplatforming decisions are being made based on ideology. It's a certain specific ideology that the people that are deplatforming the other folks have, that doesn't align with the people that are being deplatformed. These people
Starting point is 01:50:11 that are being deplatformed, they have ideas that these people find offensive or they don't agree with, so they say, we're going to take you off. Yeah, or sometimes they just find themselves in a trap, you know. A trap? Well, I think that there's, okay, so there's, I think, a tendency for a lot of these platforms to try to define some policy about what it is that they want and they don't want, you know. And I feel like that's sort of a throwback to this modernist view of science and how science works, that we can objectively, rigorously define these things. And I just don't think that's actually how the world works, and that once... What do you mean? How so? I feel like we're just past that, you know. First of all, I think science is not about truth. It's just not. It's about utility. What do you mean? Okay, you know, it's like, I was taught Newtonian physics in high school. Okay. Why? It's not true. That's not how the universe works. But it's still useful, and that's why it's taught, because you can use it to
Starting point is 01:51:19 predict motion, outcomes, that kind of thing. What's incorrect about Newtonian physics, in the sense that they shouldn't be teaching it? I mean, today, people believe that the truth is that there's relativity, like, gravity is not a force, there's these planes and stuff, whatever. There are other models to describe how the universe works, and Newtonian physics is considered outmoded. But it still has utility in the fact that you can use it to predict the...
Starting point is 01:51:47 So you're talking about in terms of quantum physics and string theory and a lot of these more... Yeah, it's like relativity at the large scale, quantum physics at the small scale. And even those things are most likely not true in the sense that they aren't consistent with each other and people are trying to unify them and find something that does make sense at both of those scales.
Starting point is 01:52:08 The history of science is a history of things that weren't actually true, you know. Bohr's model of the atom, Newtonian physics, Copernicus's model of the solar system. People have these ideas of how things work, and the reason that people are drawn to them is because they actually have utility. It's like, oh, we can use this to predict the motion of the planets. Oh, we can use this to send a rocket into space, or we can use this to have better outcomes for some medical procedure or whatever. But I think it's not actually truth. The point of it isn't truth. The point of it is that we have some utility that we find in these things. And I think that, you know, when you look at the emergence of science, and people conceiving of it as a truth, it became this new
Starting point is 01:52:55 authority that everyone was trying to appeal to, you know, if you look at all of the 19th century political philosophy. I mean, okay, I think the question of truth is even a little squishy with the hard sciences, right? But once you get into soft sciences, like social science, psychology, then it's even squishier. These things are really not about truth, they're about some kind of utility. And when you're talking about utility, the important question is, useful for what and to whom? And I think that's just always the important question to be asking, right? Because when you look at all the 19th century political writing, it's all trying to frame things in terms of science in this way that
Starting point is 01:53:33 it just seems laughable now. But, you know, at the time they were just like, we're going to prove that communism is the most true social economic system in the world. There were whole disciplines of that, people had PhDs in that, there were whole research departments in the Soviet Union, people doing that. And we laugh about that now, but I don't think it's that different than, like, social science in the West, you know. And so I think if you lose sight of that, then you try to frame social questions in terms of truths. Like, this is the kind of content that we want, and we can rigorously define that, and we can
Starting point is 01:54:10 define why that's going to have the outcomes that we want it to. But once you get on that road, it's like, okay, well, terrorist stuff, we don't like terrorist stuff, so we're going to rigorously define that, and then we have a policy, no terrorist stuff. And then, you know, China shows up and they're like, we've got this problem with terrorists, the Uyghurs, you know, we see you have a policy, you need to... And if instead, I think, people from the beginning acknowledged that this isn't some... that all objectivity is just a particular worldview, and that we're not going to rigorously define these things in a way of what is true and what isn't, then I think we would have better
Starting point is 01:54:48 outcomes. But that's my weird take. I mean, from the perspective of Signal, you know, it's like, do you know what's trending on Signal right now? No. Nothing. No, okay. But that's... it's not a social media platform. But isn't there a weird thing when you decide that you have one particular ideology that's being supported and another particular ideology that is being suppressed? And this is what conservative people feel when they're on social media platforms. Almost all of them, other than the ones we talked about before, Parler and Gab and the alternative ones, they're all very left-wing in terms of the ideology that they support, the things that can get you in trouble on Twitter. What did you say?
Starting point is 01:55:37 But then the President of the United States just constantly violated every policy that they had. But he's ridiculous. That's a ridiculous example, right? Because he's one person, and they've actually discussed this, that he and his tweets are more important. It's more important that they allow these tweets to get out. First of all, you can understand how fucking crazy this guy is. And second of all, it's newsworthy.
Starting point is 01:56:03 He's the leader of the, you know... And also it would be very costly from a business perspective if... Yes, very likely. And kind of amazing that he didn't do anything along the way, while he was witnessing people get deplatformed, and particularly this sort of bias towards people on the left and this discrimination against people on the right. There's people on the right that have been banned and shadow banned and blocked from posting things. And you run into this situation where you wonder, what exactly is a social media platform? When it's just a small private company, and maybe you have some sort of a video platform
Starting point is 01:56:51 and there's only a few thousand people on it, and you only want videos that align with your perspective, okay, you're a private company, you can do whatever you want. But when you're the biggest video platform on earth, like YouTube, and you decide that you are going to take down anything that disagrees with your perspective on how COVID should be handled, including doctors. Like, this is one of the things that happened. Doctors that were stating there's more danger in lockdowns, there's more danger in this than there is in the way we're handling it. There's more danger in the negative aspects of the decisions that are being made than it would
Starting point is 01:57:34 be to let people go to work with masks on. And then those videos just get deleted. Those videos get blocked. There's people that are opposed to current strategies with all sorts of different things, and those videos get blocked. So there's an ideological basis in censorship, and so you have to make a decision: what are these platforms? Are these platforms simply just a private company, or is it a town hall? Is it the way that people get to express ideas? And isn't the best way to express ideas to allow people to decide, based on the better argument, what is correct and what's incorrect? This is what freedom of speech is supposed to be about. It's supposed to be about: you have an idea, I have an idea, these two ideas come together, and then the observers get to go, hmm, okay, well, this guy's got a lot of facts behind him.
Starting point is 01:58:28 This is objective reality. This is provable, and this other guy is just a crazy person who thinks the world's hollow. Okay, this is the correct one. There's going to be some people that go, no, there's a suppression of hollow earth, and hollow earth is the truth, and hollow earth facts, and hollow earth theory. But you've got to kind of let that happen. You've got to kind of have people that are crazy. Remember the old dude that used to stand on the corners
Starting point is 01:58:56 with the placards on, the world is ending tomorrow? They're still there. Yeah, but those are on Twitter now, right? But those people, no one said, you've got to get rid of that guy. You would drive by and go, look at this crazy fuck. Those crazy fucks making YouTube videos, those videos get deleted.
Starting point is 01:59:12 I don't know if that's good. I kind of think that you should let those crazy fucks do that because it's not going to influence you. It's not going to influence me. It's going to influence people that are easily influenced. And the question is, who are we protecting and why are we protecting these people? Well, okay. But I think, in my mind, what's going on is, like, the problem is that it used to be that
Starting point is 01:59:34 some person with very strange ideas about the world, wearing a sign on the street corner, shouting, was just a person with very strange ideas about the world, wearing a sign on the street corner, shouting. Right. Now there's somebody with very strange ideas about the world, and those ideas are being amplified by a billion-dollar company, because there are algorithms that amplify that. And what I'm saying is that instead of actually talking about that, instead of addressing that problem, those companies are trying to distract us from that discussion by saying,
Starting point is 02:00:07 we're just going to remove that person's content. It's like, oh yeah, amplifying that, that was probably a bad idea. We're going to remove their content. Instead of looking at how to change the algorithm so that we're not amplifying things that ultimately don't serve us well. Would the correct way to handle it, would it be to make algorithms illegal in that respect? Like to not be able to amplify or detract?
Starting point is 02:00:29 To not be able to ban, shadow ban? Or just to have whatever trends, trend. Whatever is popular, popular. Whatever people like, let them like it. And say, listen, this thing that you've done by creating an algorithm that encourages people to interact on Facebook, encourages people to spend more time on the computer. What you've done is you've kind of distorted what is valuable to people. You've changed it and guided it in a way that is ultimately perhaps detrimental to society, so we are going to ban algorithms. You cannot use algorithms to dictate what people see or not see. You give them a fucking
Starting point is 02:01:13 search bar, and if they want to look up UFOs, let them look up UFOs. But don't shove it down their throat because you know they're a UFO nut. Don't curate their content feed. Yeah, I mean, I think... It's complicated because, one, I have no faith in... like when you say ban or make it illegal or whatever, I have zero faith in the government being able to handle this. Yeah, nor do I. Every time I see a cookie warning on a website, I'm like, okay, these people are not the people that,
Starting point is 02:01:43 this is what they've given us after all this time. You know, it's like, these people are not going to solve this for us, you know. And also, I think a lot of what it is, the dissatisfaction that people feel and the discomfort that people feel and the concern that people have, is a concern about power. That right now, these tech companies have a lot of power. And I think that the concern that is coming from government is a concern for their power. You know,
Starting point is 02:02:10 that, like, the right has made such a big deal about deplatforming. And I think it's because they're trying to put these companies on notice, you know,
Starting point is 02:02:23 that it's like, you fuck with us, we will take power. But they've done nothing about it. Don't you think that they've actually made a big deal about deplatforming because the right has been disproportionately deplatformed? I think the right is doing fine. How so? I don't know what the numbers are, but I feel like... it's like the fact that Trump... Because you're on the left. ...was still on. Yeah, but that's Trump. He's an anomaly. You can't really... Okay, I guess maybe let me just reframe this to say that I think it's interesting that we've hit an inflection
Starting point is 02:03:03 point right where like the era of utopianism with regards to technology is over. Yeah. That it's just like, you know, after 2016, it was just like big tech has zero allies anymore. You know, on the left, everyone's just like, you just gave the election to Trump, you know? And on the right, they're just like,
Starting point is 02:03:17 you just removed somebody from YouTube for calling gay people an abomination. Fuck you. You know, like it's, they have no allies. No one believes in the better and brighter. No one believes that Google is organizing the world's information. No one believes that Facebook
Starting point is 02:03:31 is connecting the world. I think there's an opportunity there. We're in a better situation than we were before. All the cards are on the table. People are more and more understanding how it is that these systems function. I think, you know, we increasingly see that people understand that this is really about power, it's about authority, and that we should be trying
Starting point is 02:03:54 to build things that limit the power that people have. If you had your wish, if these social media platforms, whether it's video platforms like YouTube or Facebook or Twitter, if they called you up and said, Moxie, we're going to let you make the call, what should we do? How should we curate this information? Should we have algorithms? Should we allow people? Should we just let it open to everything, everything and anybody? What should we do? Well, I mean, this is what we're trying to do
Starting point is 02:04:29 with Signal, you know. But it's different, right? Because you're just a messaging app. We're just a messaging app, but... No, I don't mean to say that. It's a very good messaging app that I use. No, I understand what you're saying. But I think, you know, the way that messaging apps are going, there's a trajectory where a project like Signal becomes more of a social experience, and the things that we're building extend beyond just sending messages, particularly as more and more communication moves into group chats and things like that. And, you know, the foundation that we're building it on is a foundation where we know nothing. You know, it's like if I looked up your Signal account record right now, of all the
Starting point is 02:05:06 information that we had about you on Signal, there's only two pieces of information: the date that you created the account and the date that you last used Signal. That's it. That's all we know. You know, if you looked on any other platform,
Starting point is 02:05:18 it would be... your mind would be blown. No, it's admirable what you're doing. And that's one of the reasons why I wanted to talk to you. And so I think that foundation gives us... it's like, now that we have that foundation, there's a lot that we can build on it, right? And would you do a social media
Starting point is 02:05:34 app? Well, I think, you know, some of the stuff that we're working on now, just like moving away from phone numbers. You can have usernames that you can post more publicly, and then we have groups now, you have group links, and then maybe we can do something with events. You know, we're sort of moving in the direction from an app that's good for communicating with connections you already have to an app that's also good for creating new connections. Would you think that social media would be better served with the algorithms that are in place, and with the mechanisms for determining what's trending in place, and their trust and safety or whatever content monitoring policy they have now, or have it wide open, Wild West? I mean, I think it depends. When you say better, you know, better for what, right?
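(An editorial aside on the "we know nothing" account record Marlinspike described a moment earlier: as a rough sketch of that data-minimization idea, with hypothetical field names not taken from Signal's actual code, the entire server-side record could be as small as this.)

```python
from dataclasses import dataclass
from datetime import date

@dataclass(frozen=True)
class AccountRecord:
    """Hypothetical sketch of a data-minimized account record:
    the service keeps only when the account was created and when
    it was last seen. No name, no contacts, no message content."""
    created_on: date
    last_used_on: date

# The only two facts the server would hold about a user.
record = AccountRecord(created_on=date(2020, 1, 15),
                       last_used_on=date(2020, 11, 30))
print(record.created_on.isoformat())  # 2020-01-15
```

The point of the sketch is what is absent: there is simply no field where contact lists or message history could live.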
Starting point is 02:06:29 Better for humanity. Yeah, no, I think... Censorship is better? No, no, no. I think bad business models create bad technology, which has bad outcomes. You know, that's the problem we have today, right? So the problem is that there's a financial incentive for them to... Yeah. If you look at the metrics that we've talked about, like what Facebook cares about, it's just time that
Starting point is 02:06:52 you spent looking at the screen on Facebook. If Signal were to have metrics, our metrics would be: what we want is for you to use the app as little as possible, to actually have the app open as little as possible, but for the velocity of information to be as high as possible. So it's like you're getting maximum utility. You're spending as little time as possible looking at this thing while getting as much out of it as you can. How could that be engineered, do you think? That's what we're trying to do.
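(The "velocity of information" metric described here can be sketched in a few lines. This is a hypothetical illustration of the idea, not anything from Signal's codebase: score communication accomplished per second of screen time, so that less time in the app with the same utility scores higher.)

```python
def information_velocity(messages_exchanged: int, seconds_in_app: float) -> float:
    """Hypothetical metric in the spirit described: reward getting
    more communication done per second of screen time, rather than
    rewarding screen time itself."""
    if seconds_in_app <= 0:
        raise ValueError("seconds_in_app must be positive")
    return messages_exchanged / seconds_in_app

# Same 20 messages exchanged; the user who needed less screen time
# scores higher, the opposite of an engagement-maximizing metric.
quick = information_velocity(20, 60.0)   # 20 messages in 1 minute
slow = information_velocity(20, 600.0)   # 20 messages in 10 minutes
print(quick > slow)  # True
```

An engagement-driven product would optimize the denominator upward; this metric treats it as a cost.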
Starting point is 02:07:16 So you're trying to do that with a social media app as well? Well, I mean, we're sort of moving in that direction, right? And I think once you start from the principle of, well, we don't have to have infinite growth, we don't actually have to have profit, we don't have to have returns, we're not accountable to investors, we don't have to satisfy public markets. We also don't have to build a pyramid scheme where we have, you know, two billion users so that we can monetize them to a few hundred thousand advertisers. We don't have to do any of that. And so we have the freedom to pick the metrics that we think are the ways
Starting point is 02:07:52 that we think technology should work, that we think will better serve all of us. So what would serve us better is a bunch of wild hippies like yourself, that don't want to make any money at all, putting together a social media app. If you work at Signal, you get paid. Oh, yeah, I'm sure. I mean... Yeah, the company itself, as a corporation, you get paid, but that's it. Yeah. I mean, how do you generate the income? Well, you know, we do it by tying ourselves to a community of users instead of advertisers, right? So... But where's the money coming from,
Starting point is 02:08:25 though? From people who use Signal. So, similar to... do they pay for it? No, no, it's donation-based. It's similar to Wikimedia. Oh, you know, Wikipedia exists, there's no company, there's no... Well, that would be great if they could figure out a way to develop some sort of a social media platform that just operated on donations and could rival the ones that are operating on advertising revenue. Because I agree with you that that creates a giant problem. And that's what we're
Starting point is 02:08:54 working on, slowly. Do you think that what... So you just look at it in terms of bad business model equals bad outcome. That's how you look at all these. And it's also, by the way, why we have people mining cobalt. Yeah. And you don't think that they can regulate their way out of this situation with technology? I'm not super
Starting point is 02:09:19 optimistic. Yeah. Just based on, you know... even the hearings, you know. So do you think that... Yes, the hearings were amateur hour. Yeah, there were some ridiculous questions. Yeah, I mean, it's just like they're talking to the wrong people. They don't understand how stuff works. You know... That's not Google, that's Apple. Don't you have a team of people who... Come on. Yeah. It's fascinating to watch, right? It's like your dad who doesn't know how to... how do I get the email? It's like these people are not going to save us, man. You know, and it's like anything that they do will probably just make things worse. Do you think that it's a valid argument that conservatives have, though, that they're being censored and that their voice is not being heard?
Starting point is 02:10:02 I know what you said in terms of, you know, that if someone had something on YouTube that said that gay people are unhuman and they should be abolished and banned, and delete that video, I get that perspective. But I think there's other perspectives, like the Unity 2020 perspective,
Starting point is 02:10:22 which is not in any way... Yeah, I mean, I don't know what happened with that, but I feel like... I think it could be a part of this thing of just like, well, we create this policy, and we define things this way, and then a lot of stuff just gets caught up in it, you know? Where it's just like, now you're taking down content about the Uyghurs because you wanted to do something else. You know, if people would just be more honest about, like, there is not really an objectivity, and we're looking for these specific outcomes, and this is why, then I think maybe we would have better results. Well, how does one fix this, though? How does one... Like, you worked at Twitter.
Starting point is 02:10:55 You kind of understand these better than most, these social media platforms. How would one fix this? If they hired you, if they said, hey, Moxie, we're kind of fucked. We don't know how to fix this. Well. Is there a way? Because it seems like they make so much money. Yeah.
Starting point is 02:11:13 If you came along and said, yeah, well, you got to stop making money. They'd be like, get rid of that fucking nut. Exactly, exactly. Look at him, this goddamn sailor. Yeah. What's he talking about? What is he talking about? Fuck out of here.
Starting point is 02:11:23 Stop making money. Yeah. What, you want to play rock paper scissors? You're crazy, man. How do you fix this? I mean, one thing I'm actually a little encouraged by is the organizing and unionization stuff that's been happening in the tech industry. So there have been a couple of walkouts, and there's some increased communication among tech workers. Normally you think about...
Starting point is 02:11:50 I'm not totally aware of this. What have they been organizing and unionizing about? Well, normally you think about unionization as a process for improving material conditions for workers. And there's some aspect of this in the organizing that's been happening. Where have they been doing this? Google is the big,
Starting point is 02:12:11 where a lot of the activity has happened, but it's happening across the industry. What are their objectives at Google? At Google, there were some walkouts. The objectives... you should talk to Meredith Whittaker about this, actually.
Starting point is 02:12:25 She's really smart and has a lot to say. Shout out to Meredith. Yeah. She and other people were working there, and they were organizing for, one, trying to apply the protections and benefits that full-time workers there had to a lot of the temporary workers, like the people who work in security, the people who are working in the cafeteria, the people who work driving buses and stuff like that, who are living a lot more precariously. But also for creative control over how the technology that they're producing is used. So Google was involved in some military contracts that were pretty
Starting point is 02:13:05 sketch. Yeah. Like applying machine learning, AI stuff, to military technology. And then finally, there had been a lot of high-profile sexual harassment incidents at Google, where the perpetrators of sexual harassment were usually paid large severances in order to leave. And so they had a list of demands, and a lot of people walked out. I don't know what the numbers were, but a lot of people... they managed to organize internally and walked out.
Starting point is 02:13:39 And I think stuff like that is encouraging, because we look at the hearings, and it's like the people in Congress don't even know who's the right person to talk to. You know, it's old people talking about technology they don't understand. The people who really do understand technology are the people who are working in these companies. And a lot of times they don't feel good about how what they're creating is being applied. But isn't that another issue, where you're going to have people who have an ideological perspective that may be opposed to people that have a different ideological perspective, but they're sort of disproportionately represented on the left in these social media
Starting point is 02:14:23 corporations. When you get kids that come out of school, they have degrees in tech or they're interested in tech, they tend to almost universally lean left. Maybe, but I think, when it comes to the technology, what almost everyone can agree on is that the amount of money and resources that we're putting into surveillance, into ad tech, into these algorithms that are just about increasing engagement, that they're just not good for the world. And if you put a different CEO in charge who says, no, this is what we want, this is how we want to allocate resources, this is how we want to create the world... then you can't fire all those people. I understand what you're saying. So they'd have to get together and unionize and have a very distinct mandate, very clear, that this is how we want to use this.
Starting point is 02:15:20 We want to go back to do no evil or whatever the fuck it used to be. Right. Yeah. Well, they don't really have that as a big sign anymore. Do you think that would really have an impact, though? I mean, it seems like the amount of money... when you find out the amount of money that's being generated by Google and Facebook and YouTube and all... The numbers are so staggering that to shut that valve off, to shut that spout, good luck. It's almost like it had to have been engineered from the beginning, like what you're doing at Signal. Like someone had to look at it from the beginning and go, you know what? If we rely on advertiser revenue, we're going to have a real problem.
Starting point is 02:16:04 Yeah. But I think it's, yeah, exactly. I mean, you know, I think you're right. And there's, you know, part of the problem with just relying on tech workers to organize themselves is that they are shareholders of these companies. Right. They have a financial stake in their outcome. And so that influences the way that they think about things.
Starting point is 02:16:23 But, you know, I think another aspect to all of this is that I think people underestimate just how expensive it is to make software. And another thing that I think would really improve things is making software cheaper. You know, right now it's moving in the opposite direction. It's getting harder, more expensive, to produce software. How so? It used to be that if you wrote a piece of software, you just wrote it once, you know, for the computer, and then that was your software. Now if you want to write a piece of software, you have to write it at least three times. You have to write it for iPhone, you have to write it for Android, you
Starting point is 02:16:56 have to write it for the web, maybe you need a desktop client. So it's like you need three or four times the energy that you used to have. And the way that software works... not worth going into. But it's just getting more expensive. What do you personally use? Are you one of those minimalist dudes? I notice you have a little tiny notebook here. Oh, yeah. And then you have two phones.
Starting point is 02:17:25 Yeah, I'm like, I have to have... I try to be like... I just want to... You're also one of those renegades with no phone case. Oh, yeah, man. I feel like that's like... You and Jamie should get together and talk about it. He's radical.
Starting point is 02:17:38 I mean, it's like... You know, it's like industrial designers put all of that effort into creating that thing, and then you just wrap a weird thing around it. It's fucking glass and it costs a thousand bucks. If you drop it with this thing on it, it doesn't get hurt. And see this? This little thing right here? See, I stick my finger in there and I can use it. I can text better.
Starting point is 02:17:55 Really good. Yeah. And also, if I want to watch a video, that'll prop it up. Ta-da. You know how that works? Ta-da. Isn't that better? Isn't that better than no case?
Starting point is 02:18:09 I mean, some things I actually want to make more difficult for myself. But I have two phones just because I'm trying to. I always just want to keep tabs on how everything works everywhere. So you have an Android and an iPhone. Do you keep things stripped down? No, I'm pretty. I mean, I don't actually use
Starting point is 02:18:27 TikTok. Well, okay, my problem is that I spend all day, I think, you know, sometimes I go through this thing where cryptography will be in the news or something. There'll be some geopolitical thing that's happening and someone like Vice or something will get in touch with me and they'll be like,
Starting point is 02:18:43 hey, we want to do a thing, like a video where we follow you around for a day, like a day in the life, you know, because it's so exciting. Sounds good for them, annoying for you. Well, the thing I'll usually write back is like, okay, here's the video: me sitting in front of a computer for eight hours. And they're like, oh, we can't make that video, no one would want to watch that. Yeah, what you need to do is, they take you to a yoga class, and you go to an organic food store, and you talk to people about their rights, and then...
Starting point is 02:19:13 Yeah, exactly. That's what they want. Unfortunately, I don't even want to watch the movie of my own life. Yeah. But so that is my life. It's like I spend so much time looking at a computer for work that it's hard for me to continue looking at screens and stuff. Yeah, I can only imagine. But I try to be, like, normal. There's just... in the history of people who
Starting point is 02:19:38 were doing or building cryptography, stuff like that, there was this period of time where the thesis was basically like, all right, what we're going to do is develop really powerful tools for ourselves, and then we're going to teach everyone to be like us. You know, and that didn't work, because we didn't really anticipate the way that computers were going. So I try to be as normal as possible. I just have a normal setup. I'm not... You know, I used to have a cell phone where I'd soldered the microphone differently, so there was a hard switch that you could turn it off. Whatever. Really, you did that? Yeah. Because it's
Starting point is 02:20:11 like, you know, whatever, you start thinking about how all this stuff works. Do you ever fuck around with Linux phones or anything like that? No, no. I just try to be normal, you know. Okay. Yeah, I still do run Linux on a desktop, just because I've been doing it forever. And you keep a Moleskine for what? Just notes. You don't put them on your phone? Sometimes I do. I like writing more, I guess. So you just do it just because you enjoy it?
Starting point is 02:20:37 Yeah. But I guess you're right, maybe I feel the forces of darkness are not going to compromise that. Yeah. Does it...
Starting point is 02:20:49 do you feel like you have extra scrutiny on you because of the fact that you're involved in this messaging application that Glenn Greenwald and Edward Snowden and a bunch of other people that are seriously concerned with security and privacy use? That maybe people are upset at you that you've created something that allows people to share encrypted messages?
Starting point is 02:21:19 I mean, maybe. I mean, I think. Because you've kind of cut out the middleman, right? You've cut out the third-party door. Yeah. And I think... But in some ways, that means that there's less pressure on me because, you know, it's like if you're the creator of Facebook Messenger
Starting point is 02:21:34 and your computer gets hacked, that's everyone's Facebook messages, you know, gone. And for me, if my computer gets hacked, I can't access anyone's Signal messages, whether I get hacked or not, you know? Right. And so I have sort of less liability in that sense. There was a weird period of time where it was very difficult for me to fly commercially, like on an airplane, and I don't know why. I think it had something to do with a talk that someone gave about WikiLeaks, and they mentioned my name.
Starting point is 02:22:06 You were getting flagged? Yeah, it was very annoying. I would go to the airport, and I wouldn't be able to print a ticket at the kiosk. I had to go talk to a person. They had to call some phone number that would appear on their screen and then wait on hold for like 45 minutes to an hour to get approval, and then they would get approval to print the ticket. So you had to anticipate this when you traveled?
Starting point is 02:22:25 So you had to go there way in advance? Way in advance. And then anytime I traveled internationally, on the way back through customs, they would seize all of the electronics that I had. Jeez. And confiscate them. The U.S. government would do this?
Starting point is 02:22:35 Yeah. Customs and Border Protection. They would seize your shit, and would you get it back? They would eventually send it back, but you just had to throw it out, because who knows what they did to it? I would want to give it to someone and go, hey, tell me what they did. Yeah. Could you do that? Is it possible to reverse-engineer whatever... I never spent time on it. How much time did they have your shit for? It'd be like weeks. Weeks? Yeah.
Starting point is 02:23:01 Weeks. Did you have to give them passwords and everything? Well, that's the thing. They would stop you, and they would be like, hey, we just need you to type in your password here so that we can get through the full disk encryption. And I would be like, no. And they would be like, well, if you don't do that, we're going to take this, and we're going to send it to our lab, and they're going to get it anyway. And I would be like, no, they're not.
Starting point is 02:23:18 And they would be like, all right, we're going to take it. You're not going to have your stuff for a while. You sure you don't want to type in your password? I would be like, no. And then it would disappear, and it would come back weeks later. How bizarre. Yeah. And there was no... I mean, they didn't have a motive? There was no... That's the thing, you never know why. No, but I'm saying they didn't say, hey, you're thought to have done this, or there's some... No, they would always just be like, oh no, this is
Starting point is 02:23:46 just random or whatever. But there would be two people at the exit of the plane with photographs of me, you know, waiting for me to step off the plane, and they would escort me. They wouldn't even wait for me to get to the... So did you have to have, like, a burner laptop? I just wouldn't travel with electronics, you know, because it was just... Even your phone? Yeah, even my phone. Oh, fuck. Wow. That was only internationally, though, because they can't do that domestically. So domestically, you just had long waits,
Starting point is 02:24:13 and then they would eventually give you a ticket? Yeah, they would eventually give you a ticket, and then you'd get the selective screening, where they would take all the stuff out of your bag and, like, you know, feel through your stuff. They'd touch your dick, too, right? And then at every connection, the TSA would come to the gate of the connecting flight,
Starting point is 02:24:28 even though you're already behind security, and do it again at the connection. Really? Yeah. I don't know. It was weird. It was like that at connections, too. Yeah. Yeah.
Starting point is 02:24:37 So they're trying to fuck with you. I think so. Yeah. I don't know. And how long did that last for? That was a couple of years. Yeah. And when did it go away the
Starting point is 02:24:45 day it went away, were you like, oh, yep? Yeah, one day it just stopped. That really did change the game. What year did it go away? When Trump got into office? No, it was way before that. Yeah, I forget. I was thinking on the way here, it's funny how I remember after the last election everyone was talking about California leaving the United States, California seceding. You remember that? Hilarious. Yeah. And now everyone's talking about leaving California, like, after this. Yeah, imagine that. President Newsom. Yeah, locked-down communist state. But do you remember, people discovered that the whole CalExit movement was started by a guy that lived in Russia. Oh, it was one of those IRA things, Internet Research Agency scams. But it wasn't. I actually
Starting point is 02:25:30 tracked the guy down. Oh, yeah? In Moscow, one time. You tracked him down? He's just some guy. Well, did he do it for a goof? No, he, like, really believes California should leave. Yeah, he lived in California and had been for years trying to foment this Cal Exit thing. And he has all the stats on why it would be better for California and all this stuff. And then he sort of thought,
Starting point is 02:25:55 well, this isn't working. And he really liked Russia for some reason. So he moved to Russia just before the election, not knowing what was going to happen. And then when Trump won, people were like, wait a second, fuck, maybe California should get out of here. And they just found this campaign that already existed, and everyone sort of got behind it. And he was just like, oh, shit. And he lives in Russia now, you know. But he, like, didn't really understand, um, optics,
Starting point is 02:26:19 I think, where, like, the way that everyone found out that he lived in Russia was that he opened a California embassy in Moscow. So they, like, announced, you know, Cal Exit has opened the first California embassy in a foreign country, but it was in Moscow. And this was right as all the, like, Russian stuff was happening, you know. Yeah, so if you're conspiratorially minded, you'd have drawn some incorrect conclusions. Yeah, yeah. He was just... I think... I met with him, I, like, hung out with him for a day. I think he really genuinely... So what was your motivation to hang out with this guy for a whole day? I mean, I was just fascinated, you know, because here's this guy, he's, like, doing this kind of ambitious thing, and the optics seem so bad, you know. Yeah, I think he reminded me of, like,
Starting point is 02:27:02 that Hannah Arendt quote that's, like, um, you know, if the essence of power is deceit, does that mean that the essence of impotence is truth? You know, he sort of believed that, um, just, like, the facts were enough, you know. It's just, like, the stats of, like, yeah, we spend this much money on, like, defense spending, and if we stopped, we would have, like, so much money. Yeah, if California was a country, we would still have, like, the fourth largest military in the world. And, you know, it's just, like, the numbers actually are compelling, you know. And it was just sort of, like, people will just see the truth, you know. And I was like, dude, I think maybe you should, like, not live in Russia anymore, you know. It was... Yeah, why
Starting point is 02:27:42 did he go to Russia? I don't know, he just... He had been teaching English, and I think he just sort of ended up liking Russia. And so, yeah, he just decided to move there. And that was... I was on the way with a friend to Abkhazia. Have you ever heard of that place? No. It's an autonomous region of the country of Georgia. And it's kind of interesting. There's all these
Starting point is 02:28:06 autonomous regions in the world that are essentially their own countries, you know, um, but they're not recognized by the UN or other countries. You know, like Texas. You're in one right now. Uh, I mean, these places are, like, you know, militarized border, like, they have their own, like, you know... But they're not recognized by the UN. Yeah. And so they all recognize each other. And it's, like, you know, if you want to be a country, it's kind of interesting, you need a lot of stuff, you know. You need, like, a flag, you need, like, a national bird, you need, like, an anthem, whatever. And you need a soccer team. You definitely have to have a soccer team, you know. Interesting. So these countries all have their own soccer teams, but they can't play
Starting point is 02:28:43 in FIFA, because they're not recognized by the UN, so FIFA can't recognize them. So they have their own league. It's, like, the league of unrecognized states and stateless peoples, uh, and they have their own World Cup, and they held it in Abkhazia. How many different countries are there that are like this? There are a lot. How many? I mean, I don't know how many teams are in this league, called CONIFA. I mean, it's 20-plus. So there's 20-plus unrecognized countries
Starting point is 02:29:11 or autonomous regions. And also stateless people. So, like, the Kurds, you know. There's people from the Chagos Islands, who were basically evicted for a U.S. military base, and they're a diaspora. There's places like Somaliland,
Starting point is 02:29:25 Transnistria, South Ossetia, Laplandia. It's kind of interesting. So I went with a friend to Abkhazia for the World Cup of all the unrecognized states. How was that? It was awesome. Yeah? It was like, yeah, it was really interesting.
Starting point is 02:29:41 I mean... The smile on your face. This is the biggest smile you've had the entire show. It sounds like it was a great time. I mean, it just is so fascinating to me. And I think it's, like, an interesting... You know, it's like, in a way, I feel like society moves by, like, pushing at the edges, you know, that it's the fringes that end up moving the center. I feel like, um, you know, looking at the margins of the way politics works is an interesting view of how everything else works.
Starting point is 02:30:11 Going to Abkhazia, it was so crazy getting there. We travel all through Russia. We get to this militarized border. You go through these three checkpoints that aren't supposed to exist, but obviously exist. You get to the other side, and it's just the same as where you just were, you know. You're like, you guys fought a brutal civil war, you know, with, like, genocide, like, full-on, you know, like, crazy shit. And it's just kind of
Starting point is 02:30:38 the same, you know. Like, was it worth it? Like, what's the deal, you know? And I feel like it's this thing you see again and again, um, that, like, the institutions that we're familiar with in the world that exists are, like, the institutions of kings, you know? It's, like, you know, police, military, a legal apparatus, tax collectors, you know? And that, like, every moment in history since then has been about trying to, like, change ownership of those institutions. And it's always sort of dissatisfying, you know. And, like, just seeing that happen again, and just, like, you know, realizing that maybe what we should be doing is actually
Starting point is 02:31:14 trying to get rid of these institutions, or change these institutions in some way. Don't you think there's a very slow rate of progress, but ultimately progress? Like, if you follow Pinker's work, it looks at all the various metrics, like murder, rape, racism, crime, all these different things, and over time we're clearly moving in a better direction. Maybe. I mean... And do you think it's just, like... You know, I was listening to this podcast today. We were talking about religion, and it was discussing the Bible, and they were talking about all the different stories that are in the Bible, many of them that are hundreds of years apart, that were collected and put into that.
Starting point is 02:32:06 put into that. Just stop and think about a book that was written literally before the Constitution was drafted, and that book is being introduced today as gospel, and that there's a new book that's going to be written 200 years from now, and that will be attached to the new version of the Bible as well. And then one day someone will come across this, and it will all be interpreted as the will and the words of God that all came about in one particular era. It all came down from God. But now we know that these things are... You're dealing with giant spans of time.
Starting point is 02:32:40 Yeah, yeah, yeah. But today, these spans of time are far shorter. Like, going from Alan Turing in 1950 being chemically castrated for being gay, to, in my lifetime, seeing gay marriage go from being something that was very fringe when I was a boy living in San Francisco, to universal across the United States today, at least mostly accepted by the populace, right? That this is a very short amount of time where a big change has happened,
Starting point is 02:33:13 and that these changes are coming quicker and quicker and quicker. I would hope that this is a trend that is moving in the correct direction. Yeah, certainly there are some things that are getting better, yeah. And I feel like, to me, it's important to realize, for a lot of those things, like the things you mentioned,
Starting point is 02:33:30 like gay marriage, I think it's important to realize that a lot of that progress would not have happened without the ability to break the law, honestly, you know. Right, right. Like, how would anyone have known that we wanted to allow same-sex marriage if no one had been able to have a same-sex relationship, because sodomy laws had been perfectly enforced, you know? Yeah. How would we know that we want to legalize marijuana if, like, no one had ever been able to consume marijuana? Right. Right, yeah. So I think, you know, a lot of the fear around, like, increased surveillance, surveillance data or whatever, is that
Starting point is 02:34:05 that space dissipates. Yes. Yeah. But, you know, on the other hand, it's like, we're living in the apocalypse, you know? That, like, if you took someone from 200 years ago, who used to be able to just walk up to the Klamath River and dump a bucket in the water and pull out, you know, 12 salmon, and that was, you know, their food, and you were like, oh, yeah, the way it works today is you go to Whole Foods, and it's $20 a pound, and it's, you know, pretty good... They'd be like, what have you done? Oh, my God. You used to be able to walk across the backs of the salmon, you know, across the whole river. Well, we're trying to avoid slipping even further into that apocalypse. I don't know if you, uh, follow what's going on in Bristol Bay, Alaska,
Starting point is 02:34:41 with the Pebble Mine? No. Oh, it's crazy. They're trying to... And, you know, according to what Joe Biden said when he was running for office, when he's in office, that will not happen. But they're trying to do essentially the biggest mine in the world, and it would destroy the salmon population. It would destroy it. It would literally wipe out not just a gigantic industry, but a gigantic chunk of the
Starting point is 02:35:13 salmon. I forget which kind of salmon it is. Um, I don't want to say it's Chinook. I forget what kind of salmon it is, but it's the biggest population of them, certainly in America, but I think in the world. I think it's responsible for an enormous number of jobs. Apparently, there's fucking billions of dollars worth of gold and copper down there. Earthworks, what's at stake? An average of 40 to 50 million wild salmon make the epic migration from the ocean to the headwaters of Bristol Bay every year. Like no place on earth. The Bristol Bay watershed.
Starting point is 02:35:55 They've been working to try to make this mine a reality for, I think, a couple of decades now, and people have been fighting tirelessly to educate people on what a devastating impact this is going to have on the ecology of that area, and the fact that the environment will be permanently devastated. There's no way of bringing this back, and there's no way of doing this without destroying the environment, because the specific style of mining that they have to employ in order to pull that copper and gold out of the ground involves going deep, deep into the earth to find these reservoirs of gold and copper. And there's sulfur they have to go through, and then they have to remove the waste. And mining companies have invested hundreds of millions of dollars in this and then abandoned it. Yeah. So they were like, we
Starting point is 02:36:41 can't fucking do this. And then people are like, we can do it. And it's other companies... I don't believe the company that's currently involved is even an American company. I think it's a foreign company, I think they're from Canada, that's trying to do this. I don't know which company it is, but my friend Steve Rinella from the MeatEater podcast... I want to recommend this podcast, because he's got a particular episode on that, where he talks to this one guy who's dedicated the last 20 years of his life trying to fight this. Let me just find it real quick, because it's really, it's pretty intense.
Starting point is 02:37:29 And it's terrifying when you see how close it's come to actually being implemented, and how, if it happens, there's no way you pull that back. Like, once they do it... Yeah, it's like all that Standing Rock shit, you know, where they were like, no, the pipeline's going to be fine, no way that it leaks into the water or whatever, you know. And sure enough.
Starting point is 02:37:48 Exactly. Unfortunately, I've already listened to it, so I'm having a hard time finding it in this app. These motherfuckers. Did you find it? They have a bunch of articles about it. Here, previously played. Yeah, A Half-Life of Never. It's the October 5th episode. That's a
Starting point is 02:38:09 good title. Yeah. And the gentleman's name is Tim Bristol, which is kind of crazy. That is his birth name. His name is Tim Bristol, and he's dealing with this Bristol Bay situation. I mean, it's just a random coincidence. Um, and you read all that shit about... Episode 241. Like, when they were building all the dams in California, and it's just like the salmon just bashed themselves to death. Dead, dead. They had to set them on fire.
Starting point is 02:38:36 Seattle, yeah. Same thing that happened up in Seattle. These knuckleheads, they didn't understand migration. These salmon won't go anywhere else. They have one specific river where they were born, and that's where they will die and spawn. Oh, it's crazy. But these assholes that just want copper and gold are willing to do this. And there was this one politician in particular that has a gigantic windfall if he can pull this off, um, or lobbyist, or whatever the fuck he is,
Starting point is 02:39:02 but he stands to make, I think they said, $14 million if he can actually get the shovels into the ground. That's how much he earns. So what are we going to do about it? Kill that guy. Assassination politics? Yes. Kill them all. No.
Starting point is 02:39:19 I'm kidding. Don't get me in trouble. You can get banned off of YouTube for saying something like that. I'm joking. What should we do? We should make people aware of it and make people aware that there are real consequences to allowing politicians to make decisions that will literally affect human beings for the rest of eternity. Because you will never have that population of salmon coming to that particular location that have been going there for millions and millions of years.
Starting point is 02:39:47 And the reason why you won't have them there is because someone is greedy. It's really that simple. I mean, we are getting along fine without that copper and without that gold, and we are using the resource of the salmon, and people are employed that are enjoying that resource, and they're also able to go there and see the bears eating the salmon and seeing this incredible wild place. Alaska is one of the few really, truly wild spots in this country.
Starting point is 02:40:18 Yeah. And someone might fuck that up. And if you get enough greedy assholes together, and they can figure out a way to make this a reality, with the wrong people in positions of power, that's a hundred percent possible. Yeah. Yeah, you might even say we've organized the entire world economy to fuck that up. Yeah. But, you know, I think that it's, like, the question of agency, of, like, how do we affect these processes? Yeah.
Starting point is 02:40:45 It's tough. Well, just, I mean, I was joking, obviously, about killing that person. But there was a recent... One of the Iranian scientists was assassinated. And this brought up this gigantic ethical debate. And we don't know who did it, whether it was the Israeli army. Mossad held a press conference to say, we didn't do it, while wearing T-shirts that said, we definitely did it. Assassinated Iranian nuclear scientist shot with remote-controlled machine gun.
Starting point is 02:41:12 Holy fuck. Holy fuck. It was in broad daylight, which is what I was hearing about. Oh, my God. Dude, we're killing people with robots now, right? That was the other Iranian guy that got killed, Soleimani, who was also killed with a drone. I mean, essentially. This was out of another car, but whatever.
Starting point is 02:41:36 Oh, so a car was driving by and there was a remote-controlled machine gun? Mm-hmm. Fuck. It says he was in a bulletproof car too. Wow.
Starting point is 02:41:50 I don't know... He was in a bulletproof... Like, they knew they were going to kill this guy. Yeah, they did, man. Damn. So this is the question. He got out of the car. Oh, well, there you go. You fucked up. Stay in that bulletproof car.
Starting point is 02:41:57 If you know that a man is going to... Like, what if someone did that to Oppenheimer?
Starting point is 02:42:08 and we need to find that Oppenheimer gentleman, and we need to prevent Big Boy from dropping down and killing how many people? Like half a million people. What? He got shot by that remote. It was 164 yards away. Shot him and his bodyguard, and then the car they were in exploded. Lasted for three minutes.
Starting point is 02:42:33 Like, the whole thing was three minutes. Wow. So there's this ethical dilemma. Like, if someone is actively trying to acquire nuclear weapons, and we think that those people are going to use those nuclear weapons, is it ethical to kill that person? And if that person's a scientist, they're not a... Yeah. Right. Yeah. I mean, I think the causality stuff is really hard to figure out, you know. Um, but I think most of the time it's not about the one person, you know. Maybe sometimes it is, but I feel like assassination politics
Starting point is 02:43:05 in the tech arena does not work, you know. It's like, you can get rid of all the people at the top of these companies, and that's not what's going to do it, you know. There are, like, these structural reasons why these things keep happening over and over again. Yeah, I think they're trying to slow it down, though, right? Like, this is the reason... Do you remember when they employed Stuxnet? Stuxnet? Yeah. I mean, that was for the same reason, right? They were trying to disarm the Iranian nuclear capabilities. That was the same thing.
Starting point is 02:43:35 But that was kind of crazy. They were like, we didn't do it, while wearing T-shirts. They were like, we definitely did this. But they did that with a computer virus, right? Which is pretty fascinating. Yeah. Yeah. And people didn't have a problem with that. They're like, well, that's... Well, I think people did.
Starting point is 02:43:48 Some people had a problem with that, obviously, but... Well, Iranians. Yeah, but also just like, okay... You know, you go down that road, and... Yeah. And, you know, weird things can happen, too. You know, a great example is... So, one of the things that came out
Starting point is 02:44:06 in a lot of the documents that Snowden released was that the NSA had worked with a standards body called NIST in order to produce a random number generator that was backdoored. Random numbers are very important in cryptography
Starting point is 02:44:22 and if you can predict what the random numbers are going to be, then you win. And so the NSA had produced this random number generator that allowed them to predict what the random numbers would be, because they knew of this one constant that was in there. They knew a reciprocal value that you can't derive just by looking at it, but they know because they created it. And they had what they called a nobody-but-us backdoor. NOBUS. Nobody-but-us backdoor. And they got NIST to standardize
Starting point is 02:44:54 this thing, and then they got a company called Jupyter, who makes routers and VPNs and stuff like that. Juniper, sorry, to include this in their products. And so the idea was that the NSA would have these capabilities, they had developed, you know, these vulnerabilities that they could exploit in situations like this, you
Starting point is 02:45:12 But what happened was, in, I think, you know, the early 2010s, Juniper got hacked, and somebody secretly changed that one parameter, that was basically the backdoor, to a different one that they knew the reciprocal value to. And it's most likely China or Russia that did this. And then, what's kind of interesting, is there was a big incident where the OPM, the Office of Personnel Management, I think, was compromised. And they have records on, you know, foreign intelligence assets and stuff like that. Their systems were compromised, it seems like maybe by China. And what's sort of interesting is that they were running the Juniper networking gear that had been, you know, hacked in this one specific way. And so it's kind of possible that, like, you know, the NSA developed this backdoor that they were going to use for situations like this, you know, against foreign adversaries or whatever.
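The core of the backdoor Marlinspike describes is that whoever knows the hidden constant can predict the generator's output, and a predictable generator means predictable keys. A minimal sketch of that property in Python (a toy using a seeded PRNG, not Dual_EC_DRBG or any real cryptosystem; the `keygen` helper and seed value are made up for illustration):

```python
import random

def keygen(seed: int) -> bytes:
    # Derive a 16-byte "secret" key from a deterministic PRNG.
    # The seed stands in for the hidden constant: anyone who knows
    # it can regenerate the generator's entire output stream.
    rng = random.Random(seed)
    return bytes(rng.randrange(256) for _ in range(16))

victim_key = keygen(1234)    # victim believes this is unpredictable
attacker_key = keygen(1234)  # attacker with the secret gets the same bytes
assert victim_key == attacker_key
```

The whole attack is that one assert: knowing the secret collapses the keyspace to a single guess, which is also why such a backdoor stops being "nobody but us" the moment someone else learns or swaps the constant.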
Starting point is 02:46:20 And the whole thing just boomeranged back at them, and the OPM was compromised as a result. Wow. And this is, like, um, I don't know. I think it's easy to look at things like Stuxnet and stuff like that and just be like, yeah, this is harm reduction or whatever, you know. But, um, like, in the end, it can have real consequences. And this is also why people are so hesitant about... The government is always like, well, why don't you develop a form of cryptography
Starting point is 02:46:49 where it works, except for us? We can access the content. And it's like, well, this is why. Because if you can access it, if anybody can access it, somehow that's going to boomerang back at you. Well, I remember when there was a terrorist attack in Bakersfield, California. Is that where it was? I think it was Bakersfield. Yeah... San Bernardino. San Bernardino, thank you. Yeah. And there was an iPhone involved,
Starting point is 02:47:16 and Apple wouldn't open it for them. Yeah, it wouldn't allow the FBI to have access to it. People were furious, and they were like, if this starts here, this does not end well. Yeah. And I kind of saw their point, but I kind of saw the FBI's point too. Like, can you just open this one? This guy's clearly a murderer, has killed a ton of people and created this terrorist incident. Yeah, but I mean, it was a little disingenuous too, right? Where it's like, the FBI had their entire iCloud backup for this device. Like, the only thing they didn't have was, like, the previous two hours or something like that. And the reason they didn't have it is because they fucked up and, like, uh, approached it in the wrong way and got themselves locked out of it. Oh. And so
Starting point is 02:48:00 it's like, it was their own mistake that led to the situation where they didn't have the iCloud backup. So then it's like, what are you really going to get off this phone? You know, it's like the actual possibility of what was there was, like, extremely marginal. So do you think what they really want is the tools to be able to get into other people's phones? You know, where they've just been waiting, you know, for, like, the moment of, like, okay, here we go. We got terrorists. We got, you know, like, yeah. That makes sense.
Starting point is 02:48:24 What did you think when the State Department, or whoever it was, banned Huawei phones? Yeah. Do you think there was...? I mean, yeah, it's mostly political, right? Like, it's complicated, right? Because there's, like, you know, companies like Huawei and, you know, Tencent. OnePlus. The people who make TikTok.
Starting point is 02:48:48 Oh, okay. Yeah. Like, they're, yeah, they're doing, like, all the sketchy shit. But it's the same sketchy shit that, like, all of Silicon Valley is doing, you know? Like, it's not... Is it really? Is that a valid comparison to what they're doing in Silicon Valley? Like, Huawei did have routers
Starting point is 02:49:06 that had third-party access, apparently, and they were shown that information was going to a third party that was not supposed to... Right? Wasn't that part of the issue? Am I reading this wrong? Uh, well, okay, I think there have been a couple of incidents where it's like, yeah, there's, like, data collection that's happening. Yeah. Well, there's data collection happening in, like, all Western products too, you know. Uh, like, I mean, and actually the way the Western products are designed is really scary. I mean, uh, in the telecommunications space, um, there's a legal requirement, called the Communications Assistance for Law Enforcement Act or something like that, that requires telecommunications equipment to have, um, eavesdropping, like, surveillance stuff built into it. Like, when you produce the hardware, in order to sell it in the United States, you have to... Which hardware?
Starting point is 02:49:55 Like, phone switches and stuff, you know. It's like, when you make a normal phone call, it has to have, uh, I forget what they call it, like, uh... The ability to tap? Yeah, they call it something else. But it has to have this ability to record conversations, intercept... Lawful intercept. How does a Signal call work? So Signal calls work not using the traditional telecommunications infrastructure. It is routing data over the internet, and that data is end-to-end encrypted, so nobody can eavesdrop on those calls, including us. But so, communication equipment that is produced in the United States has to have this so-called lawful intercept capability. But what's crazy about
Starting point is 02:50:37 that is, you know, these are U.S. companies, and they're selling that all around the world. So that's the shit that gets shipped to the UAE, and, yeah, you know. So it's the secondary effect thing, of, like, the United States government was like, we're going to be responsible with this or whatever, we're going to have warrants or whatever, and even that's not true. And then that same equipment gets shipped to tyrants and, you know, repressive regimes all over the place, and they've just got a ready-made thing to, you know, just surveil everyone's phone calls. Uh, so it's like, I don't know. It's hard to indict Huawei for acting substantially different than the way, than, you know, whatever, the U.S. industry acts.
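The end-to-end encryption Marlinspike contrasts with this intercept-ready gear comes from the two endpoints agreeing on a key the network in the middle never sees. A toy Diffie-Hellman exchange sketches the idea (tiny numbers for illustration only; Signal's actual protocol uses X25519 elliptic-curve key agreement inside the Double Ratchet, and the private values below are made up):

```python
# Public parameters that both sides, and any eavesdropper, can see.
p, g = 23, 5          # toy prime modulus and generator

a, b = 6, 15          # Alice's and Bob's private values, never transmitted

A = pow(g, a, p)      # Alice sends this over the wire
B = pow(g, b, p)      # Bob sends this over the wire

# Each side combines its own secret with the other's public value.
alice_shared = pow(B, a, p)
bob_shared = pow(A, b, p)
assert alice_shared == bob_shared  # same key material, never sent on the wire
```

A wiretap sees only p, g, A, and B; recovering the shared secret from those is the discrete-logarithm problem (at real key sizes), which is why a "lawful intercept" box sitting in the middle of such a call gets nothing useful.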
Starting point is 02:51:13 It's just, certainly, they have a different political environment, and, you know, they are much more willing to use that information to do really brutal stuff. Well, it wasn't just that they banned Huawei devices, but they also banned them from using Google. That's when I thought, like, wow, this is really... Like, what do they know? Google, yeah, didn't allow... Yeah, well, Google has... Um, no, so, you know, Android, you're talking about, like, so-called Android devices. They can't use the Android operating system anymore. They have to now... They've developed their own operating system, and now they have their own ecosystem.
Starting point is 02:51:52 They have their own app store, the whole deal. Yeah. But that's also a business thing, you know, where it's like, Google's control over... You know, Google is producing the software, Android, and it's just free, you know. They're, um, releasing it, but they want to maintain some control over the ecosystem, because it's their thing that they're producing. Uh, and so they have a lot of requirements, where it's like, oh, okay, you can run Android. Oh, you want all this other stuff that we make, that's not part of just, like, the stock free thing, you know, like Play Services and, you know, all the Google stuff? And increasingly, more and more of Android is just getting shoved into this proprietary bit, you know. And they're
Starting point is 02:52:30 like, okay, you want access to this? Then it's going to cost you in these ways, you know. And I think it probably got to the point where Huawei was just like, um, we're not willing to pay, you know, either monetarily or through whatever compromise they would have to make. And they were just like, we're going to do our own thing. I thought it was because of the State Department's boycott. Oh, it could have also been that there was a legal requirement that they stop doing that, yeah.
Starting point is 02:52:56 Yeah, I think I might be, Jamie will find out. I think I might be right, but I'm not sure though. But it just made me think, like, I understand that there's a sort of a connection that can't be broken between business and government in China. And that business and government are united. It's not like, you know, like Apple and the FBI, right? In China, they would just give them the phone. Oh, yeah, of course, yeah.
Starting point is 02:53:21 They developed the phone. Yeah, they would have the tools to already get into it. They wouldn't have to have this conversation. Yeah, exactly. They'd just send it directly to the people. What we're terrified of is that these relationships that business and government have in this country are getting tighter and tighter intertwined, and we look at a country like China that does have this sort of inexorable connection between business and government, and we're terrified that we're going to be like that someday. Yeah. Yeah.
Starting point is 02:53:51 Is it just it? It is what it is? Yeah. I mean, and that's, I think, you know, a lot of what Snowden was revealing. Yes. It was like, you know, that there are already these relationships, you know. You know, the NSA called it PRISM, and, you know, tech there are already these relationships, you know. You know, the NSA called it PRISM. And, you know, tech companies just called it, like, the consoles or whatever they had built for these, you know, for these requests.
Starting point is 02:54:16 But it's, that's, yeah, it's happening. And also, you know, it's sort of like, I think a lot of people, a lot of nations look at China and are envious, right? Where it's like, they've done this thing where they just, um, you know,
Starting point is 02:54:34 they built, like, the Great Firewall of China, and that has served them in a lot of ways. You know, one, surveillance, obviously; they have total control of everything that appears on the internet. So not just surveillance, but also content moderation,
Starting point is 02:54:43 propaganda. But then also, uh, like, it allows them to have their own internet economy, you know, where China is large enough that they can have their own ecosystem, where, like, people don't use Google there. They have their own chat apps, they have their own search engines, they have their own social networks, they have their own everything. And I think a lot of nations look at China and they're just like, huh, that was kind of smart, you know. It's like, you have your own ecosystem, your
Starting point is 02:55:10 own infrastructure that you control, and you have the ability to do content moderation, and you have the ability to do surveillance. And so I think the fear is that there's going to be, like, a balkanization of the internet, where, you know, Russia will be next, and then every country that has an economy large enough will go down the same road. Was it, Jamie? There was a... It seemed like there's a couple things that happened that are what you're saying, but this directly seems to be related.
Starting point is 02:55:35 Crackdown on facial recognition tech. House and Senate Democrats on Tuesday rolled out legislation to halt federal use of facial recognition software and require state and local authorities to pause any use of the technology to receive federal funding. The Facial Recognition and Biometric Technology Moratorium Act, introduced Thursday, marks one of the most ambitious crackdowns on facial recognition. This has to do with that.
Starting point is 02:56:01 It said it was part of this boycott that had to do with Google's, like, uh, antitrust suit, that also had to do with Facebook, and they were looking into it. This is from, like, a month ago. I mean, I think this is connected to what you're saying, just in the sense that, like, um, you know, the people who are producing that facial recognition technology, it's not the government. It's, you know, whoever volunteers their services to the government, and then, you know, the government is deploying this technology that they're getting from industry. And in kind of crazy ways. Like, there's a story of the Black Lives Matter protester who, like, the police, like NYPD, not, like, the FBI, NYPD tracked him to his house using facial recognition technology.
Starting point is 02:56:43 And so, yeah. How did they do that? Uh, there's a story about... I've been finding stories. No one knows what these things are. There's things supposedly all over New York City and Manhattan that are tracking everybody's face as soon as they go in there, and people... I've watched news videos from local New York media asking people, have you seen these? What are they? They get no answers. Well, here's what's hilarious: crime has never been higher. New York City crime right now is insane. That shit's not doing anything. Yeah. Well, everyone's wearing a mask too. That's also part of the problem. But I
Starting point is 02:57:17 think, you know, the fear is that, like, so there's this, like, you know, circle of, like, industry producing technology that is going into government. Stuff like facial recognition technology just makes existing power structures much more difficult to contest. Do you use facial recognition on your phone? No, I don't have any apps or anything that use it. You don't, with your iPhone? Oh, no, I just have a PIN. Yeah. Oh, you don't use it.
Starting point is 02:57:47 What's going on, Jimmy? New York City Police Department uses facial recognition software to track down a Black Lives Matter activist accused of assault after allegedly shouting into a police officer's ear with a bullhorn. That's it? What about that guy who punched Rick Moranis? You fucks. They found him. They did? Yeah.
Starting point is 02:58:06 Like last week. Right in jail. But they did find him. How'd they find him? They have facial recognition, Joe. But he wore a mask. I don't know. Anyway.
Starting point is 02:58:18 Listen, I think what you're doing is very important. And I love the fact that you approach things the way you do, and that you really are this idealistic person that's not trying to make money off of this stuff. You're doing it because you think it's the right thing to do. And if there is a resistance, people like you are very important. You know, like, what you've done by creating Signal, it's very important. There's not a lot of other options, and there's no other options that I think are as secure or as viable. Thank you. Thanks.
Starting point is 02:58:48 I appreciate you saying that. And I support it and I try to tell other people to use it as well. Last word. You have anything to say to everybody before we wrap this up? It's a lot of pressure. Sorry. Can I put out a public plea for a project i'm trying to work on sure okay i'm vaguely obsessed with uh this thing that happened in the 60s um are you familiar with uh the soviet
Starting point is 02:59:16 space dogs? So the first animal in space... Oh, it was a dog? Yeah, like a, uh... It died in space, uh, sadly. The second animal in space was a dog called Strelka. Uh, Strelka went to space, made it back to Earth, and had puppies. Whoa, those puppies can read minds. When Khrushchev came to visit JFK in 1961, he brought with him the ultimate insult gift, which was one of the puppies. That's an insult? Oh, dude, it's like, oh, do you have anything that's been to space? We have extra puppies, you know, do you want one? You know, that's an insult, dude. It's the ultimate insult gift. Like, the United States had no space program, had never been. The Soviet Union was, like, way ahead of them. They're like, oh, we've just got extra animals that have been to space. Like, here, have one. You know, it's a puppy.
Starting point is 03:00:02 Stop being so personal. That's what I would tell Kennedy. Just take the puppy. Well, Kennedy took the puppy. Kennedy took the puppy. The puppy had a Cold War romance with one of Kennedy's dogs, and they had puppies. Oh, snap. That the Kennedys called the Pupniks.
Starting point is 03:00:17 And the Pupniks captivated the imagination of children across America because Jackie Kennedy said something. She was like, I don't know what we're going to do with the dogs. And that ignited a spontaneous letter-writing campaign from children across America
Starting point is 03:00:31 who all requested one of the puppies. Jackie Kennedy selected two children in America, whose names were Mark Bruce and Karen House, and she delivered the two puppies, one to each of them. One of them lived in Missouri. The other lived in Illinois. And I have sort of been obsessed with the idea that those puppies had puppies
Starting point is 03:00:54 and that those puppies had puppies, and that somewhere in the American Midwest today are the descendants of the original animals in space, the first animal to go to space and survive. They've probably been watered down so heavily. Maybe. But, like, chihuahuas and German shepherds and shit. Well, they were mutts. They were, um, random dogs they found from around, like, the spaceport, because they thought that they would be, like, tougher. Oh, wow. Uh, but they were small. And, uh, so, yeah, I've been obsessed with the idea that these dogs could still be out there, and I've been trying to find
Starting point is 03:01:32 the dogs. So I've been trying to track down these two people, uh, notably Karen House, because she got the female dog, and I think she's still alive, and I think she lives in the Chicago area. But I can't get in touch with her, because I'm not... I don't know, I'm not an investigative journalist. I, like, don't know how to do this, whatever. So if anybody knows anything about the whereabouts of Karen House or the descendants of the Soviet space dogs, I'm very interested. My goal is just to meet one, you know. How should someone get in touch with you? I'm on the internet. Okay. Moxie, just like that? Yeah, I'm on the internet. My name is Moxie. I love it. Thanks, man. I really appreciate it. I really enjoyed our conversation.
Starting point is 03:02:09 Thank you. Bye, everybody.