Pirate Wires - TRAPPED at Burning Man! & Elon Musk vs. ADL | PIRATE WIRES EP# 13 🏴‍☠️

Episode Date: September 8, 2023

EPISODE THIRTEEN: In this episode, Liv Boeree joins the show to give us an on-the-ground, in-depth report about what really happened at Burning Man. People trapped? Many deaths? EBOLA?! Or maybe none of that at all. We'll also get into Elon Musk suing the ADL and the history of the group. Then we discuss Time Magazine's ridiculous cover of the top 100 in AI, and Solana & Liv get into a debate around AI.

Featuring Mike Solana, Brandon Gorrell, River Page, Liv Boeree

Subscribe to Pirate Wires: https://www.piratewires.com/
Topics Discussed: https://www.piratewires.com/p/rage-against-the-wooden-man
Pirate Wires Twitter: https://twitter.com/PirateWires
Mike Twitter: https://twitter.com/micsolana
Brandon Twitter: https://twitter.com/brandongorrell
River Twitter: https://twitter.com/river_is_nice
Liv Twitter: https://twitter.com/Liv_Boeree

TIMESTAMPS:
0:00 - Intro
0:35 - Welcome Liv To The Pod!
1:10 - What ACTUALLY Happened At Burning Man?
34:10 - Elon Musk vs. ADL
54:45 - Time Magazine Top 100 In AI Cover
1:00:00 - Solana And Liv Debate AI
1:22:10 - Follow Liv On Social Media! See You Next Week!

Pirate Wires Podcast Every Friday

Transcript
Starting point is 00:00:00 I was told there were men in hazmat suits there quarantining the entire playa. Potentially like many, many, many people were going to die. What was your experience on the ground? I think we had the most insane party I've ever had. We partied, just like danced for eight hours straight in one of our tents. If anything, I would say it became even more fun. He was at my camp. He was at your camp? Yeah.
Starting point is 00:00:33 Didn't have Ebola. Did not have Ebola. Welcome back to the pod. Today, we've got special guest Liv Burie, a former professional poker player, current host of the Win-Win Pod. And I guess I can say just ex-risk researcher, philanthropist, all around fun person on Twitter. We met online a couple of years ago, like all of my best friends. I think everyone here in this chat, I met online first, pretty much. Maybe Brandon. No, I think Brandon, I saw online before I met him. And we've got around today. We're going to talk about a lot of stuff. We're going to talk about the ADL versus Elon
Starting point is 00:01:11 stuff. We're going to talk about the Times AI 100 list, but we're also going to talk about Burning Man and Liv was there. I've been there. I went there years ago, back in 2015. I have a lot of thoughts on it, but that is where we're gonna start because live i was surprised to see i was i think monday when you tweeted and um i was shocked to see that you were alive um you had not you had not gotten lost in the mud you had not starved to death you had not been eaten by giant fairy shrimp you had not had your skin burned off by acid mud. Didn't have Ebola. Did not have Ebola, which was shocking to me because I was told that there were men
Starting point is 00:01:55 in hazmat suits there quarantining the entire playa. What was your experience on the ground? First of all, is there any truth? While you were gone, let's say Saturday morning about Friday night, Saturday morning, a meme sort of flourished. And it was multi-partisan. You saw it on the right. You saw it on the left. You saw people in the media talking about it. You saw regular-ass normie people talking about it. It was on the regular local news. It's kind of where it ended up by midweekend. And there was a sense that you guys were in the middle of an actual apocalypse level scenario, potentially many, many, many people were going to die. That was kind of what we were seeing. What was it like on the ground? And I'm wondering if there was any truth to any of that sort of leading up to the i would say like crazy misinformation this area um yeah so from my perspective uh yes the the city definitely
Starting point is 00:02:57 ground to a halt um in terms of you couldn't you know the way people usually get around a burning man is via bike um or on like an art car less than sort of three percent of people are walking um and you could not ride a bike on the mud the mud is insanely you know so it's usually on this this like dusty plier which is this very like baked hard completely flat surface but when it gets water added to it it's like this like incredibly fine silt that turns into um glue for want of a better word uh so it did ground the city to a halt but in terms of like the the atmosphere there if anything i would say it became even more fun because like the whole ethos of burning man is um about like expecting the unexpected and you know making the best of of strange situations which this certainly was and most crucially like you have to be self you know as
Starting point is 00:03:56 self-reliant as possible but also bring excess whether it is your excess food excess expertise effort and so on and so people you know just like turned up what they normally do which is like making sure that everyone's okay and taking care of each other um and you know so we couldn't go out and party like in the plier like you normally do so we just had the most insane on the friday night when the mud got absolutely like like at its worst and the rain was coming down really heavily i think we had like a couple of inches of rain. I think we had the most insane party I've ever had. We partied, just like danced for eight hours straight in one of our tents. But friends even hiked across Playa. A friend of mine hiked over for an hour and a half through the thick mud to get
Starting point is 00:04:41 to the... She figured out we'd probably be partying and came over so yes the city ground to a halt in a conventional sense but still burning man carried on it was just like generally more localized and there was no one on bikes um and then in terms of like was this you know if the rain had continued for like six days which would be such an unprecedented event that'd be like a like four sigma event something like that then maybe shit would start to get a bit real because then genuinely people would start running out of supplies but most people what people what people don't seem to understand is that most people weren't planning on leaving till monday anyway sunday or monday
Starting point is 00:05:16 and this happened on friday so it wasn't like there was suddenly a shortage um one of the most astonishing things about burning man is that you get 70,000 or 80,000 people together in incredibly dense, inhospitable conditions, whether it's rain or shine, for a week. And there's all kinds of altered minds and all sorts of stuff going on. And I have never, in my eight years going, seen a single act of violence like not one i've seen people bicker you know sure like tensions get a little frayed now and then but like it's i feel safer in that city than i do in any city on earth walking around as a woman and you know walking around as anyone you know and and nothing in that regard changed so it was yes the was written, you know, and don't get me wrong, some media were actually quite accurate. They were like, yeah, like there's been a load of mud. And
Starting point is 00:06:10 so people can't really move around right now. And that was the extent of it. Those media did a great job, but there was so much hype given like it's, it's apocalyptic out there, diseases spreading, blah, blah, blah. And it was just all bullshit. I think when the news started to break for the first time, it felt like a lot of people were talking about this who really had no sense of what Burning Man was because your description of the rain and how people celebrated and it was just fun and actually people saw it as a challenge,
Starting point is 00:06:37 they wanted to meet it and whatnot. I think for people who have never been there or who really don't know anything about it, it could sound in maybe two different ways. One, like you're making that up it couldn't possibly be like that or two like uh you know you're being a little bit precious about like you like these sparkle sparkly like perfect like environmentally conscious people who are like do-gooders or whatever um but it just is actually facts like this is a group of people who are looking for
Starting point is 00:07:03 first of all so many of these people are looking for, first of all, so many of these people are looking for an interesting challenge. And I remember before I went, this was a huge part of it that was drilled into me was, you know, the preparation for this, you know, you're going out there, the environment's actually quite extreme. Self-reliance is prized. People who are building shit, that's prized. When I first heard about it, I was in my early 20s. Burning Man is not like a music festival. I think people think of Burning Man and they're like, oh, people are out there doing drugs, dancing, and it's like Diplo is performing and whatnot, but there are no headliners. It's not even allowed. You don't have musicians billing their performances or whatnot.
Starting point is 00:07:36 Some go, like Diplo went to have fun and probably was performing in an art car or whatever, but- He was at my camp. Was that your camp? Yeah. But he was not like, you're not like a starred, like billed guest or whatever. That's not what Burning Man is. Burning Man, I don't even know if you could say
Starting point is 00:07:54 it's an art festival, it's a camp out. It's a very radical sort of subversive type thing that started 1986, beaches of San Francisco. The kind of first version of this were just people celebrating a solstice, the summer solstice, and they lit a giant sort of wooden man on fire. 20 people were there. This group, the Cacophony Society, this is a San Francisco area-based group. They're sort of dedicated to like mischief making and interesting pranks, or they were. Super, super countercultural group. They were actually responsible for the first SantaCon. This is before it became a
Starting point is 00:08:29 frat boy thing. It was just, John Law described it to me as, we got a bunch of Santas out on the Golden Gate Bridge to freak people out. And that's just like, wouldn't it be weird, he said, if you were driving down the Golden Gate Bridge and there were just 50 Santa Clauses standing there? That would be fucking weird. And it was, and that's how it started. And the idea was they took it to the desert and this was going to be from the beginning. It was a kind of synthesis of technology, of art. It was just like smart, strange, like hippie culture and punk culture coming together and doing something quite like radical, which was building a temporary city in the middle of the desert from nothing and then
Starting point is 00:09:07 dissembling it and vanishing without leaving a trace. And that was what was compelling to me as a kid in my early 20s when I first heard about it, just because we can't build anything today at all in the real world. And so how could you build? And if you see some of these pictures of Burning Man structures and the actual layout of the city and the organization there, it is, like you said, it is a 70 or 80,000 person city. And that's just fucking cool. And the kind of people who are attracted to that tend to be a little equipped. They also tend to be kind of radical and crazy. And I was thinking, man, I bet it it is really really fucking fun right now actually i think everyone is going to look back on this and this is going to be you know their their favorite
Starting point is 00:09:50 burning man first before i because i want to talk about some of the reaction that was on social but like my take on you know burning is that you've been there you said i think you say you were there eight times now yeah so you're way more into this than i am is that kind of roughly how you would describe burning man uh what have i missed maybe those that you could talk about which i didn't yeah i mean the main what's so interesting about it is that basically it's as it goes as hard as possible on like allowing for freedom as it can without falling completely into anarchy and it's like a really like tough dance to to have with any kind of society, right? It's like, what is the minimum number of rules that you can have that maximize
Starting point is 00:10:31 freedom without falling into complete chaos and anarchy? And it feels to me, at least for something of this size and in this environment, that they figured that out. They have basically 10 rules. They're not even rules, they're principles. I won't list them all, but things like- Inclusivity. Yeah, inclusivity. Everyone is welcome, whatever their creed, color, background, beliefs, you name it. And it truly is diverse. That's the thing. People are like, oh, it's all hippies or it's all rich people. No, it is truly... I mean, it tries as hard as it can to be as diverse as possible in the highest
Starting point is 00:11:07 sense of the term diversity as well. Ideological diversity, because you mentioned a good point. There's a lot of kumbaya hippie types, but there's also a bunch of fuck your burn, fuck you, fault of the earth, proper punks who will come and like, you know, basically like fuck up your shit a little bit. And I love that. I love that tension between those two, you know, those two sort of categories and everything in between. But then it's also stuff like radical self-reliance, you know, don't expect someone to get you out of trouble, even though probably someone will, because people are kind, you know, bring, you know, don't expect to just leech off the community. Bring surplus so that you can help someone else out and someone else will help you out.
Starting point is 00:11:49 There's no buying of anything. There's no bartering either. It's all done on gifting and surplus. Which is another, I want to just double click on that really quick because this is the kind of thing, when I went to Burning Man, I really expected to hate everybody there. I was going because I thought the history was interesting and I wanted to see a temporary city. That idea was enough and I just really wanted to see that.
Starting point is 00:12:12 And I was convinced that I was going to go there and some asshole was going to be like, sorry, man, we don't believe in money here. And here's a fucking soda and you got to give me something. It was actually, it was uncomfortable for me for about a half a day in which you're walking through the city and people are just handing you things. Like food, lunches, like whatever, free henna tattoos. Come in here and let's do a class on, I was next to a science camp back then called the Phage, run by, or partly run by a friend of mine who's a molecular biologist. Or a synthetic biologist, our synthetic biologist i'm sorry and um it's for real people are bringing stuff to the to this experience just to give it
Starting point is 00:12:52 to other people and it's just i can't explain how just it was nice and i i hate to sound like it's like i feel like i'm gonna fuck up my brand over here by complimenting this kind of behavior but like it was like really quite beautiful. And, um, everyone was primed for just interesting human connection with very, very strange, very,
Starting point is 00:13:12 very radical people, uh, with lots of interesting ideas. It was a, a idea festival, I think almost before all the other stuff. Yeah, I would completely agree. Uh,
Starting point is 00:13:21 it's, it's, um, and I mean, that's not to say that it doesn't come without tensions. One of the things it's sort of like, if you can take the community in aggregate is trying to do is resist falling into any kind of monoculture. Because I think that's part of the problem. When something becomes too kumbayari or
Starting point is 00:13:45 too um fuck your burn or whatever then it doesn't work and it's just like it finds a way to like coexist with these tensions in a way that something like you know online you know twitter does not you know we fall into for some reason diversity makes us go into culture wars there but in burning man diversity literally makes people stronger. And yeah, but anyway, what was it? Where was it before? Like, yeah, these 10 principles. Yeah. I mean, I don't want to list them all. One of them is like, leave no trace. Don't leave trash, basically. You know, everything you bring in has to bring out. There is no one who's going to pick your trash up for you. What else? Decommodification. That's an important one you know you touched on it basically do not come to like pimp your pimp your company here like that is one thing that everyone will
Starting point is 00:14:32 slam down because it's actually sort of saying like the commodification of self that you see on social media by that that i mean you know selling yourself online you know i'm taking lots of instagram photos and i'm sharing them with the world, but really I want attention and I want to earn likes and faves and comments and things like this. You see that online and I think that's the exposure that a lot of people have to Burning Man is just like this relentless, I think they're called like sparkle ponies on the planet. Like we're just like in the showy outfits and they're like, you know, constant posts like live, laugh, loving my way through Burning Man or whatever, really frowned upon. Like you're not even supposed to add your former out really at burning man it's it's like very yeah close to like there are like there aren't many rules but just culturally you feel this real strong resistance
Starting point is 00:15:12 to it and yeah i have to say it's one i struggle with myself like for so for example this year it was the first time i ever did anything where i like i decided i was gonna do two talks at it and like my ego and i sort of like went back and forth and it's like, aren't you down like self-promoting if you advertise that you're going to do these talks? And I'm like, yeah, I kind of am. So maybe I shouldn't. But then at the same time, like,
Starting point is 00:15:32 I know that I've got like the shit I want to talk about is like, you know, it's like my existential risk stuff and so on. And it really matters. And I think the Bunnyman community would benefit from it. So it's like, how do I navigate that? And like, and at the same time like i take you know i do get a couple of sick photos on the plier like maybe i shouldn't post them but like i want to in i want as many people to come and get to experience this experience for how do you do
Starting point is 00:15:55 that without like advertising it and yeah i i don't know what the right what the right path is there and i maybe i've probably myself like fallen too much into like the like posting of pictures um it's it's interesting but that's again like it's part of this tension that we have to figure out how to navigate but all of this stuff i really want i want to the strangeness of the event and the the way that it sort of doesn't track ideologically to any one thing and yet does mean something in general, it makes it really hard to share with people who have not been there. And it makes it really hard to attack in a way that is coherent. Now, if you've not been there and you see the attacks, it doesn't really matter because you don't know that it's incoherent. But while you guys were there, what I saw,
Starting point is 00:16:39 sort of stranger than the misinformation was, I don't want to say it's even strange at this point because I saw it with the submersible. We've seen this for a while. When people who are perceived to be better off in some way, experience something calamitous, there is a very loud contingent of people online who want to celebrate. I feel like there was an almost national holiday following the implosion of the submersible with nonstop, just hardcore socialists excited that billionaires died. This one was not quite so bad because I think one person died, which was framed as like, of course it was a disaster. Someone dies almost every year. This is an 80,000 person city. Yeah.
Starting point is 00:17:19 People die. At Burning Man, people died at all sorts of festivals. It is sad. It's like people are doing drugs. They're dehydrated. Things happen. If you take any random sample of 80,000 people on Earth, statistically, it's not that unlikely that one of them dies during that time period, during one week. Yes. That's part of life. So the reaction, though, you had right-wing people. So you had a lot of people on the right,
Starting point is 00:17:46 the sort of trad right, let's say, who were looking at this and they were saying, and this is kind of maybe what you expect, and this is what you would have gotten 30 years ago. You have people looking at that and saying, here are these devil worshipers in this kind of Sodom and Gomorrah-esque bacchanalia. Of course, God is smiting themiting them down one that's like a very brute basic one um two i saw like a generational one a lot of older um gen x people being like these millennials have reached middle age and they want everyone to care about this but it doesn't matter it's not cool and i'm like has it ever been cool like when has burning man ever really been a part of culture even like mainstream like subversive culture it's kind of an outlier. But then three,
Starting point is 00:18:26 you have your hard left and they're like, these rich people, right? All of a sudden, Burning Man is 80,000 venture capitalists. If you were just on Twitter, that is sort of, you're like, it's pagan venture capitalists. That's what everyone there is. And that was the commonality with everyone. I guess there was one commonality was there's a perception that everyone at Burning Man is rich. Crazy, because the Burning Man for me costs less than a trip to Disney World. It is a tent that I stayed. A yurt, actually, cardboard covered in aluminum foil, food, water, some costumes, and a pass for my Jeep and a ticket, which is like 500 bucks for the week, 550, I think, or something like that. So everyone's kind of coming down on this thing,
Starting point is 00:19:13 like projecting the thing that they hate most ideologically in the world onto this event that really has nothing to do with it or them or even mainstream culture at all it's its own sort of unique outlier what if all of you guys if i mean anyone jump in here like live what do you make of that uh that projection that weird thing that's happening where people are kind of throwing their they're casting their enemies so to speak onto this thing and then and then not only did they make up the person at burning, they then made up an event that is hurting that person and then they celebrated it. I mean, that's a fucking crazy cycle right there. Yeah.
Starting point is 00:19:50 I mean, I think it's part of like this cultural sickness that we have, which is schadenfreude. You know, it's wanting to feel pleasure at other people's misfortune, though in this case this wasn't even a misfortune they had to invent there was a misfortune and then they invented whatever their personal like belief system is that would enable to like enable to like point out what was wrong with it in the first place and why it shouldn't have existed um and frankly you know i don't know maybe this is wishful thinking but I'd like to actually think that this will make the event stronger because the people who are able to see through the nonsense will now be not unfortunately not everyone can go every year right like there is a hard cap of like 80 000 um these core principles are so valid like also valuable to society and so much the antithesis of this like you know not to you know coin a phrase but like it's such a win-win environment where somehow through adversity and all these differences, the sum of the whole is greater than the parts. And that's the exact opposite of this
Starting point is 00:21:12 very win-lose mindset that seems to have pervaded the online world and to an extent the real world as well. One of the most dangerous things we can have in the world is a false sense of zero sumness. There is plenty of scarcity out there, don't get me wrong. And there are truly win-lose dynamics that we have to figure out how to navigate. But what is even worse is when we falsely think something is win-lose when it's not. And Burning Man, because it is such a win-win environment, I think it's almost like there's this metaphysical entity that wants things to be win-lose and hostile that is collectively using all of the pathologies of all these very different groups of people to try and attack this thing that is threatening the abundant reality of these Burning Man principles. That sounds very hippie-ish. I've
Starting point is 00:22:02 probably still got a lot of it in my system. my system but um it it yeah it does make me sad but honestly i think it's gonna just make the event better i wonder if you can you said there are very important principles and i i agree with you in context i i felt um i know i got so much out of the experience even just honestly being off my phone for a week was a big part of that. Being isolated from, being quarantined from the internet and being forced to just be with human beings for seven days was very valuable. Back when I first started at Founders, it was like 2011, there was a guy, Luke Nosek on the team. He was a big burner. And he also, I mean, I came in through the Seasetting Institute. So this was like the primordial goof that led to, in terms of
Starting point is 00:22:51 intellectual goof, that led to, I guess, the Charter Cities movement and things like this that we're seeing today. It's these people. It's all of these people who, they want to build new places and new ways of doing things. And they're really experimental with community and whatnot. Luke, every time he would come back from burning man man like how do we bring that back here like what do we do here do what do you can you maybe you can't i think about this because uh in the context of like mars which was a really compelling idea to me and still is for so many reasons but it's a blank slate right burning man is sort of that it's a desert so you know it's it's a blank slate, right? Burning Man is sort of that. It's a desert. So it's really impressive that you can build so much, a city there for a week, or I guess there's a little bit of there's plenty before that, but most of the heavy lifting is done the week before
Starting point is 00:23:34 that it's gone. But maybe it's just, you need to be starting from nothing and it needs to be temporary or what? I mean, how much can you can you bring into the into the real world yeah it's difficult because like the real world already has these like very like entrenched starting conditions um i don't know like just operating from a principle of like like kindness that's one thing that i've found i've got this like um i know i noticed i as soon as i sort of logged back online like i was and there was all this like just like it was like a wall of negativity that i'd become that's vitriol man like like just like inoculated to and just like oh yeah this is normal and it like hit me i was like whoa like this is this is actually what the online world is like it doesn't
Starting point is 00:24:26 i think that was my because i just know the like you're talking about the nicest most open people ever who are too nice i was the asshole i feel in that group and and people were nice they were just fucking nice and so to see especially on tikt, just post after post after post of just hate. And it's not just for that, right? I'm on TikTok for work purposes. Okay. I'm on the Chinese spy app for a reason. I got to keep up with the kids. And it's like a hateful, it's just as hateful as Twitter, if not more. I don't even hateful is maybe the wrong word. It's like hater. It's like, who can we be fucking mad at? Who can we tear down? You want to find some weird social justice reason for it
Starting point is 00:25:05 it's it's all just based on this i think what you said this zero sum idea there's a limited amount of things and so we need to we need to take people down or something right in order like in order to to change the world you have to destroy first without you know instead of building it's it's like that's like destruction is the way to change as opposed to i just feel that those yes yes for sure and it's like those people they just want to burn shit down and i think that i'm i'm just tired of pretending that they don't so uh guys last thoughts on brain man before we move on i've got a question for for the two burners in the chat live in salon i would say my'm a burner. I was a guest happily. I will give you one day. You're a burner. is closer to the sort of human baseline? Which one is closer to... What if we took away the internet completely? Would the world be more like Burning Man or would the world still be
Starting point is 00:26:13 more like what we see on Twitter? I have a quick answer and then I'm going to give it to Liv. My sense is that it depends on what you mean by the world. When there's any group of people who have a lot in common ideologically, so Burning Men is a place where people are very different and diverse in lots of ways, but they're the same in one very important way, which is they're going to a place open. Specifically, they know that it is an open place meant for this kind of freedom orientation and meeting lots of different kinds of people went up. People are really primed for that.
Starting point is 00:26:50 So I think it's just I think if you have the values going in, it's very natural to be, you know, positive. I think when you have a collapse of values, there's a real struggle to regain that. And that is what is manifesting maybe as the vitriol. Yeah, I would agree. The other thing I would say is what the internet does that no other form of human interaction, certainly not Burning Man does, is that it reduces the amount of information transfer going on between humans, right?
Starting point is 00:27:26 By definition, I mean, this is as high fidelity as we get. You know, we're talking through a video link and so on. We can see each other. We can see each other's facial expressions. We can hear the tone of voice and so on. We say stuff. We can hear the ums and the ahs. Even that, you're more likely to have an argument with a friend over a video call than you are
Starting point is 00:27:44 probably face-to-face because there's even more information going on when you're more likely to have an argument with a friend over a video call than you are probably face to face because there's even more information going on when you're physically in and you're able to touch someone. Who knows what we can even smell biochemically, all this kind of stuff. Especially through the written form, there is just such a wealth of information being lost in every interaction that it invariably reduces the human experience down to this singular, very collapsed state, which by definition is dehumanizing. And when you dehumanize someone, that's when you get the whack shit happen. Like the conflicts gather. We tend to become more tribal.
Starting point is 00:28:24 We look for other ways because we can't physically reach out and hold someone's hand. We look for other means of connection. And it just gives space to this terrible tribalism to rise. And yeah, so that's why it's a great question because I go back and forth. I'm like, would we be better if the internet disappeared? And I don't know. I don't know the answer
Starting point is 00:28:47 it's crazy that we're asking that 10 years ago that was an unthinkable question but I find myself for the first time in my life actually having to sit with there are downsides obviously tremendous upsides but it's not just this utopian
Starting point is 00:29:03 everyone wins, everything is better scenario. We're losing a lot because of the internet. And I might make a lot of you mad at me for saying this, but I do think a big part of it is that we haven't figured out a model of capitalism that fits well with the internet. In many ways, yes, it's allowed. It's great. It's democratized the ability for people to build stuff and so on, which is fantastic. Capital know capital gets spread out more blah blah blah but at the same time because again like measuring you know a pnl like like uh you know quarterly earnings report and these kind of things they are again they are a very um monodimensional measure of like whether
Starting point is 00:29:42 something is good or not. Did we hit our quarterly earnings report or whatever? It misses out all these other values. A social media network isn't actually optimizing for: are its users happy? It's optimizing for: are its users coming back and clicking refresh all the time? And those things are orthogonal, or if anything, they're actually anti-correlated. What makes someone keep coming back and clicking for more seems to be things like outrage and negative emotion; those tend to create user engagement in the current setup of social media and mainstream media, rather than what actually makes people smarter, happier, wiser, kinder, etc.
Starting point is 00:30:28 And so we need to figure out a way of measuring the success of companies on the internet that isn't just simply a narrow bottom-line metric, or at least we need to internalize all these negative externalities. Like carbon credits, but for happiness or something? Effectively, yeah, exactly, something like that. Awesome. I want to know how Liv got out, though. Did you see people having trouble getting out? Because that was a whole part of the narrative: they're never getting out. I saw some pictures. Anyone who tried to leave on either the Friday or the Saturday had trouble getting out, right? Because the mud was
Starting point is 00:31:12 so thick. We were originally planning on leaving on Sunday or Monday. We were going to drive out with a friend in their RV, but it rained again on Sunday morning, and in the end we decided to hike out on Sunday, because our camp leader was like, look, anyone who is up for hiking out, I think I would prefer you to, because we don't know when the rain's going to stop. So if you're up to the walk and you can carry as much as you can, then please make the hike out. So we decided to do that. In retrospect, I wish I'd stayed,
Starting point is 00:31:45 because the rain stopped. I wanted to ride out with the people we rode in with. But in the end, basically, unless you desperately needed to leave on Friday or Saturday, which, as I said, 90-plus percent of people don't leave till Sunday anyway, it was like nothing had changed. But no, you couldn't easily leave. I saw RVs stuck in the mud, etc. If you didn't have a big chunky four-wheel drive, you weren't getting out when the mud was thick. And when you say you hiked out, you hiked like five miles to a road, or something like that? Yeah, our camp was on like the furthest point from the road. You weren't actually with Diplo, were you? No, he hiked. The now-famous video was him and Chris Rock. And yeah, these people said that he got a private, like a private
Starting point is 00:32:38 shuttle or something. He hiked like five miles to the road, and then a fan saw him and gave him a ride, they drove him to town. From what I understand, he had a show he had to be at; that's the only reason why he left. Otherwise, he was having the time of his life. And yeah, he made his flight and he got to the show. He is professional. That was the reason, because he had a professional commitment he had to get out for. The only reason we left was because our camp lead was like, look, if you wouldn't mind, just because we don't know how long it's going to be. And, you know, at some point we will need to think about food. Again, we were nowhere close to it.
Starting point is 00:33:10 And camps, that's the thing, camps want to provide for those who are maybe in shortage. Our camp did have plenty, but people in the area might have started to run low if the worst case happened, which it didn't. So, yeah. And I mean, it was fun. I liked the experience of hiking out, never done that before. But yeah, part of me wishes we had stayed on for even more days. Yeah, next year maybe there'll be another flood. And the shrimp, last question. No fairy shrimp, I wish. I love weird bugs. I didn't see any. That's too bad, man.
Starting point is 00:33:47 That was the one in a torrent of endless misinformation. That one came up and I'm like, I think that actually might be true. I think that there might be some fairy shrimp hatching from eggs. I kind of vaguely remember hearing about something like that when I was there, that this was like a potential. But so far, no hard evidence. The photos I saw were from years prior. Okay. I want to move on. Speaking of capitalism, we got to get to Elon Musk. So our boy Elon decided very publicly that he was going to sue the Anti-Defamation League. This is a phenomenal story. So River is going to give a little context
Starting point is 00:34:28 on the Anti-Defamation League in a moment. But very quickly, I'll just say, this is an NGO that has been going to Elon's Twitter's advertisers from the moment Twitter changed hands in an effort to have them leave the platform. The reason is because, and this is just one of many NGOs that have been participating in this kind of, I guess, activist effort. There is a perception among, let's just say it is the left that Elon's quote, free speech is bad for people. It is dangerous. It is going to lead, I think the real reason, to a Trump victory.
Starting point is 00:35:07 So there is an effort to punish him until he relents and caves in and adopts the same speech codes that Mark Zuckerberg has adopted and that Twitter had before he took over.
Starting point is 00:35:20 I will say just broadly, across tech, there's been a thawing of that. Things are much better than they were two years ago, but we're also not deep in an election. And my sense is, as we get closer to the election, all of the companies are going to readopt the speech codes. And there's just a question of whether or not that will happen on Twitter slash x.com. So first, before we get into the ADL versus Elon drama and sort of what it means, the NGO stuff, some background on the Anti-Defamation League.
Starting point is 00:35:49 River, go. Yeah, so the ADL, it's one of these organizations that's sort of seen as an arbiter of acceptability. You can think of others: there's GLAAD for the gay people, you have the Southern Poverty Law Center, and you have the ADL, which is a Jewish civil rights organization, but they also monitor other types of hate, as they say. But I think the perception that this is sort of a benevolent civil rights organization is not really correct if you look at their history. In 1993, it came out that the ADL had spent the last 20 years spying on American citizens, all the way from sitting U.S. congressmen to college kids, for a variety of things, the most strange of which was
Starting point is 00:36:48 people who opposed South African apartheid. So they were keeping files, thousands of them, on all sorts of people who had attended anti-apartheid protests, on congresspeople who had introduced sanctions legislation against South Africa, and they were selling this information to the South African government. What is the reason? Because in general, it is just a group that, like you said, is like GLAAD but for Jewish people. They go and police content to make sure that it's not, you know, hateful to Jewish people. That, I think, is roughly the perception of what they do. I would argue that the ADL is very mercenary, and they have traditionally had a lot of connections to foreign governments, for example,
Starting point is 00:37:37 Turkey. If you go to their website for their Courage to Care award, you'll find that it goes from 2004 to 2006, but it's missing 2005, because that's the year that they gave that award to Recep Erdogan, who's the president of Turkey. And since he's become a dictator, they've taken it off. He also denied the Armenian genocide, which is like the standard position in Turkey. And in 2007, the ADL actually lobbied, some would say, for legal reasons I'm not going to say on behalf of the Turkish government, but some people might say that, to stop a congressional
Starting point is 00:38:20 resolution recognizing the Armenian genocide. The fact that they lobbied against it is known. Their precise reasons are unknown, but people can draw their own conclusions. I think that some of it is, especially during the tenure of their old boss, Abraham Foxman, I think that it was political connections that he had abroad that was behind a lot of this, but I think there is something ideological about it. The sense that maybe, you know,
Starting point is 00:38:50 well, so maybe somehow if we recognize the Armenian genocide, it takes away from, you know, the Holocaust being this sort of unequivocal event that, you know, nothing else can be compared to. I don't know. I will say that it seems that there is broad contempt for the organization growing among Jewish, certainly writers online. Tablet has run this entire pretty robust series on all the different ways in which the group has kind of lost,
Starting point is 00:39:22 they're framing it as having lost its way. And that's kind of my read too: there used to be a group that cared a lot about the Jewish image in the American press, thinking that this was going to lead in some way to violence. And maybe it did in certain places; certainly outside of America it has.
Starting point is 00:39:42 But I think that was kind of roughly the thing: let's control the image of Jewish people in America, so there's no violence. And now it's just pretty much another left-wing NGO. At least that's the take. There is this broader kind of NGO thing that I really want to talk about. We're a country in which you have a right to say whatever you want. That's kind of a big part of America. I would say that is maybe the one thing that people
Starting point is 00:40:13 kind of roughly agree on. They don't necessarily know what that means. And so they'll say things like hate speech needs to be illegal or whatever. But if you ask them how they think about free speech, they'll say they believe in it. They just define it in different ways now. That's a recent phenomenon. I think it's becoming less popular on the left. There have been times in the past where it's been a little bit less popular on the right. I think about the Virgin Mary paintings in New York City when I was a kid that were super subversive, and left-wing artists were creating these things. You had a lot of right-wing Christian people up in arms about it.
Starting point is 00:40:44 But in general, we're kind of bought into the rough idea. Legally, the government can't do anything about speech that people don't like. And so, internet comes along, social media comes along. Now, everybody's speaking. A lot of people are saying things that other people don't like because everybody's saying something. How do you control that? It's like, what do you do? The platforms are going to have to do the censorship because the government will never do it. And the only way that you can make that happen is by pressuring them through their advertisers. And that's what we've seen from the ADL most recently. This is specifically
Starting point is 00:41:22 what Elon is charging them with. This is why he's suing them. Ironically, the Anti-Defamation League is being sued for defamation, potentially. We have to follow the lawsuit. It's still ongoing. And then we've talked previously about the Center for Countering Digital Hate, which is a sort of invented NGO that is ostensibly researching hate speech online. Again, creating fake studies, sending them to the activist press that hates the concept of free speech online under Elon specifically, and using that to scare away money and punish the platform. Elon's already said they are 60% down with American advertisers from where they were before he took over, all because of this. And it's a pretty big issue. This is a way that you create a de facto sort of censorship online.
Starting point is 00:42:06 And it's a completely unelected power. It's, I think, a very huge problem. Yeah. And the released statistics too, I can't remember if it was the Center for Countering Digital Hate or the ADL that released these. I was reading them the other day. They're saying that, oh, increased use of the N-word has risen by this much, use of slurs against gay men has risen by this much. And I'm like, okay, well, I think a lot of that might also be, like, the only people I ever see post that on Twitter are gay guys that I follow. So, I mean, I'm glad to have been liberated to be able to use that word.
Starting point is 00:42:47 One final thing, I don't want to drag him into this right now. I just had an acquaintance of mine, he's a writer, he's Jewish, and he said that he'd been seeing a lot more... he's a left-wing guy, so he's already sort of not into the Elon speech apparatus. He sort of credits Elon with a spike in anti-Semitism on Twitter. I don't know what anyone is talking about.
Starting point is 00:43:04 I don't see this stuff. And I wonder if it's because I'm not clicking shit. The algorithm has become much more TikTok-like; I will say the things that you look at or engage with even a little bit are pumped into your feed. I wonder if that's what's happening. I don't know. Do you have any thoughts on the speech question generally, maybe, or the algorithm? Yeah, I mean, it's such a tricky, tricky line to walk. Because, I mean, I actually have noticed, on my Burning Man post, I did notice one person made a comment on my partner, who is Jewish. You know, a negative comment. I was like, huh, I've not seen that before.
Starting point is 00:43:45 So I don't know, that's one data point. But at the same time, have I been looking for it? No. Do I engage with it ever? No. Although, that said, I did go and click on that user, because I was like, who the fuck is this? What the fuck is wrong with this person? So I don't know, it's incredibly hard to measure. That said, I hate the idea that anyone, whether it's an NGO, whether it's an individual who owns a platform or their team, can control what is and what isn't allowable speech. Because I'm still reeling off the back of COVID, where the degree of censorship around legitimate solutions to the problem was so insane, and there was such a monoculture about what is allowed to be
Starting point is 00:44:47 said. You couldn't discuss, for example, the fact that they were mandating vaccines on young people, where the risk-benefit was very unclear, whether the vaccines were correct or not. And I say this as a vaccinated person. But after seeing just how bad the censorship can go in an unhealthy direction, which then creates a backlash, which is even worse than the first thing, I come away from that going, okay, we shouldn't be controlling speech. But at the same time, there still somehow is genuine anti-Semitism out there. Yeah, as there is, you know. So it's like, I just don't know. I'm not denying it. Yeah, I definitely know
Starting point is 00:45:30 that it exists, and I wouldn't be surprised. I know that it exists somewhere online. I just haven't been seeing it. And I agree. This is, I think, what I'm reacting to, and what you're reacting to, which is: I know where they're trying to take this, and I don't want that, and I will have to fight against that. Specifically, the guy in charge of the ADL is in a clip where he's going through, he's defending the concept of speech controls. And he compares anti-vaxxers to anti-Semites. In the clip, he's rattling off the things that are just beyond the pale.
Starting point is 00:46:03 And it's like, see, this is what we're talking about. You're using things that we all agree are abhorrent, and then you're including things that we need to be able to debate. And of course, anti-vax, that phrase, note that word no longer means "I'm opposed to all vaccines," which, by the way, even if you were, you should be allowed to talk about that, in my opinion. I'm not, at all; I'm vaxxed myself. But you should be able to discuss that. They're talking about people who are opposed to vaccine mandates, COVID vaccine mandates. That is how the word was applied a couple of years ago. And we cannot live in a world where the stuff that's being banned is me saying, hey, wait a minute, I don't want to be forcibly vaccinated. That's terrifying to me. That is some authoritarian prison country shit.
Starting point is 00:46:47 And we got to resist it. And once you go too hard to it, I feel like it also makes things worse for the people you're ostensibly trying to protect. I mean, this is a little bit more of an extreme example, but Brazil recently passed or ruled, I guess, a judge ruled that people can be thrown into jail for saying homophobic slurs. And I'm like, I can't imagine like a worse thing for
Starting point is 00:47:11 gay people because then you're turning just like casual homophobes into political prisoners, essentially. And so that's just like feeding power to the cause. And I think like, obviously, like there's certain things that you shouldn't be able to say online, but I think once you become overzealous, um, it gives the impression that there's like this vast conspiracy against, you know, X group of like casual bigot or whatever that can then turn like, for instance, a casual anti-Semite or whatever into like a full-blown Nazi. Right. Yeah, that's the thing. They don't think about the second order effects. They just pay attention to the thing, right? It's like the short-term effect. Oh, this person said a bad thing. It's like, think about, you know, if you turn someone who is mild into basically, you know,
Starting point is 00:47:58 yeah, as you say, like a political prisoner, the backlash is going to be so much worse, because then they're genuinely going to get people who are in the right now aligning with their cause. And that's even worse. On the narrow gay example, we actually have data to support this in America, which is: for years, the last, let's say, five years, we've lived in a world that up until maybe a year ago was increasingly censorious on gay, lesbian, but really trans issues. And then the trans sort of censorship thing has had an impact on the entire rest of the alphabet, let's say. You just look at polling data on acceptance of things like gay relationships: gay marriage is down, gay relationships, the concept of gay acceptance
Starting point is 00:48:46 is down among all age groups, including young people. For someone like me, growing up, it had only gotten better every single year of my life. It is really, really a big deal that it has gotten worse, just by the numbers. I'm not saying I'm scared to be out there holding hands with my boyfriend. I'm saying that it is literally worse. We know that it's worse. We know that people's perceptions are worse, and how could they not be? My perception is worse. I look at this crazy shit, and I wasn't allowed to talk about it. I'm mad. That makes me angry. I can totally empathize with where people are coming from, but there is a second-order effect, and it is a bitch. But still, that having been said,
Starting point is 00:49:28 it's still not as scary to me as the medical stuff. And I think it's a huge problem. Of course the media takes it and runs with it: Elon's "I'm going to sue the ADL," the Los Angeles Times referred to that as the single most anti-Semitic comment by a public figure in 100 years. This is crazy, because as River pointed out in her Slack channel yesterday, Kanye West was out here supporting Adolf Hitler like a year ago. He was saying relatively nice things about him.
Starting point is 00:49:56 I think Alex Jones was uncomfortable. Making Alex Jones just like, get out of here. Also, "Adolf Hitler had a side of him." And, yeah, wasn't Hitler in the last hundred years? Exactly. Yeah.
Starting point is 00:50:09 Yeah. There was so much anti-Semitism before Hitler, too. It was like all of America was anti-Semitic. Russian pogroms. The history of anti-Semitism is real and long. And the fact that Elon's saying,
Starting point is 00:50:31 I'm going to sue the ADL, which Tablet magazine is out here attacking every single day, is the most anti-Semitic thing that's ever been said? That's not crazy. It's predictable in an information war, which is what this is.
Starting point is 00:50:44 And I think it's the same thing with Black Lives Matter, right? When these organizations use such a good branding term, they become synonymous with the actual principle. And that's what people need to realize: these organizations might be aligned with the principle sometimes, but they could be, again, either orthogonal or completely anti-correlated. It's like, how could you be against it? Of course black lives matter. So now, if you criticize the organization Black Lives Matter, all of a sudden you're a racist. It's like, no, I don't see that organization as the actual cause. Maybe that's what we should actually have a ban on. It's like,
Starting point is 00:51:20 you are not allowed to name your organization after the principle itself, or something like that. That would stop it. If we could just create very strange speech rules, right? Like, free speech is done, but we're not going to ban things; we're going to have the weirdest rules ever, and you have to kind of navigate that. I would be interested. You have to put TM after it. You're like, are you talking about BLM TM or BLM, the actual principle? That's what people don't... And it's kind of this bait-and-switch thing that these organizations do. It's like, oh, if you're against us, then you're against this cause. It's like, fuck you.
Starting point is 00:51:53 No, I'm not. I'm against what you're doing right now, but I actually agree with your cause. Yeah, I don't know. I think something about this is just very simple, which is the information war component of this that the media is engaged in. Of course, they're going to come after Elon. The LA Times is mad for a couple of reasons. One, they want money and they see social media, generally speaking, as where the money is.
Starting point is 00:52:14 This is what's leading to all the policy around the world that is forcing these companies to give publishers money. Two, the censorship stuff is a very popular opinion on the sort of press left right now, which dominates the entire media ecosystem. So they're going to go after him. They're going to say this. He is the biggest threat to them. He represents Elon, is the biggest threat to them and their cause of censorship online.
Starting point is 00:52:37 He represents the first example we have of a huge tech guy over the last five years who's really strongly, and I would say to a certain extent successfully, resisted this push. Now this... sorry, go ahead. So, I mean, that's it. I do think it is worth pointing out that I think Elon himself has too much power. He's just one person. He is far from perfect. And if he is not careful, he can go too far in a different direction, and that itself is a problem that needs to be examined. So there is some truth to the general point: too much power in any one person's hands is a problem. How do we build our systems so that they are robust to the whims of mercurial
Starting point is 00:53:26 people in general? This is what Jack said before Congress, his sort of second appearance, when he kind of re-emerged as a very polished-looking wizard. There was a Bitcoin clock behind him, and he was just laying down sort of freedom-oriented principles one after another. Someone asked, should we be able to stop you from publishing X, Y, or Z? And he said, no, you should not... I don't believe that you should have that power. I don't believe anyone, myself included, should have that power. And I think that Jack is the only person I've seen publicly really grapple with this unique power that now exists in the world, and fear and respect it. And I think he did the best that he could. And I think he really believed that
Starting point is 00:54:20 Elon would be better than him. And Jack really believed that he failed in his use of that power over the course of, especially, the election. And yeah, it's a question that is open-ended. I think things are going to get a lot worse during the election. It does bring us to the media, though, and the interesting way that they just almost transparently wage information war in the fucking cringiest ways ever. And this brings me to Time, which just released its list of the 100 most influential people in AI, artificial intelligence. Liv, you seemed like you had some thoughts on that before we started filming. What is your take?
Starting point is 00:55:02 Just roughly, you know: a very lovely-looking photo, all sorts of people, almost none of whom I recognize at all. I think there were like five that I recognized. What are your thoughts? I mean, I guess I just have a general distaste for these lists in the first place, because it's like, who are the people at Time? And don't get me wrong, I've been impressed with Time's general platforming, particularly of some of the saner arguments about AI safety, or let's call it AI awareness of risk, that perhaps other places haven't been.
Starting point is 00:55:58 That said, these lists do have real-life impact. When someone decides these are the 100 most influential people, invariably they then leave people off, or include people who just happen to have written a blog at the right time, Mark Andreessen, you know, on the list, who actually probably don't deserve to be on there. It's annoying, because that then sets the tone for the next few years. They're the ones who are going to get called on to speak at conferences or whatever. So, again, it sort of speaks to this problem that ultimately there are these centralized powers that
Starting point is 00:56:34 have real-world impacts, even though they don't necessarily have the chops to make that call. And the thing that just annoyed me with this list is that there are people on there who probably shouldn't be, and there are a lot of people missed off who should be, even in the top 10, and aren't included. Perhaps that's because they wanted to stay below ground, I don't know. But yeah, it's just this general idea that anyone knows how to rank people in something as major as AI, and also how US-centric it is. There are so many British people or foreign people that aren't included that should be. Well, I'm fine with the US-centric nature of it, but I think it's really crazy that Time thinks that they're the ones who are going to create this list. Artificial intelligence, for most of my memory, was a niche, very tight-knit group of people working on something that most thought was science fiction and borderline stupid. There are important people in that space. I, as someone who is
Starting point is 00:57:28 definitely closer to it than Time Magazine, would never even dare be so arrogant as to think that I was going to list the 100 most important people in that group. It's crazy. I wouldn't even do that for something that I do myself, which is write. It's just a tremendous arrogance. But as you mentioned, it does have an impact. This is what Time is attempting to do here. And they included some people who you would obviously have to include. Sam Altman is the one that comes to mind, right? You're running OpenAI; that's the most important AI company in the world. Demis Hassabis, I would say, is well known too. I'll put Demis above.
Starting point is 00:57:59 What was that you said? Demis Hassabis. So Demis of DeepMind is there as well. Incredibly obvious one, important one. Yes. But then there's someone who, she does not work on AI. She's not an engineer, to the best of my knowledge. Maybe she has a background in engineering; it's not what her job is now. She researches AI safety, she calls it. No, no, no, no.
Starting point is 00:58:32 She would be mad at you for saying that. She researches AI ethics; for some reason, she doesn't like "AI safety," which makes no sense to me, because... It's funny, because I saw her go off... It's all the same shit. Well, you're an AI ethicist.
Starting point is 00:58:42 She's like, where did this come from? It's like, where does it come from? You talk about AI ethics all the time. But the point is, she is someone who believes genuinely, or I don't know about genuinely, I actually don't think genuinely, but her argument that is endlessly parroted in the press is that the people working on AI right now are genuinely interested in genocide. That these things are being built in such a way that they will lead inevitably to genocide. They're white supremacists. And it sounds crazy because it is a crazy thing to say out loud, but I promise you that this is her argument. Please check out the piece.
Starting point is 00:59:15 I analyzed her entire... I watched her whole-ass stupid talk on this thing. She really believes this. This is what she does. This is a woman who quit Google. She says that she was fired. She actually resigned, and they accepted her resignation wisely. And that was kind of her claim to fame. She worked at the place that didn't do really much at all and got lapped
Starting point is 00:59:38 by OpenAI. And now she wants a seat at the table to talk about how this stuff should be regulated, which is really crazy. Places like Time elevate voices like that as a weapon against the technology industry. I know, Liv, you have very complicated views here, probably because you study x-risk and you're super of the belief that this is going to kill us all, potentially. No? Okay. I don't want to put words in your mouth, but it seems like you probably are more into strong regulation, generally speaking. Actually, I'll just give you that. What do you kind of think about that? Yeah, I mean, honestly, I
Starting point is 01:00:14 feel almost bipolar on the topic of AI. Some days, when I've been having conversations with people who are more on the accelerationist side but are actually thinking about the problems properly, I'm like, okay, we're actually going to be all right, because these guys are thinking about this. And then I'll speak to ones who are pouring money in and literally don't understand the principles at all. For example, not to pick on him, but he blocked me on Twitter after I criticized him, so I feel free to. It's Mark Andreessen, who completely missed the boat on AI and then writes one blog saying how, oh, intelligence doesn't equal control, we shouldn't be worried about it. And the arguments... he clearly doesn't understand the concept at all, and goes and gets put on the
Starting point is 01:00:57 list. And basically, right now, the actual race to develop AI is completely out of control. So that might be okay if we somehow figure out alignment in time as this wheel turns faster and faster, because that's the way it goes. The more progress is made, the faster it will get. But there is effectively no pause button right now. There's a lot of discussion of how we could build one and so on, but that doesn't exist. So all the while it doesn't exist, and all the while that bad actors exist in the world,
Starting point is 01:01:42 people who would use technologies for their own ends, then I think we are, you know, insufficiently prepared for the types of risks that are going to be emerging from AI. So that's why I would technically fall into the AI safety camp, in terms of I think that that is more neglected than progress. That said, I think we need AI in many ways to solve some of our big coordination problems. Like- I'm glad to hear you say that. So just to defend Marc for a second, I'm not going to defend the list, because the list is crazy. Like, just clown world. Should not have happened. Control-alt-delete the list. Get the list off the table, so bad. Marc is, I think, at a place where he is experiencing some of what I've experienced
Starting point is 01:02:34 from AI safety people, which is just, I think maybe you were caught up in, I don't know what happened there, but I know what would happen with me. Like, I've become very sensitive to the AI safety people, because they come after you like you're an idiot and you've never thought of anything, and like they're the only people who've ever thought of this issue in the entire world, when there are people working on this right now, who are actually working on it, who don't agree with them. And I think it's very complicated. I understand that it's complicated. I think that there is something from Eliezer Yudkowsky where I became more on the acceleration side. Where I took a harder line on the safety
Starting point is 01:03:14 people is when Eliezer Yudkowsky came out and made that comment, did that interview where he made that comment about bombing the data centers, which I know we can quibble all day on the exact phrasing of that and what was really meant. But it's like, I don't want to ever be in a place where we're talking about bombing data centers. And at that point, I thought it wasn't even the violence of the rhetoric. It was just the hysteria of the rhetoric and the kind of surreal nature of it. It didn't feel like we were grounded in a real conversation anymore. It felt like we were talking about science fiction. And I know that he was a science fiction writer. I'm a science fiction writer. That kind of stuff really bothers me. And probably what's happening there is people react. And I've had friends of
Starting point is 01:03:59 mine, not you, but maybe you've been in my mentions on this issue actually, but even Jeffrey Fowler, another one who I like a lot, but like, he comes in hard on this issue. It feels, I can see where a quick block maybe could come from there. Well, okay, but I just pointed out, like, that these arguments make no sense. Did you say it like that? I can pull up the exact tweet and you can share it. I don't know. But I was shocked that he blocked me, because it wasn't, you know,
Starting point is 01:04:33 it was, okay, well, anyway, where to start? So I don't know. First of all, the Eliezer thing, I'm not here to defend Eliezer. I don't agree with some of his conclusions. I certainly don't agree with the methodology he uses to discuss some of these things. So I don't like that he is considered the leader of AI safety, because it's wrong. I think there are far better, like someone like Connor Leahy, for example. Check out Connor Leahy. Go into his debate with George Hotz, which is one of the best things I've ever seen, where it's like the steelman for the accelerationists versus the steelman for the safety side. And it goes back and forth. If people actually want to understand the tension of the issue, go watch that, because those are two brilliant people having a really good faith discussion. And truly, you know, I'll leave it up to you to decide who wins, who loses. Everyone wins. That's a classic win-win discussion.
Starting point is 01:05:27 So I think, first of all, there's that. But that said, what annoys me about people quoting, oh, Eliezer said we should be bombing data centers: A, he was talking in the hypothetical. So there was that. Yes, he was. He was saying if, in a... Well, okay, would you...
Starting point is 01:05:44 Okay, let's say, i.e., some, you know, Kim Jong-un has developed a pathogen that is as contagious as COVID and as deadly as Ebola. We might need to go and bomb his labs. Would you or would you not be in principle against that? Would you say no? Okay. No, you know, that's not what he said at all. Eliezer said, he was talking about a treaty. If we have a treaty to not be doing AI research and someone breaks the treaty, you would have to bomb the data centers. That's not comparable to, like, a concrete actual... Because you don't see, you can't... Yes, if there was, if fucking Terminator evolved
Starting point is 01:06:25 on a on an island and they were like we can only stop it if we bomb it that would have been a very different conversation than like there are these nerds who refuse to listen to our international treaty better kill them that's that's like that and you said you don't want to defend him so i don't want to defend him but no no no no but i will point out one thing like is that you did once upon a time a few years ago we had a little disagreement where you were like, I don't believe that these, you know, AI safety people actually think that this is a genuine threat, because if it was, they would be talking about violence, and they're not. So now when they do go and talk about a thing of violence, you it that doesn't mean that i justify the violence i and i stand by what i said most of these people i don't think really believe it i think it's too abstract of a question i do believe he does and i'm glad you brought that up because i think it's like one of the smarter things that i've said i was like this it's like very obviously you're all lying i think it's a great argument like why they're clearly not lying
Starting point is 01:07:21 they're clearly not lying and i mean i i don't know he's not I don't know. He's not. And now we're like, great. They really believe it and they have to be stopped because I don't want to live in that world. Okay. Well, as long as you have really, really robust measures to be able to concretely know that you can stop No, I think the burden is on you to explain to me why I have to start bombing data centers. Like, I'm not the one who's recommending that.
Starting point is 01:07:44 I don't think we should. I'm not recommending it either. I'm just saying, like, he's got to start bombing data centers. I'm not the one who's recommending that. I don't think we should. I'm not recommending it either. I'm just saying he's got to come up with better art. It's not immediate to come up with all these weird, crazy arguments that can prove for sure the thing. He has to prove beyond reason that people potentially need to die to stop this thing from happening. And that's why I disagree with what he said because i think the the error bars are way too too high to ever justify uh at least at present to justify any any major major things which is why from my perspective i think we need to be talking more about like
Starting point is 01:08:16 for example even just having discussions about treaties or at least having some kind of like um um sane measurement of when systems are starting to get so powerful that we may not be able to control them how do you regulate sorry go ahead i was going to say when the issue with the tree like how effective would treaties be really because i think it would be difficult to get actors who would probably potentially do the most destructive stuff, Russia, China, North Korea, like pariah states like North Korea, Iran, to sign on to these treaties. I mean, if Zimbabwe signs on, sure, whatever, or the Netherlands, but they're not going to do it anyway, probably. So I don't know how effective that would be if we're really worried about AI terrorism or whatever from,
Starting point is 01:09:07 you know, states. Well, if nothing else, it would create a norm of paying attention to when people are building these, because right now it's just anything goes. Like, really, anything goes. No one appreciates the dual-use problem sufficiently. So you want inspectors, like nuclear inspectors, that go into, like, Russia and take inventory or whatever? Something like that. Now, of course, they can defect, they can do all sorts of things. But right now we have nothing. We have literally nothing. You know, this is like 1940. We've
Starting point is 01:09:43 discovered how to split the atom. What did we have in 1940? Well, don't say we had nothing. We had nothing. But wasn't that bad? How? It worked out for us, actually, to get to a place where we had the bomb and other people didn't. I wonder, and I don't know the answer to this question, but you always see, when it comes to the question of international sharing of information on safety and whatnot,
Starting point is 01:10:10 it's always the countries that are behind that are really, really excited about this. And I think it's because they see this chance at getting more information. And I think they probably correctly perceive us to be in a kind of AI arms race at the moment. How do you think about that? There is an international piece that concerns me. Like it's just a foreign policy thing. Absolutely. No, I think it's a huge, huge concern. I also think it's a huge concern that anyone, that power becomes too centralized. That's also fucking terrifying. Who watches it? Who watches it? Yeah. And it's not like-
Starting point is 01:10:42 It's the worst version of the censorship conversation we we were having because now you're talking about this thing that controls everything yeah i don't we don't this is also maybe why i keep shutting down because i'm like i don't see i do not see a um i do not see a medicine that is that is not really really really toxic in itself this is nick bostrom in his book talks about his his suggestions are chilling they're cathartic and dystopian and um i don't i just don't want to live in that world and so what am i supposed to choose between here right exactly and it's uh daniel schmachtenberger calls it i don't know if you've seen him i did a long chat with him uh on like ai and this like moloch problem um I always call it Moloch. It's basically the arms race dynamic
Starting point is 01:11:30 where, well, if I don't do it, then those guys are going to do it, so I might as well do it anyway. Basically, these game theoretic drivers that essentially makes everyone trapped in this situation where no one wants a bad thing to happen, but it ends up happening anyway, because it's really hard to align. And he talks about this need for like some kind of third gravitational attractor, because right now we have these like seemingly two paths. We either go down the like completely decentralized, no rules, anarchy route, which will make, you know, leave it like things open to bad actors. You know, again, like, especially, you know, leave it like things open to bad actors. You know, again, like, especially when you combine AI with synthetic biology,
Starting point is 01:12:09 you're going to be seeing like pathogens we've never even imagined before. And like, this becomes more and more democratized until like any idiot with basically a few thousand dollars and a little lab they can build in their garage could in theory kill, you know, kill millions of people. Or in the other direction, you have more and more centralized control which leaves us vulnerable to like tyranny forever all of these these are terrible options and it seems like those are the only two it was either one or the other we need to i don't know what it looks like but some kind of like third don't like attractor that minimizes both risks and maximizes the best outcomes of those i think about mid journey and all of this sort of uh when i talked i interviewed grives a while back
Starting point is 01:12:53 and she had just like the day that i was interviewing her the day before that she was just getting off of this sort of i want to call it they refer to it as a hackathon it was sort of like a dreamathon where she was working with a majority of people to create different potential utopia images, images of what a utopian society could actually look like, which was very cool to me. The generative powers of AI, I think, are really interesting. Could we use the AI to help us conceive
Starting point is 01:13:20 of a new safety AI order? Do we need the AI to help us solve the AI problem? Yes, it seems like it. And that's the chicken and egg situation we're in. In many ways, I don't see a way for us to navigate all the increasingly interrelated crises we have, you know, like supply chain issues coming from climate change, which is coming from blah, blah, blah, blah, blah, blah. We need AI to solve many of these, but at the same time, the more powerful of AI systems we build, the more potential there are for them to either fall into the wrong hands or
Starting point is 01:13:59 some kind of accident happen and so on um and mid journey is an interesting example because it is like it's such a i don't know he like it feels like david has like touched on something where it's like i'm going to optimize for beauty yes as as a as a shared value and it's something that everyone seemingly like nods their head to there's another project that's a good way of putting it and it is so different it's also funny that he's escaped most of the conversation about safety because it's not words it's it's just like these yeah these beautiful he's he's optimizing and there's this another team um who i have chatted to a little bit who are like they had the idea of like getting as many people from again as many diverse backgrounds as possible to just input little
Starting point is 01:14:45 stories of meaningfulness, moments in their life in the last day or the last week or the last year that felt meaningful. In a few hundred words, write it down and add that to a database of basically human meaning. And then maybe we can start training in AI on that data set. So it's not just the internet, which has everything from really cool shit to horrific Nazis and other stuff. No, let's do stuff on what people, these uniquely human things that we feel meaningful, that contains this wisdom of humanity. And let's start trading it off that.
Starting point is 01:15:21 I think they're going to call themselves wise AI. That's directionally. Which I love and I think is really important is that you're talking about curating groups, curating communities, curating ideas, and being intentional about what you want to build and create and things like this. It is to sort of tie back to an earlier concept in this conversation, we're talking about the incentives of social media and what it drives. It was so obvious in the AI safety conversation. Let's say, I think it was a handful of months ago, you had Gary Marcus's open letter that was published. I don't believe the Lazar Joukowsky was on it. He might've been, but he then had that essay on,
Starting point is 01:16:01 I think he was on it. And then he had his essay about the data centers and whatnot. Timnit Geber goes after both of them. And I was surprised at first because I saw Gary and Eliza as two of the most ferocious critics of this stuff out there. And she is one of them as well. They critique it in different ways, but they're both very, very, very aggressive in their sort of tort attacks that they're throwing out there. And then it occurred to me that, and it was like, why isn't Timnit on the sort of open letter? Why isn't she signing her name to sort of do regulation and whatever? They're all in competition with each other for attention. And they all want the same thing.
Starting point is 01:16:45 They all want to be, they're all on that list. I don't know if Gary Marcus made it, probably not, which is hilarious. But the other two are, they want to be in that top five. They want the attention. Our social media incentivizes this kind of behavior. And so, of course, they're saying extreme things because how else are they going? Of course, behavior. And so of course they're saying extreme things because how else are they going? Of course, it's like bomb the data centers versus the AI genocide is coming on purpose. Sam Altman specifically wants to do it. And how insulting, by the way, to say that to a Jewish person. But that's the frame. They have to compete within that frame. And I think until we figure out- I will point out that there's only one party
Starting point is 01:17:26 in that group that you mentioned that's actually going after the others, whose name begins with T. I was as shocked as you were. I was like, when they were... I mean, I guess because to be fair, there is like a finite amount of like funding that can go into essentially you know uh research or you know sort of slow down dollars effectively um which is what
Starting point is 01:17:54 they're all kind of calling for right or you know like let's let's slow this down or like let's let's scrutinize this there is a finite pool of that and so so if she sees the existential risk concern people getting a lot of dollars, then it means that the algorithmic bias people are getting less. I don't think that's strictly true. I mean, I think all areas are horribly underfunded, personally. I even think, and I can't stand her methods and the way she goes about it in such a zero-sum way, but I do think she just still has some kind of point that i think any bias that exists is you know as systems become more powerful is going to become more and more amplified until it does actually become a problem so i think that should also be looked into but the point is it's like it's not
Starting point is 01:18:37 a competition um and like ultimately the the the the race dynamic this Moloch monster is by far running the show and is so powerful. That's really what everyone needs to be talking about. And not even individual VCs or whoever. Last question. It's a difficult one, impossible almost. So if you guys don't have, and this is for everybody, I mean, how do you create a more curated experience online that allows for more productive conversations and art and collaboration and things like that?
Starting point is 01:19:10 Because it does feel really hard. You're up against title forces here. I don't know if you can, but I do think... I don't know if you can, but I do think personally, I think that a vast majority of like censorship has to go away. I don't think that if you go back to like even like early to 2010s, you have Mark Zuckerberg saying like, I'm Jewish, but I'm going to allow Holocaust denial on Facebook because of like free speech. And it's like been totally memory hold. Like we are really not that far away from that sort of mindset dominating in tech and in social media. And I think that we do have to, we have to go back to where, you know, unless someone is being like threatened in a very specific, like violent way, you kind of have to let the riffraff in because eventually they'll
Starting point is 01:20:11 tire themselves out is the thing i i believe um i i think that too much curation might actually be the problem i feel like once you everything becomes incredibly curated, I mean, I guess it's fine for Instagram or whatever if you just want to look at like dogs or hot people or whatever all day. But if you want to actually if you're actually having conversations, I don't think curation is actually really something you should be looking for. I just want to know why time left
Starting point is 01:20:39 off the most important person in AI. Like, unbelievable oversight. Ja Rule, they didn't put him on. We need his opinion on everything. He could have sorted this conversation out from the start. Liv and Solana would have been
Starting point is 01:20:56 completely in alignment here if we just had Ja Rule. Ja Rule has probably solved the alignment problem. He's a baseman. He's sitting on a pen. Someone just has to ask him. He's a baseman. He's an independent. No one's asked him. Someone just has to ask him. He's got it done. Let's fund him. It's an interminable problem, it seems.
Starting point is 01:21:11 But I think we need to think more about this tyranny of metrics, where if you are optimizing for just one narrow thing, then you are leaving yourself vulnerable to basically being misaligned. It's like, we need to find a better way to capture all of the values, same as something like carbon credits and so on, to internalize the externalities of these competitive games that are going on between these companies that make us end up optimizing for things like outrage instead of joy um or you know black and white thinking instead of nuance so anything that can like somehow encapsulate a nuance and uh wider a wider band of metrics into into the games that are going on online um would be a good thing um how the fuck we do that i have no idea well maybe on the next spot thank you guys for uh thank you for joining
Starting point is 01:22:13 us today uh thank you for having me brandon again i love to say this joke i gotta stop saying it eventually you have no choice but to be here um lib i'll see you on the internet later guys bye
