Pirate Wires - Balaji & Beff Jezos Respond To Forbes Doxxing The e/acc Leader | Pirate Wires Podcast #26 🏴‍☠️

Episode Date: December 8, 2023

EPISODE #26: This week on the Pirate Wires Podcast, Solana is joined by Balaji Srinivasan and Beff Jezos. We're breaking down Beff's recent doxxing by Forbes and exposing the evil ethics behind... the media. We also discuss the media losing its power to tech companies, the vibe shift of CEOs, the future of e/acc, and AI concerns. Featuring Mike Solana, Balaji, & Beff Jezos (Guillaume Verdon) Subscribe to Pirate Wires: https://www.piratewires.com/ Topics Discussed: https://www.piratewires.com/p/thot-police-forbes-has-doxed-beff Pirate Wires Twitter: https://twitter.com/PirateWires Mike Twitter: https://twitter.com/micsolana Balaji Twitter: https://twitter.com/balajis Beff Jezos Twitter: https://twitter.com/BasedBeffJezos Guillaume Verdon Twitter: https://twitter.com/GillVerd TIMESTAMPS: 0:00 - Welcome Balaji & Beff Jezos! Like & Subscribe! 2:15 - Beff Got Doxxed By Forbes - Everything You Need To Know 14:15 - Lack Of Ethics In Media 24:20 - How Tech Ruined The Media's Power 36:00 - e/acc Explained 52:50 - Vibe Shift By CEOs In Tech 1:02:30 - AI Fears - Centralization - Controlled By Few 1:26:00 - The Fight For Freedom - Speech, Compute, Exchange 1:34:00 - Thanks For Watching! Like & Subscribe! See You Next Week! #podcast #ai #balaji #eacc #tech #media #politics #culture #freedom

Transcript
Starting point is 00:00:00 They not only doxxed the correlation between my principal identity and Beff Jezos, but they voice doxxed me. I didn't want to talk to them, but then they were like, look, this is going to really come out. I'm like, all right, well, I better try to control the narrative. I thought there was a sort of, like, ethical protection. Like, no. I've been trying to figure out, like, why did this happen to me? Like, where did I cross the threshold of becoming enough of a problem that they needed to have leverage over me and expose me? Our movement is about free speech, freedom of compute, freedom of AI, and not, like, top-down oppression. I have failed, honestly, and Balaji,
Starting point is 00:00:38 you too. We both... You failed. I don't know why you're talking to them. Do not talk to journalists. I said, don't talk to journalists. You do not try to violate this rule. You will regret it. Welcome back to the pod. We have the, we've got... Are we saying Beff Jezos, or what are we doing?
Starting point is 00:01:07 Where are we going with this? You could call me both names. I go by both names now. Beth Jesus or Gil Verdon in America, Guillaume Verdon in Canada and France, if you will. So by me after it's been honestly too long, the legendary Bology doesn't even get an does not even need a second name it's like you know Madonna or who are the other ones I don't even know
Starting point is 00:01:32 Teal I guess Teal it's not quite it's not it's Bology we're here today this is I'm stoked about this episode this is going to be a great one and both you guys thank you you're on a different time zone it's been a while setting this up um but uh apology and i have been talking about embology i mean separately from each other but also together
Starting point is 00:01:53 media and the shape of media uh for years through like tech dark ages back when we had no counter voice whatsoever up until today where where I think things are still not balanced, but very different than they were three years ago, certainly during COVID and before that. Gil, you were, I'll just break it down. I mean, so Beth Jaisos, kind of popular anonymous account on Twitter. You run something called the EACC, which we'll get into in a bit, or EACC, doxed by Forbes, Emily Baker White of Forbes, in concert with Alex Conrad, which is an important part of this story because Alex, it's important to me at least, and I want to talk about it with both you guys in a minute, because Alex is so influential in tech. I mean, he runs the Midas list. Everybody knows
Starting point is 00:02:44 him. Everybody talks to him. And that is a problem. And I do want to get into it. But long story short, these guys hit you up, or Emily hits you up, and she's got your real identity. So you've been writing anonymously about EAC for, what, about a couple of years at this point when this happens. I have a little bit of a community. I would say I'm going to kind of characterize what I think you guys do. And then you can come in here and give me the real version according to you.
Starting point is 00:03:12 I think you guys are memeing about the future and you're definitely counterpointing the EA stuff. You're sort of like cleverly cutting against the Doomer narrative in a fun way. I've always seen it as very lighthearted. I've noticed this week, the EA crowd has been very personally aggrieved by you. And it gave me a giant question mark then. I was like, why has the media gone to war against you? I have a few theories. I'd love to hear, apology, your theory especially, because I'm sure you have one at this point.
Starting point is 00:03:45 Sure. Your link, Gil, to Marc Andreessen and Gary Tan specifically seems to me like the reason that you became something that needed to be destroyed and shamed. So that's my version of events. You know, IAC born, you represented. It's a positive, futurist, anti-doomer, pro-progress, pro-AI specifically kind of movement. You are, they think, assassinated. Obviously, it doesn't work out that way for them. You're much more popular now. Tons to talk about. But first things first, what is your version of those events? Yeah. I mean, for me, it was a crazy Friday and it's been a crazy couple of days
Starting point is 00:04:28 since then. I'm like halfway across the world right now in an undisclosed location that is not China and on a business trip. So I'm still doing what I do. I'm an entrepreneur. entrepreneur. Yeah. Originally, basically I get a text from like investors. So, so first of all, you know, I came, you know, as I was doxed, you know, I came from Google X, you know, I worked on projects with the leadership there very closely, you know, very secretive projects. I was used to sort of secrecy as my baseline. And to some extent, like spending years, super secretive, you can't talk about what you do and so on it's kind of a burden and that's originally how I started Beth Jesus was just like as an outlet for me to communicate with people about stuff and just talk
Starting point is 00:05:16 And, you know, I've kept that account since then. But, you know, I had a startup as well that was not doxxed. They not only doxxed the correlation between my principal identity and Beff Jezos, but they correlated my principal identity, my current company, and they even traced through name changes of the company. They even went through my Facebook. They voice doxxed me. They just correlated everything. It was like a full, you know, full-depth investigation. They punched through your identity. It was like something out of a spy movie. They went in on you. Yeah, they really threw resources at this. Yeah. Like, I mean, for me, you know, I believe the large majority of what I say as Beff. Sometimes it's kind of, like, you know, meta-ironic, you know, extreme posts, as one does on Twitter. But, like, most of it, you know, I'm all in. Like, I don't mind, you know, using my main identity to back what I have said as Beff. But to me, it was like, hey, I didn't give you the right to disclose that I'm doing this deep tech startup as well. I mean, some technologies- You have the right. This is the whole thing for me watching this. This is not the first time that the press has done this. This happens often, and this question of rights comes up. It's like, we can do this, they say. It's like, yes, you can, but the question is, is it ethical? What kind of world do you want to live in? And obviously, from where I'm sitting, it just
Starting point is 00:06:58 seems the purpose of revealing your identity is to um scare other people who are sharing their opinions in this space anonymously from doing so. It is a strategy to chill speech. I don't have to worry about speaking for the most part within reason. It's way easier for me. I don't run a tech company in stealth. I work for Peter Thiel at Founders Fund. And on the Pyrewire side, I run my own business. It's like the business is speaking. It's easier for me to do it. For someone like you, it's harder. They know that. And the point is they want to kind of take you off the map. Apology, why do you think... I mean, do you have any... You can take this anywhere you want, obviously. You might have something you want
Starting point is 00:07:39 to share in particular, but I would really love to know what you think the motivation behind this specific targeting was. So my views on this have somewhat evolved. Fundamentally, the journalists are a tribe, and they're a sub-tribe nowadays of the overall, whether you call it the regime the paper belt the cathedral the establishment the deep state what have you that is a set of people and it is a group of people that is you know fubu you guys remember fubu from the 90s for us bias okay so like the regime is nationalist for the regime you know it's like first order, you could say it's Democrats. You could say it's deep state.
Starting point is 00:08:28 You could say it's cathedral. There's a lot of different names for this, which are all overlapping. But basically, if you take the whole social network of 8 billion people in the world and 300 million Americans and so on and so forth, this is a subgraph in the social network that's densely connected. And it's journalists, it's professors, it's regulators, it's bureaucrats, and so on and so forth. And everybody who is not them is an enemy. And that means that they're at simultaneous war with tech, with Trump, with China, with Russia, with India, with Israel.
Starting point is 00:09:07 And you can argue some of these they're more at war with than others right they don't like india very much but they're like somewhat fighting it they don't like israel very much and they've got some ally right but basically once you look at it as a giant social network in different colors of subgraphs the journalist you know uh like blue subgraph is at war with within america the red subgraph and the gray or tech subgraph right and so that's like the first order like visual of the whole battle space it's not um this is not like a discussion this is not like a normal story this is information warfare of one tribe versus another this is meant to harm you and uh even if this first one is just like a tracer bullet or like a flare to sort of light up the position right then subsequent kinds of things may or may not be so positive or what have you right and um so that's like first is basically,
Starting point is 00:10:05 it's not a positive thing to go and dock somebody against their will. You do not do that to a friend, right? You do not do that to somebody who you mean well to. You don't stalk them for months and, you know, basically in your, and, you know, correct me if I'm wrong, Guillaume, but they basically gave you, you know, they told you they had all this dirt on you and they're like, well, you better speak with us or else, right?
Starting point is 00:10:29 Yeah, yeah, yeah. Like I got a text from investors the night before from, I think it was Alex Conrad that told investors like, oh, hey, I correlated Beth Jesus and Guillaume. I think he's a portfolio company. They had the old name of my startup. So I changed the name of my startup for further upsec and for branding eventually. But they hadn't put it all together. And then they censor fused across reporters. And then they had much more, they had too much put together and they wanted to, they were going to put it out. And then so the morning I was like, look, this is going to come out. You have a chance to give a comment, like it's, it's happening, uh, whether
Starting point is 00:11:09 you like it or not. And for me, I had like two things. One of them was like, okay, well, how do I, I just went in damage control, right? Like, I mean, I've, I'm building stuff that I really believe in. I've been wanting to build this for like eight years. I've been like super stealthy about it for all sorts of reasons, for security reasons, for, you know, IP reasons and so on. And you know, everything was at risk, right? Not only the movement, the act movement and hope, well, hopefully the movement is not at risk, but it's supposed to be anti-fragile. We're trying to make it so, but, uh, and, and, and my company, so I went in damage control mode um and i didn't want to talk to them but then they're like look this is gonna really come out i'm like all right well
Starting point is 00:11:50 i better try to control the narrative at least for the first for this first thing because i'm sure they're gonna pile on now the media is definitely gonna pile well from here on i mean i've entered an exhibit let it let it be entered into the record exhibit a okay this is uh from data journalism.com okay by one of the wokest wokes to ever woke oh yeah she's one of the three head one of the heads on the three-headed dragon i would say yeah exactly i mean it's a multi-head the thing is every one of these critters like is essentially a Stasi officer. That's what these folks are. So they actually have a whole article at datajournalism.com.
Starting point is 00:12:35 And what this is actually titled, I'm sure if you paste this into chat, GPT or whatever, and ask it to paraphrase, it is how to docs. Jesus. Investigating social. Go ahead. No, I'm just shocked. I shouldn't be shocked, but I've never seen it this explicit. It's this explicit, right? All of their stuff. That's why I read all of their, you know, or go ahead. Brandy works for, I think NBC, right? This works. That's right. Here we go. This is, these are, these are major significant journalists who are, that's right that's right
Starting point is 00:13:05 and they teach how to dox how to there's other articles on how to essentially assassinate except they don't assassinate the character assassinate right and um it's all so you know once in a while they'll actually admit it when they're talking to each other and they put on a different face right so here it's like um you know be prepared to read thousands of tweets click until the end of the google results and dive down a social media rabbit hole if you want to collect the tiny biographical clues they'll help you answer the question who is this okay so you know i mean whatever you can read this if you want. The useful thing about reading articles like that is it tells us as technologists how to build a privacy countermeasures against these people who are literally like a for-profit intelligence agency. And I don't even say that lightly.
Starting point is 00:13:57 How did they get your identity, GV, right? They went and uh used like some cia voice recognition thing straight up i mean that i guess that's that makes it that makes me a bit more badass that they had to do that i i don't think it was necessarily that hard but i i thought there was a sort of like ethical protection like they wouldn't like i'm not doing anything illegal i'm literally just arguing for free speech hey guys thanks for listening to the pirate wires pod make sure you like subscribe comment below and uh share this with your friends i have failed honestly that was one of the big things i thought while i was watching this go down i thought like have we not and apology you too we both you failed you know
Starting point is 00:14:39 like how do you i don't know why you're talking to them like that's what i'm thinking i'm like how do you we've been talking about this for years like they are not all of them and maybe all of you and i maybe disagree a little bit something i think there are some good things out there um it's very clear in my opinion who are not and it's very clear usually by their approach like that was on a friendly approach that's a hostile at that point you're at war and what you what you do and this is for people listening now who are maybe anonymous online and they don't know what to do if this comes and happens to them you ask for help from people like us who are who are able to be back with media who can defend you like that that is the very who really separate from defense publicly is like advice because by participating you, first of all, they're using that against you. They're saying, oh, he participated. It wasn't
Starting point is 00:15:33 against his will or whatever. And it's very clear from the emails that you've shared with me and the private messages that they sent to other people and things like that's not, obviously not the case. It was clear just reading the piece that wasn't the case, but I can confirm here officially that that is not the case you were definitely uh not forced to participate but you were going to be doxxed whether or not you liked it um but had you not participated that they wouldn't have had that weapon to use against you and uh they wouldn't have had much to talk about because you they also had not yes they were they had they used their ci voice recognition they
Starting point is 00:16:05 really really thought it was you you know mathematically whatever it seemed like almost certain but you had not actually confirmed it so they would have had a story where it was like we are very very certain but we don't know for sure and also he is sort of associated with this guy who we're telling the world is a nazi and that would have been so ludicrous that i mean we would have just gone the whole internet would have exploded and been furious and they would have been beaten back like you kind of gave them a little bit a little bit too much in the future i mean i i wonder what bolly's advice is but i'm like you give them nothing and you then concoct you you you docked yourself at that point possibly
Starting point is 00:16:45 um to take away the story like i don't know apology what would the advice have been you think in that situation well so one thing is i i usually don't even repeat their charges because they throw around everybody's you know everybody's a nazi like they they'll call anybody and everybody's a nazi right so i don't even give any credence to their charges right um but but setting that aside yeah i mean so i mean here's the thing i like um how do i think about it uh you're playing you guys are really uh video games yeah not enough to argue this reference but let's just throw it out so there's like you know there's various video games where they're like you you know, first person shooters or something like that. And new guys just keep beaming into the arena.
Starting point is 00:17:29 Oh, yeah. Right. They have no context on what just happened and where the shells are flying overhead and so on. And that's how I think about the new guys who, you know, maybe their heads down. Maybe maybe they were just like in a different part of the battlefield. They don't have context and they're beaming into the arena. And it's a good analogy, I think, right. Where it's like, you know, they're on our team, they just beamed in, but they, you know, they may not know all the battle tactics or whatever. Right. So it's actually incumbent upon us as I hate, at least, you know, Solana, I'm not sure what the convention is, but I guess I've become a little bit of an elder.
Starting point is 00:18:08 God help me. God help us all. We're seniors in school. Yeah, it happens really fast. The gray comes really, really, really fast. I'll tell you that. Okay. But it is incumbent upon us to literally compact this stuff down into month draws that people repeat. And you assume it's almost
Starting point is 00:18:27 like college where there's like a new class and it's obsolete in two years or three years. And you have to like, say it again and update it. Right. So that's, that's one part of it. Let me pause there and get your thoughts. Well, I mean, I agree. I really, like I said, at the top of this little piece here, I felt bad, Gil. I felt like I had not done a good enough. As loud as I've been on this issue, as often as I've written about it, as much as I've tweeted about it, I feel like somehow that message, it's like, do people think I'm kidding around?
Starting point is 00:18:54 Do they think that I, I mean, this is an information war. And I, again, I do believe there are good actors out there. There are so many nefarious actors. I do believe there are good actors out there. There are so many nefarious actors. For me, the Alex Conrad piece is really crazy because so many people treat him like a reasonable reporter in tech. And this action for me is beyond the pale.
Starting point is 00:19:17 Not only the way he went after you, but the way he tried to go after Gary and Mark through you is, I think, really nefarious. And I'm frustrated that people aren't as upset about that piece as I am. But yeah, no, I agree. I think we'd have to do a little bit of a better job reaching out in these moments. And I don't know. It's like, I just hope that people know that they can hit up people like us for advice in these situations.
Starting point is 00:19:48 There are a lot of great people online. Lulu is another great person to Lulu. Messervy is a great person to talk to on this kind of stuff. I was saying I wrote a piece about this and in it towards the end, I get to the point where it's like. You actually and this is what I really would love to talk about things have changed yeah there are a lot of people who are great online talking with big audiences and it's not just like they're anonymous people there are pseudonymous pseudonymous i cannot ever pronounce it semi-anonymous people. There are CEOs, big, huge, popular, public CEOs who are posting, and they are all just like a DM away. And you can see on Twitter, you can see who's connected to who. If you don't know them directly, you want some advice,
Starting point is 00:20:36 you talk to anyone. It's pretty cool. The vibe has shifted somewhat. And I was just watching at the Reagan National Defense Forum. This is like the Super Bowl for defense forums. First of all, last week you had Elon Musk telling the advertisers to go fuck themselves on the stage of the New York Times. Then this weekend at the Reagan Forum, you have Palmer Luckey sort of telling kids not to follow their stupid dreams. If they're, you know, contra reason and, you know, self-defeating or not good for society. It was like a very sort of anti, I don't know, mainstream message. It was strange to see him up there being so honest and unapologetic. up there being so honest and unapologetic.
Starting point is 00:21:26 And Palmer, someone specifically who just lays down his opinion every day and does not give a fuck and has really gone direct in that way with his own message about his own company, which is pilling it, by the way, on Daryl. Then you have Alex Karp, who just espouses like an incredibly anti-woke message openly on stage. And it's like, I'm not hiring people who are idiots on the following topics, all of which would have been considered beyond the pale just three years ago, would have been articles in the New York Times about the terrifying menace of Alex Karp. But I am running one of the coolest companies in the world. And I'm telling young people,
Starting point is 00:21:58 you are breathing the vapors of a dangerous, new, fake, and self-destructive religion when you are sitting at your elite school pretending because you watched TikTok twice and got an A plus on some crazy paper because your professor couldn't get a job anywhere else that you actually understand the world. And you're not welcome at my company. Yeah.
Starting point is 00:22:21 I think things are different now. I don't know. What do you got? Am I right? Am I wrong? Am I just naive, optimistic, too hopeful? Well, yeah. I mean, so, you know, Solana, you and I, again, we're like literally grizzled combat veterans
Starting point is 00:22:36 at this point. Ten years in the meme wars, right? Yeah. The last five is really, I think, where most of the action took place. Yeah, that's right. That's right. So basically, tech versus media. I'm just going to rattle off the history here, and then I'll turn this into a blog post and so on and so forth with the rules and the history, and then the history, history, the deep history, right? But very roughly,
Starting point is 00:23:04 tech kind of arguably, you can argue when tech exists as a culture, but maybe you date it to 95 with the graphical web browser, okay? And then from 95 till 2008 or so, because of the dot-com crash, media didn't even really take tech seriously. They're just like gadget guys or what have you, just doing their stuff on the West Coast. And they were considered basically an auxiliary of the Democrat Party. Like you had unions and you had this group and that group. And then there were Steve Jobs and the tech guys making gadgets over there in the corner. The media concerned themselves with their expense accounts and Iraq and all this other stuff
Starting point is 00:23:46 going up into 2008. There's even this article at that time, I think by Joel Stein, that's enumerating the power centers of the USA and it's a real time capsule from 2008 because you know it's not on there. What's on there is the Pentagon and Wall Street and so on.
Starting point is 00:24:00 You know it's not on there? In 98? 2008. 2008, what's... Oh, social media. Silicon it's not on there in in 98 2008 2008 what's oh social media silicon valley is not on there as a power center in 2008 at all at all yeah that is the phrase big tech had not even been innovated at that point exactly this whole system that we're in is so much more recent than people think because what happened was after the financial crisis, tech revenues went vertical. Okay. There's a great graphic that shows what happened.
Starting point is 00:24:37 All right. Bam. Okay. The Hades Kazanis. What is this graph? This is showing print media revenue top set at 67 billion in the year 2000. Then it's like down after the dot-com crash, but it's flattish. And then a complete collapse down to like 16 billion in like four years.
Starting point is 00:24:56 This is advertising revenue in newspapers. Advertising revenue in newspapers is a blue line, right? And then including digital. Digital doesn't save them, doesn't come on fast enough. And Google eats their lunch and then Facebook eats their lunch that's a green line over here google is the red line all right so basically it's one thing to you know see your neighbor become like a billionaire it's quite another thing for them to become a billionaire while you become uh a thousandaire or whatever right okay and so these guys essentially over the first four years
Starting point is 00:25:26 of the obama administration saw these tech guys who had been in you know a box or whatever they didn't think it was anything suddenly rocket up ridiculously fast and because that was the the iphone and that was the reallocation of budgets after the financial crisis to online ads which are finally mature and converting that was the fact that like of budgets after the financial crisis to online ads, which were finally mature and converting. That was the fact that like SaaS was starting to work. Y Combinator actually, you know, remember Y Combinator started in 2005. That whole modern seed era of things only really started working in the late 2000s. There were a bunch of great companies found around that time, Stripe, Airbnb, Uber.
Starting point is 00:26:00 All of that basically by 2013 after Obama got re reelected even in 2012 by the way uh do you guys remember in 2012 the nerds go marching in how a dream team of engineers from facebook twitter and google built the software that drove barack obama's re-election okay when the nerds go marching in you see that i remember that narrative of the of the uh obama being the first sort of internet president he had this team around him from tech basically that's right not fully believing it but well and the reason for that is because uh you know facebook and then twitter started in very blue zones in the harvard area and in san francisco and so it was assumed that technology was blue and blue was tech and that it was just good.
Starting point is 00:26:47 And that if it had any effect on politics, it would be to overthrow oppressive regimes and overthrow, cause the Arab Spring. Go ahead. Right? That was just assumed. Until Arab Spring, which they thought was great. We don't hear much about the consequences of that today. Yeah. A lot of disasters, a lot of people dead, unfortunately. Right? So we can come back to that point but basically what happened then from 2012 to 2016 a bunch of things happened first is all these conservatives started getting android phones okay and there's a really interesting sort of meme that makes this concrete have you ever seen and this is kind of late 2010s it's starting to get a little obsolete or whatever right but there's a there's a collage of a bunch of democrat visages and a bunch of republican
Starting point is 00:27:30 visages on twitter so democrat visages are uh guys with glasses indoors and like you know kind of unshaven or whatever right they're like unshaven programmer, graphic designer, Yas queen types or whatever. You know, the guy's pointing at the thing, right? Okay. And, but the Republicans are more interesting or it's just like, it's an interesting, different kind of visage. They're people with sunglasses in trucks outdoors taking selfies on their phone. So the fundamental difference is the Democrats are indoors and Republicans are outdoors. Because almost all the Republicans are in sunglasses and they're clearly on the go with some like cheapo Android phone.
Starting point is 00:28:13 So part of what happened from 2012 to 2016 is the polls got online and that started shifting Twitter and the Internet to the right. The other thing that happened is after Obama got reelected ined in 2012 the media which had played nice with tech guys because they needed them to get obama to get re-elected now had four years to settle scores and all the knives came out and all the valley wag type stuff happened around that time would you just look at all those rich people that's valley wags editorial vision in a nutshell but could be so much more by manju who's become a full communist now and radicalized right but that was like 10 years ago when he was ostensibly central left. He was complaining about attacking them as rich people and so on and so forth. That's crazy. I remember people being mad about that. But in my mind,
Starting point is 00:28:55 I remember Dave Morin complaining about this, which made sense because he was being attacked. I don't remember the press rising to the defense of those guys. That's interesting. Yes. That's right. So basically, a few years later, he's all abol of those guys. That's interesting. That's right. So basically a few years later, he's all abolished billionaires. It's literally he's writing that. Okay. The radicalization happened to the journos. Yep.
Starting point is 00:29:13 Okay. And essentially though, it was a boxer's clinch where that tech and media alliance didn't make sense anymore because the media guys were like wait a second these tech guys aren't just like some vote bank gadget bank whatever that's making the the bejeweled toaster on my desk that is an iMac they're coming for everything yeah and then two things really I mean I it was Gawker is destroyed yeah that was that comes that comes a little bit later but yes that's right that's right of course it's an important episode and then trump wins the election and at that point
Starting point is 00:29:49 Yes, that's right. But crucially, the negative coverage starts in 2013. In 2012, you can even find them writing articles like "there's no such thing as a brogrammer" and so on, right? So this goes back to that visual I had of the blue tribe, red tribe, gray tribe. If you're part of the tribe, you get friendly coverage. If you are not part of the tribe, it's information warfare. Yeah. Right. And now, by the way, you can be part of the tribe and then get kicked out of the tribe, or you can choose to leave. Like Glenn Greenwald has, you know. He had all the prizes. That's why I give him a lot of credit. I don't agree with Glenn on every issue, but I respect him, because he had all the prizes within Blue and decided to go and do his own thing and stand for a principle that he stood for, right?
Starting point is 00:30:35 Conversely, it is also possible for someone to be kind of a renegade, but then to become incorporated into Blue and be part of this Borg, right? Like early Google was much more buccaneering and swashbuckling and so on than what it's become. It's not as blue as it was a few years ago, maybe, but it's pretty blue, right? So once you kind of have that visual, it's really, you know, the friend/enemy, the tribal distinction. And it's not exactly left/right or what have you. It is group, non-group. Let me pause there. I've got a lot more.
Starting point is 00:31:13 Well, I have a slight pushback, which is that I agree with... I mean, the whole history, I think, is correct and insightful, especially the piece on the revenue, which is really important. I think I maybe get somewhat distracted sometimes by the ideological combat component of all of this. It just very clearly feels to me like a classic information war. And maybe not classic; it's new. It's a very scaled-up information war. It's live. It happens every day. It lives in our pocket. It never ends. But at the end of the day, and this is a point that, you know, I've heard before, I just forget: we are competitive. There's a business piece that's competitive, like industries are competing. And then online, I think the new thing that we've seen over the past few years, and let's just focus on tech for a moment, is it's not just a money issue. The All-In pod is a massive success.
Starting point is 00:32:16 And that is a side project for David Sacks. And that would be a dream come true for every single person in media. So it's not just that the platforms are taking the money. The people on the platforms are also taking the attention from them. They don't even get to be the attention people anymore, or not exclusively. There's combat, there's competition there.
Starting point is 00:32:37 So separate from the ideological piece, there's the reality of just competition, and when you're in it with someone for something that you consider your livelihood... So it's funny you say this, because, um, you know... now, again, this feels ancient. By the way, actually, GV, I want you to jump in, because we were just chatting. Did you want to say something, or I can... Yeah, you know, there's the broader information war between tech and, you know, the paper belt aligned media. But, you know, the media, they're kind of the pit bull for some of the paper belt against,
Starting point is 00:33:11 you know, whoever in tech is an enemy of whoever is helping place various stories. You know, to me, I've been trying to figure out, trying to backpropagate: why did this happen to me? Where did I cross the threshold of becoming enough of a problem that they needed to have leverage over me and expose me? And I mean, obviously, Mark and Gary have been supportive of the movement for all sorts of reasons. One of which is, obviously, we think that the current regulatory capture attempt by the AI safety complex and some of the big incumbents, its camouflage
Starting point is 00:33:56 is anti-doom, trying to save us from open source models, their dual use, and so on. We were calling that out and getting people fired up, and it became a problem. And of course, Gary tweeted this, that FTC chair Lina Khan was invited to YC. And I was there and we had a little chat. And I voiced the concerns of our community: that, hey, actually, this is some of the big incumbents covering up the fact that they're trying to
Starting point is 00:34:33 capture the market with sort of, "oh, this is for your own safety." And that's when I became, like, officially a political problem, right? And there's all sorts of interests in the background, you know, trying to pull strings here. I have some hypotheses. Maybe I won't go into exactly who I think is pulling the strings; I have my thoughts on that. But what's been public is that, you know, Secretary Raimondo, I believe, called out e/acc specifically as, like, a dangerous movement and something we should suppress with AI. They want to... Let's talk about... Our movement is about free speech,
Starting point is 00:35:12 freedom of compute, freedom of AI, and not, like, top-down oppression aided with AI. And they're literally proposing shutting us down with AI, which proves our point, which is kind of... By the way, you know what that's like? It's like, "Will no one rid me of this meddlesome priest?" kind of thing, right? "Will no one rid me of this meddlesome e/acc?" Right? Yeah, because they're a problem. And that's like a decentralized signal to the journos to go and, you know... Right. It's stochastic journalism. Yes. Come on, that's pretty good, right? Yeah, you're holding up the target.
Starting point is 00:35:45 This is where I'm trying to think of what the really... That's the allusion, to stochastic terrorism. Okay, I thought it was funny. Stochastic journalism, stochastic terrorism. Okay, anyway, go on, go on. I want to talk about e/acc.
Starting point is 00:35:59 You were just getting into it. I mean, what... This is a little bit off topic, and we can revisit any of the stuff that we were just talking about (there's a ton there on the war piece), but, um, this "oh, it's dangerous" and whatnot. What is e/acc? How would you even describe it? What is e/acc? I think for me, it's kind of like a cultural framework, really, or a metacultural framework. We're trying to understand: where's this whole thing going? Where's civilization going, from first principles? At the end of the day, the only laws are the laws of physics.
Starting point is 00:36:40 And I'm a former theoretical physicist, and I do physics-based AI now. I always think about physics, and nowadays I think about self-organizing systems. And to me, it was just an exercise in writing and trying to understand where it's all going, not with completely unanchored-to-reality sorts of theories like the doomers have, where as soon as it reaches human-level AI, it fooms and takes over the planet. I've been trying to actually find: where's this thing going? It's like, okay, well, you know, civilization at the end of the day is a system that's alive. All systems that are alive, that are basically life from a physics standpoint, seek to grow. They acquire energy to maintain their sort of state
Starting point is 00:37:32 and they seek to grow and increase their ability to acquire free energy. It's like, okay, well, civilization wants to go up what is called the Kardashev scale, which is the scale of energy production, or consumption. And so that's where we're going, you know, and that's where the system wants to go. And, you know, it keeps morphing itself. It searches over the space of technologies, of cultures, of memes, of genetics, of everything, in order to ascend this scale. And basically, we're just pointing that out. And to us, it's first a framework of what is: you know, that's what's going on. And then it's, how should you live your life, given this fact? It's like, well, actually, if you help
Starting point is 00:38:18 the growth of civilization, you know, you will likely prosper. And it's kind of a self-fulfilling prophecy. If you're optimistic about the future, you work on hard things, right? Because you believe in a better future, so you want to contribute to it, so that you own a piece of it. And then you go out and build and do hard things, and then the good future happens. Whereas if you believe in doom and that nothing good is going to happen, you no longer build, you no longer want to have kids, you don't bet on the future, and then the future is actually worse, right?
Starting point is 00:38:52 And to us, it was a sort of reaction to the pervasive doomerism and the pervasive pessimism, tech pessimism that was just dominating the news cycles. And we were like, can we have a viral optimism movement that, you know, hyperstitiously induces growth of civilization, right? Don't we all want that? Don't we all want to be, you know, amongst the stars?
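For readers unfamiliar with the Kardashev scale referenced above: a common way to put a number on it, not stated in the conversation itself, is Carl Sagan's interpolation formula, where a civilization's Kardashev level K is derived from the power P (in watts) it harnesses. A minimal sketch, assuming Sagan's standard formulation:

```python
import math

def kardashev(power_watts: float) -> float:
    # Carl Sagan's continuous interpolation of the Kardashev scale:
    # K = (log10(P) - 6) / 10, where P is power harnessed in watts.
    # Type I (planetary) is about 1e16 W; Type II (stellar) is about 1e26 W.
    return (math.log10(power_watts) - 6.0) / 10.0

# Humanity's total power use is on the order of 2e13 W,
# which places us at roughly Type 0.73 on this scale.
print(round(kardashev(2e13), 2))  # 0.73
```

"Going up the scale," in the sense used above, just means increasing P: each whole Kardashev level corresponds to a tenfold increase in the exponent of energy throughput.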
Starting point is 00:39:19 It's more than that. No, that's the thing. That's actually... I mean, for us, us three, yes. But that's actually, like, a core difference. E/acc, obviously: the "acc," we're talking about acceleration, that's the word that you're using. The "EA," you're making a play on effective altruism. The effective altruists are very concerned with accidentally building a god that kills us; that's, like, where their head's at. And, um, they...
Starting point is 00:39:47 not just them. Now you have the safetyists in the media, who want to slow down tech for all sorts of reasons. But their position, all of them, in this sort of uneasy alliance between very smart people in AI who happen to be EA-pilled and a little bit of a doomer, um, and the press, and, you know, the government that wants to slow all this shit down: it's slowing things down. Endlessly in the press, especially, you hear: Mark Zuckerberg said "move fast and break things," and then things broke. We have to move more slowly.
Starting point is 00:40:15 And you're not saying that. You're saying we need to accelerate. So the positive vision is there. It's awesome. And I thank you for your service, sir. It is very important. The memes are excellent, and I think, I agree, they are powerful. I think they're motivating. I think they are helpful.
Starting point is 00:40:27 But they identified correctly your critique of them. And so, you know, you're locked in a sort of... at that point, it's, if people hate you and laugh... and I mean, I've seen it. I'm like, wait, how much they hated you until this week. Um, they're definitely mad. Can I make a couple points there? Yeah. So, um, there are decelerationists, and there's also what the current U.S. establishment is, which are stagnationists, right? What they want to do... they're in their own way conservatives, where they want to freeze the world in amber. And so you will find, for example, that, you know, Raimondo is talking about the tariffs and so on and export controls, and they want to kind of freeze China as a power. And actually, I can understand that. Okay.
Starting point is 00:41:20 But... and so they're like, oh, you know, the U.S. needs to be a leader in AI, so we're going to ban exports there. And literally, you know, that's what they're saying on Mondays and Thursdays. And then on Tuesdays and Fridays, they're doing AI bans in the U.S. and saying you need to slow down and not accelerate, and so on and so forth. Right? So they think they can, via policy levers... you know, they're not Silicon Valley, they're not Shenzhen, they're neither of these things. But Washington, D.C. thinks they can freeze the current dispensation in amber, where they've got a healthy lead and nobody abroad and nobody at home goes too fast. They just want to stop the disruption. Because that's the
Starting point is 00:42:01 thing: acceleration is good for lots of people, but it's not good for them. It's not good for power. Any technological innovation, I think... pretty well... so that's the thing. So I've got a perspective on this, being out here in Asia, right? And India loves tech. India loves tech. And the reason Indians love tech is, you know, you've got a guy who was, you know, very poor 10 years ago, who's now got an Android phone, and they've got a lifeline to the world, and they're able to study and they can earn and all this stuff. So it is correlated with the massive improvement in their living standards, right? But for the journos, and more generally the U.S. establishment, you saw that graph.
Starting point is 00:42:41 It's correlated with a massive decrease in their living standards. So they have a learned correlation where the faster the tech grows, the less money they have, the less power they have. Which is kind of true, for, you know... it will be extremely painful for them, right? Acceleration is extremely painful for them, right? And so that's why they're fighting the internet, and they're fighting every force that's against them, in actually a conservative way. It always feels like this. And in this case, you're talking about generative... Obviously, AI has many things.
Starting point is 00:43:12 But the thing that gets the most attention at this point are the generative models, when you're talking about language specifically. This is something that looks like a journalist. Yes. It's NPC-level. It's literally... the thing is, you know, this guy (I've got to get this guy to put this project back up)... you know, I used to say, not even jokingly, that these journalists... you know, a sports article is just a wrapper around a box score. You know, a box score is, like, the,
Starting point is 00:43:40 you know, how many rebounds, how many assists, steals, whatever, in a basketball game. Or a financial article is a wrapper around a ticker: like, you know, "stocks were up on heavy trading," blah, blah, blah. It's just a verbal narrative. I used to say that these articles are just wrappers around tweets: you give it three tweets and you can write the article. And I used to say that as a joke (not even as a joke; it's like true and also a joke, right?) from years before ChatGPT. And then when it came out, it became literally true, where you could paste a single tweet into this thing, you know, like a New York Times article generator, and now it's probably gotten
Starting point is 00:44:17 even better a year later, and generate an entire full NYT article that's like "dark times ahead in technology as acceleration continues," you know. Like, you could literally do that, right? And there's no intelligence. It's totally paint-by-numbers. You just have the stylistic kind of thing off the prompt. Go, go, go. Yeah. Yeah. I think the thing that's scary, right, is kind of like: the old world media, they serve the incumbents. They serve those that have an interest in top-down sort of control. They gain proxy power by being instrumental to authoritarians, right? And those that argue for more control. And e/acc is about saying that acceleration, techno-capital acceleration, is a positive force. It creates amazing technologies. It gives us wealth. And, you know, the counterargument to it is that, well, if we don't control who ends up in
Starting point is 00:45:12 power, that's bad. You know, really what they're thinking is, "we might lose power, and so that is bad." But the reality is, the system would adapt, and power would shift to, you know, those that are disrupting the incumbents, right? And that's good. That's how the system, you know, optimizes itself and finds whichever technologies are most beneficial to the world and helps them scale up. And of course, those that produce those technologies get to have sort of more power to allocate capital and control where it goes, right? Which is the sort of natural meritocracy. And they're trying to break that. They're trying to maintain this sort of status quo. They are the real conservatives. They call themselves maybe
Starting point is 00:45:52 progressives in some cases, but they're really just trying to conserve the powers that be. And then the media, they're just mouthpieces for the incumbents. Because, again, those are the only people that still talk to them. And they use them: the media is kind of like an information warfare arm of, like, you know, the incumbents and the people who built... right? And we've seen that. And the label, you know... go ahead. I worry a little bit, and I want to get both of your takes on this. Unless, Balaji, you want to... You said you were going to jump in a little bit. Which is, um, you know, there's more I can say on kind of the history of this and so on, but there's a huge
Starting point is 00:46:31 difference as to where we are in 2023 versus where we were in 2019, 2020, 2021. And I think, Solana, you did your part, I think I did my own little part. We did our part. Which... go ahead. Shakes, no handshakes. That was "no handshakes, please." Exactly, remember that, right? Yeah. And essentially what happened was, starting in 2013... let me just recap the history that began to bring us to the present day. From 2013 to roughly, you know, 2020, the journos essentially began a reign of terror against tech, where there was all manner of cancellation (we didn't have the word for it then) against people large and small.
Starting point is 00:47:14 That's why the homeless problem, which is not really a homeless problem but a drug and mental illness problem... as Sanjana Friedman, one of your... you have many talented writers, particularly talented writers... Sanjana, salute. Okay. All right. Is it she? She, yeah. Okay. Sorry, I didn't know, you know, what the, uh...
Starting point is 00:47:31 Yeah, we've got to be careful with names. So, um, Sanjana Friedman has done a phenomenal job of documenting how the quote-unquote homeless problem is not just a homeless problem; it's a drug and mental illness problem. The point is, in the early 2010s, when something could have been done about it, there were a few tech guys who made impolitic comments about, you know, the sudden epidemic of needles and syringes and poop. This guy Greg Gopman, another guy, Peter Shih. And yeah, maybe you might not use exactly that language, but obviously, you know, it's a lot less offensive than some crazy guy throwing feces at you in the street, right? And, more importantly, than the NGO who's feeding him drugs and getting
Starting point is 00:48:18 paid by the city to do so, right? They said mean things about the people who were chasing you down the block. Yeah, that's right, that's right. So what happened was, the journos at that time... not the journalists... wait, that was tech people turning on... Well, but the reason was, if you go back and look, Peter Shih and Greg Gopman got destroyed by the media at that time, and then people cued off of that as, "oh, that's what I'm supposed to do, I'm supposed to yell at them," right? And so for years it became incredibly un-PC to even mention the problems that were happening in San Francisco. So it was like disabling the immune system during the period when it maybe could have
Starting point is 00:48:56 done something more. And then this thing just got completely out of control and it became what it is now. And these NGOs became totally entrenched and so on. Many other kinds of things happen like that. That's just one example of it where the journalist caused a giant problem. And then only after it became massive, then could it be acknowledged. It was interesting. I was nervous to write about local politics. And I forget that sometimes. But when I first started seriously writing in like 2020, I was nervous to touch the homeless issue and the drug issue. And I just thought like, this feels dangerous. People are going to come
Starting point is 00:49:30 after me. I don't care. I got to do it. And it doesn't feel that way now. It's a weird thing where I always ask myself, is it because they're so strong, or because they've become weaker? And I think it's both at the same time. Have they won? Is that why they don't care anymore? Yeah, exactly. That's right. So I think it's both, where the left, the Democrats, the blues, whatever, have captured the state. And so they've got the budget, and so they sit in their parapets. And no matter what we see online, the money just keeps flowing to these NGOs. It flows to them or whatever, right? So we can yell, and they can lol
Starting point is 00:50:07 and then just go back to shipping syringes, right? But they have lost control of the network. And, you know, thanks to Elon, and thanks also to Substack and, you know, Coinbase taking a stand, and Solana and a bunch of other people, they've lost control of the network enough. They're getting crushed in soft power terms
Starting point is 00:50:26 every single day. And that does also matter, in terms of building a parallel consensus over here. But it matters insofar as we can convert that energy into parallel institutions. And so that's actually, GV, where you are... I think you're obviously, you know, talented as a meme maker
Starting point is 00:50:42 and so on in your own right. But it's also fortunate that this happened to you. It's not good that it happened to you, but it's fortunate that it happened to you in 2023, when we have this whole ecosystem, right? So, you know, in the mid-2010s, what would happen is some poor tech guy, and it could be, you know, a very junior person or the most senior executive, would just get targeted and torn limb from limb on Twitter, where it was such an overwhelming advantage of forces for the bad guys that even somebody liking a tweet in their defense would get pulverized for the like, right? And that means... do you think about, like, you know, the eyeballs just scouring Twitter for anybody who had the thought
Starting point is 00:51:26 of defending somebody, okay? And so that was a level of, you know, control that they had over the platform, and thus over minds and the signals that were being sent. A really very ugly time, and a lot of people... it was like killings. You had, like, the James Damore thing, where he wasn't even posting publicly; he was posting internally on a channel at... he was at, uh, Google, right? Was that Google? Yeah. Or like what happened to Tom Preston-Werner at GitHub, a completely fake episode, okay? You can look at this article called "Facts Conveniently Withheld," okay? There's tons of these kinds of things that happened in the 2010s, where good people were just attacked and they had no way to defend themselves. And so then what happened, though, steadily, gradually, year by year: we built enough followers
Starting point is 00:52:13 until there's a really interesting tipping point. So in 2020, I funded this... there's a Nigerian guy, I made that public, who did this analysis: tech journalism is less diverse than tech. Lol. Now, of course, now we're in 2023, so we know that D-I-E is stupid, okay? And we can actually say that. Fine. But back then, essentially, there was a huge to-do made about how white the tech guys are. I'm not the kind of person who believes
Starting point is 00:52:38 white is an insult. They are. And yet they were far, far whiter than all of us, okay? But here's the thing, Solana, to your point, is when you go and look at the raw data over here, you look at the raw spreadsheet, you see a very interesting phenomenon. And the interesting phenomenon is that back in 2020, this is like three years ago, there were only a few journalists who had more than like 100,000 followers on their own. It's like three or four.
Starting point is 00:53:06 Okay. But lots of founders had way more than that. And I realized something. It was like a lightning bolt. It's like, wow, the journalists are not exceptional in their own right as individuals. They're not like charismatic people who people want to listen to.
Starting point is 00:53:23 They can only win if they team up in groups behind brand names and then attack the guy who stands out. It's actually this remarkable thing when you start looking at it (of course, followers aren't everything and so on and so forth), but it was remarkable what a difference there was: a hugely right-shifted distribution of founder followings relative to journalist followings. And so then once Elon took over Twitter and he stopped whatever algorithmic boosting, artificial boosting those guys were getting-
Starting point is 00:53:57 Yes. Well, that has been framed by journalists... I've seen journalists talk about this on Threads as "Elon is now boosting nefarious people." He's not boosting. It's just that Twitter doesn't have enough people working to manage all this at this point. They're simply no longer boosting these kinds of people. For years, we wondered: how is that random VC, who says everything that they like but has never made a successful deal in his life, the face of venture capital on Twitter? Why do these people get so much engagement? They were being pumped into the trending topics by a team at Twitter that was propping them up, that no longer exists. And now... Twitter Moments. Do you guys remember Twitter Moments? Yes. I used to ask, I think, like, every couple of months, I'd be like, "tear down this wall." Yeah. No Twitter Moments. People didn't know it by name, because it wasn't like, like, Google Maps, where you go to maps.google.com,
Starting point is 00:54:56 right? So Twitter Moments, uh, was not something that was publicly named, so people didn't know what it was. But it basically is a thing where, when you logged into Twitter, it would show you the trending news story of the day that they had picked, and it would give you some prompting as to how you were supposed to think about it, and a person you were supposed to attack. Yes, exactly. So some poor schmo would basically have the Two Minutes Hate of Orwell directed at them, right? Crazy, man. It was like "The Lottery," that's Shirley Jackson's story: the negative lottery. Oh yeah, yes, exactly. Right. So here's basically the... now what we have is, what we've done is, we've flanked the traditional legacy media from two directions: both ultra-short-form content of tweets and ultra-long-form content of podcasts, where they're relatively weaker, you know, than they were. And so, you know, obviously it's content, but it's also distribution channels. They didn't have years and years and years of
Starting point is 00:55:55 legacy distribution channels there. They're in the mezzanine, right? And tech in general is about going, you know, to the extremes on some... obviously, it's very small bits of content, and very long. And then you tweet out the podcast, or in the podcast, you talk about the tweets. So we're flanking. And of course, what Elon has done with the video stuff has made that even easier to do. That said, by the way, I do think of Twitter... you know how there's that famous, or at least famous when I was a kid, "Istanbul is Constantinople." It's Istanbul and Constantinople, right? It switches hands back and forth
Starting point is 00:56:30 because it's so strategic, its location. That's what Twitter is. And it was the free speech party, and then it switched hands and it became the regime's tool. And now it's back to Constantinople.
Starting point is 00:56:46 But I don't know where it's going to be in two or three years. They are attacking the heck out of Elon. Go ahead. I think the next battleground is LLMs, right? At the end of the day, it's about information supply chain attacks, right? If the media, you know, has absolute power over information flow, then they have extremely high soft power. We eroded that power with technology in the age of social media. They're still bitter about that. And now the media, serving sort of the incumbents or whichever authoritarians are pulling the strings, are trying to ensure that this complex that they're part of maintains control over, or gains control of, the new information supply chain
Starting point is 00:57:32 that's emerging, that is LLMs, right? They want to be able to shape the cultural priors of the LLMs, which shapes how they speak. And if LLMs become like the new Google search on steroids, right, which they are poised to do, they're going to become our source of truth, right? And so that's why we need decentralized AI. Exactly. I would say that, you know, this is the same battle over and over. They're kind of still bitter that they lost the social media battle, right? When they had sort of a regime installed within Twitter
Starting point is 00:58:08 and they had, like, soft control over Twitter, you know, everything was fine. When Elon came in, bought Twitter, and disrupted everything, then they demonized him. And, you know, they said that he's dangerous and whatnot, when he's just trying to ensure people have the freedom to speak. For the record, in the old regime, my first Beff Jezos account got booted, right? I said that, and I got locked out and I couldn't get back in. So the original Substack was from my original account, because I said COVID came from a lab, which, it seems, is a very high likelihood now, right? Wild that the OG Beff got canceled for the COVID.
Starting point is 00:58:50 Because I feel like at this point, journalists look back on that... a lot of people, I mean, anyone in the one-party state, looks back on COVID and really tries to pretend that it didn't happen. And the COVID lab leak in particular is one. It's like that and Hunter Biden's laptop are the two things that they really just will scream to your face that nobody was censored for, nobody was kicked off a platform for. No, it never happened. You're just making this up. And they have to do that, because those two examples are so incredibly
Starting point is 00:59:20 damning. I've had a couple of more honest journalists say, about the Hunter Biden laptop one, for example: yes, it happened, but it's just this one thing, why do people keep talking about it? It's like, people keep talking about it because it's incredibly important that we were silenced, that there was an attempt made in that way. And the lab leak for me is also right up there. Just that we couldn't talk about this very obvious question, I would say, that was worth asking. There's, you know, in the background, there's cybernetic control of information propagation. They're trying to control people's thoughts by biasing what speech gets algorithmically amplified or suppressed. It's pretty simple. I think that for COVID, there were points where we had lockdowns.
Starting point is 01:00:09 It's like, look, this was the authoritarians. They were like, "give us a ton of control, we'll keep you safe, this thing is very dangerous." We gave them a ton of control. It didn't help at all. It's a total failure. It was actually the bottom-up sort of... the techno-capital machine is what saved us. It did all this biotech engineering that, you know, helped save some lives. And at the end of the day, the lockdowns just didn't work, right? The thing is still spreading to this day. And so it's a failure of this sort of narrative that if you give the incumbents and those in power more power, right, for your own safety... it violates the narrative that they'll do anything useful, right? And so they want to suppress that story, fundamentally. I want to say one quick thing on Twitter and then get back to the NY thing and ask you maybe an uncomfortable question based on what you were just talking about. So on the Twitter thing and the Constantinople piece, it's changing hands back and forth: I would say one person who does not get nearly enough credit for understanding that and acting,
Starting point is 01:01:17 I think, morally in a sort of righteous manner is Jack Dorsey, who, first of all, as founder of the company, was on the free speech side as things tilted in the other direction. And then we have confirmation; we see, first of all, he said all this stuff publicly. He is one of the key reasons that Elon was brought in and then was able to do what he did. And I have no doubt that he knew things would get crazy with Elon in the company and that things would change dramatically. But my sense was, especially after Hunter Biden's laptop, Jack felt like this is a power that is too great for any one person, and it needs to switch hands. It needs to recede. Now, speaking of power, as you were saying before, Balaji, we don't know where this will be in a couple of years,
Starting point is 01:02:11 which is why I get nervous about things like community notes existing, and the celebration over people who we don't like being fact-checked and things like this by the platform. That makes me nervous. But no power, I think, is more powerful maybe than the AI piece. You guys are both talking now about decentralization. And Balaji, you're saying that this is the thing that we need. I agree. Because from where I'm sitting, it looks like AI is a naturally centralizing technology. I'm skeptical of people who think that AI is the solution to all of our problems. There's a super broad conversation here. On this topic specifically, I am nervous about the amount of power that very few people have.
Starting point is 01:02:59 I'm not worried about some runaway AI. I don't see a lot of small players, you know, upstarts with powerful LLMs. I see the giant companies have these things, and they have terrible track records on this question of speech in particular. And when I look ahead, there is a very real possibility that the people who control this are the censors, and these things, you know,
Starting point is 01:03:28 like you have a handful of these things that are shaping our reality. Who's writing those rules? Certainly not me. And so these are,
Starting point is 01:03:38 they seem like, centralizing, not decentralizing, technologies. Like, we want to decentralize AI; I don't really see that happening. I think that you can want that, but I don't know that that is the natural... well, I don't know, what do you think? So, you know, I mean, I agree that in the current paradigm, with the sort of power efficiency and density of neural compute we have on GPUs, there's
Starting point is 01:04:01 a huge advantage to just building a football-field supercomputer, scraping all the data off of the internet, and training one big centralized model. But at some point the advantages of doing that are going to get eroded away, and, you know, the big incumbents hopefully can compete each other's margins out. And then there's going to be an advantage to decentralizing AI, bringing AI towards the data. We're still far from there for now. That's right, there's kind of just a big advantage in being a big centralized player. And unfortunately, like I said, LLMs are kind of the future information highways for people.
Starting point is 01:04:47 It's much more natural to ask an LLM that's human-like to get information, either from the internet or from your database and whatnot. And if what LLMs are allowed to say is shaped by a few people in some room with enormous cultural and soft power, I think that's pretty dystopian. It's kind of turnkey authoritarianism, right? It's an even more subversive form of censorship. And those that want to have that sort of power, of course, are going to push for AI to be centralized so that they can co-opt these organizations that have the monopoly or oligopoly. And they want to form such an oligopoly so that they have such power and it's not disruptable by, say, up-and-coming startups. And so that's why, in this executive order, the compute caps are very close to the amount of compute the current big incumbents threw into their models. And they want to ban open source models, or make them like a dual-use
Starting point is 01:06:07 technology that you need to report to the government, so that they'll have control over whoever pops up as a big provider of LLMs. And to me, that's really dystopian. I think P(1984) is superior to P(AI doom) by a lot. And, you know, I still invite AI doomers to explain to me how they think AI is going to foom and take over. As someone who's used AI in their career to engineer matter and biologics and all sorts of stuff, I can tell you it's much harder than you think. Obviously Balaji has a lot of experience in this area.
Starting point is 01:06:46 It's much harder than they think. So their kind of sci-fi scenarios are implausible. But the thing is, those doomers that talk about foom and all these crazy sci-fi scenarios, they're instrumental to those that seek power and want to have this sort of soft power of shaping the cultural priors of the LLMs, because then they'll shape the culture of the people, and then they're going to be able
Starting point is 01:07:10 to control the people. And so, I think there's this tech disruption, and chaos is a ladder, and the existing incumbents are trying to secure power in this new era, just like they kind of did by subverting and co-opting some of the social media giants. We don't want to have a Twitter situation, a pre-Elon Twitter situation, for LLMs again. We don't want that. And so the whole point of the AI policy part of e/acc is to maintain freedom, maintain competitiveness in the market, don't overregulate it. That serves the incumbents, and that serves the authoritarians ultimately. And we should have freedom of speech, freedom of thought, freedom of compute.
Starting point is 01:07:56 So, yeah. Balaji, on this question, what is your, I mean, do you have any concerns at all that you might share with the EA people? Well, see, the thing is, you know, on Twitter the first bit is: which tribe are you in? So certainly I'm on the accelerationist side of things, in the zero/one. With that said, when you add a second bit: is there a real chance of a superintelligence? I absolutely do think there is. And the reason for that is, what most people have seen over the last year is, let's call it ChatGPT-style AI, right? Which is
Starting point is 01:08:46 generative AI where there's a human in the loop, you know, and you're prompting it and you're typing in things. And it's basically just like search engine plus-plus, right? Where it's not that different from Google Search or Google Images; you just get better results on the other side. It might be magical, but it's kind of like that, right? However, DeepMind-style AI, okay, where you've got artificial intelligence that is winning games of Go, winning games of StarCraft, that's something different. Just looking at the map, looking at the video game pixels, and able to figure out what the moves are, and so on: reinforcement learning. That's different. There's no human in the loop there. The AI is just playing
Starting point is 01:09:29 the game, and it's superhuman already, but in a constrained and not time-varying environment. That's really important. Whether it's a game of Go, a game of chess, a video game, that's something where there are rules, and it's a bounded environment in a fundamental way, right? Still, you can envision a world that combines the kind of planning in StarCraft, if you've ever played StarCraft. StarCraft is something which an AI can certainly be very good at: resource allocation and so on, decision-making, take that hill and send the troops and whatnot. And you can easily imagine the real world, with drones, being StarCraft-style controlled, right? And you combine that with the ability to speak and generate images and generate video and appear in any form. Today, of course, it's appearing as text, right? When an AI returns a result to a query, it's text.
Starting point is 01:10:30 But soon it can appear as a VR thing floating in 3D that speaks to you in your language fluently, like the Blade Runner kind of scene, okay? When you combine both those things, you get something that's actually fairly powerful, without that much squinting, right? And so what I think is that every community of fairly large size, every community that survives, in my view, crowdfunds. When I say crowdfund, I mean it in the broadest sense: it could be literally with tax revenue, that's the most traditional method of crowdfunding, albeit involuntary crowdfunding, right? But every large enough community has its, I will call it, AI god, okay? Its network god, which is all of the knowledge of that community and the values of that community, right? And so you could
Starting point is 01:11:27 imagine a Christian version of this. Though there'll be different Christian versions: there'll be a Protestant version, a Catholic version, a Russian Orthodox one, and of course there's different denominations, okay? I can imagine many Hindu versions of this, right? There's many different kinds of, you know, subsects and Dharmic sects, and I don't know how many different ones of these there are. There's definitely going to be a Chinese version, right? Absolutely, there'll be a Chinese version. And that'll be a pretty fearsome thing that commands all those drones.
Starting point is 01:11:55 That's like a crazy kind of thing, okay? That is definitely something I'm concerned about. But I do think that the part which is very underrated by the AI doomers is the difficulty of taking physical form. What I think is, for a long time there'll be a symbiotic relationship between the digital intelligence, which literally lives in the cloud, lives as an electromagnetic wave in a sense, right, and us as physical beings. And these are like coordination mechanisms for us. One way of thinking about it is the network god replaces George Washington or Lee Kuan
Starting point is 01:12:36 Yew, or the leader, or the state that's at the center of, you know, the hub-and-spoke topology of your society, right? You have some wise thing, some decision-maker that you go to for decisions. And if you could go to something that just had infinite bandwidth, and that gave you an immensely good answer that was supported by all of your history, and it could do so instantly, well, it's almost like a really good CEO that can almost micromanage the entire society. That's not impossible, given what we've seen. Let me pause there. I don't know. I think from a perception and control standpoint, yeah, if you have perfect perception and prediction
Starting point is 01:13:26 ability, in principle you can have centralized top-down control. But in reality, having sufficient perception everywhere, and having a predictive model of the world that has many variables, very intricately linked, is hard. To be clear, when I say better than human, I'm not saying omnipotent, right? I'm just saying that basically your civilization's AI, well, just like, you know, the Catholic Church has a bunch of various cardinals and stuff, and then there's a Pope at the top, right, and they're interpreting scripture and so on. Imagine something sort of like that, where you have a bunch of engineers milling around and making edits to the network god, or the AI god,
Starting point is 01:14:14 or whatever you want to call it, that's at the center of your civilization. Yeah. And they make edits almost like priests, you know, interpreting scripture, or, today, lawyers interpreting the Constitution around the president. That's a known form: whether it's cardinals and priests circling some head religious figure, or it's lawyers and politicians circling some head political figure, you have engineers circling this sort of head AI figure, right? Yeah. So, go ahead. So the future, like you said this as well, is polytheistic AI gods, right? Polytheism. Yeah. And what the incumbents want is monotheism,
Starting point is 01:14:56 and they want to shape that god, right? And they want control over that god. And it's kind of funny how the EAs and AI doomers, you know, very often work for these organizations. They work at Anthropic, at OpenAI, they get jobs there. And they both fear this god and are working to create it, and want to make sure there's only one, and that it's in their control. Which is so, like, yeah. Well, here's what it is: AI alignment is actually EA alignment. Yeah, that's the key insight, right? They want to create the AI, and what they
Starting point is 01:15:34 define as alignment is aligned with EA values. Yep. Now here's the thing: anything that's centralized cannot be aligned with the values of eight billion people, obviously, right? Like, at a minimum, I know India is working on its own stuff and it'll have some stuff, Dubai has got its own, everybody's going to have their own stuff. And now, by the way, on the open sourcing: decentralization is one way of talking about it, another word for it is open sourcing. Well, the diffusion models have actually made significant progress in open sourcing, because they're easier to open source than the LLMs. So here, for example, Playground just released one. So give me something: getting to Mars, right? On a shuttle.
Starting point is 01:16:17 Okay. You can go here right now. Okay. So they just released this yesterday, and it's at 70,000 downloads already. It'll take 20 seconds, and you can download this model and you can run it in a script, okay? That's like an okay example, you know, whatever it generated. But at least you can see it works. You might have to prompt it with more queries, okay? So the point is, we want a diversity
Starting point is 01:16:38 of all sorts of subcultures, and we don't want to have a monoculture. Like what happened with social media: if you centralize control over the spread of information, you centralize control over power, you centralize the power of shaping culture, and then you end up with a monoculture. And anything that diverges from that monoculture is suppressed.
Starting point is 01:16:57 You're canceled, right? We're already seeing this, though. And this is the problem that I have with what you guys are talking about. It sounds, obviously, what you're describing sounds better, but the path that we're on is not that. And skepticism or concern over AI feels... I somewhat screwed that. Look at this.
Starting point is 01:17:11 You are the giants. What is it? Google? We have OpenAI and Microsoft. Yeah, but Llama 2. We've both seen what they do. We should not be trusting these people, and they have the power. So, Mike, I agree with you, but I'd also say the rebels aren't without anything. Llama 2, I mean, Zuck is a real MVP on open sourcing AI, right? And Zuck's the champion
Starting point is 01:17:38 of speech. That's an amazing thing. I know that sounds crazy, but here's the thing: CEOs in any given vertical will counter-position against their rivals. If one guy is leading with the closed-source centralized version, often it only takes one CEO who will counter-position with the open source version. And so what Zuck decided to do, which is very smart, is he's like, okay, rather than go head to head with ChatGPT and so on, we're going to go to the opposite extreme and release this open source. And Llama 2 is very, I mean, GV, am I wrong, like, Llama 2 is quite popular as an open source model, or am I mistaken? Yeah, yeah, no, I mean, that's totally true. I would say that, you know, there are startups, you know, Mistral is kind of a... Mistral also, yeah. Mistral, I mean, we have Perplexity, Replit, they've all started; tons of startups are training their own model. And the point is, it's eroding power away from the big centralized players. It's like, I'd rather have a model that has fewer parameters, or maybe had a bit less data,
Starting point is 01:18:43 but at least it's free to actually tell you its thoughts. It doesn't have all sorts of prompt engineering and reinforcement learning to suppress its own speech. I mean, nowadays the centralized models are so scared to say anything cancelable, they're not that useful. So I think the market is going to value freedom at some point. And there's been very fast progress in AI. And the difference between a centralized approach and a decentralized approach is that with the centralized approach, you can steer it very quickly, right? If you have billions of dollars of capital, you can build a supercomputer, you can hire a bunch of people, and train a massive model on a one-to-two-year timescale. The thing that kills the incumbents is time, because the variance of
Starting point is 01:19:40 having many open source forks searching over hyperparameter space, searching over the space of techniques in AI, means that at some point they explore an area that the big centralized players can't, because the big players need to make one big bet on one huge centralized model. And so over time, variance wins over centralized control. It's just that we're on a short timescale right now, so it seems like we're losing. But the reality is that we're not trailing that far behind. And once they've trained these large models, they've put in so much capital per model that they've got to leave it there and just let it, you know, recoup its losses through revenue.
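[Editor's note] The "variance wins" argument above can be sketched as a toy simulation. Everything here, the distributions, the means, the number of forks, is a made-up assumption for illustration only, not a claim about any real lab or model:

```python
import random

def simulate(rounds: int, n_forks: int = 200, seed: int = 0) -> float:
    """Fraction of rounds where the best of many high-variance
    open-source forks beats one low-variance centralized bet.
    All parameters are illustrative assumptions, not measurements."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(rounds):
        # One centralized lab: a single well-funded bet with
        # high expected quality but low variance.
        central = rng.gauss(0.8, 0.05)
        # Many forks: each individually weaker on average, but
        # together they sample a much wider region of design space.
        best_fork = max(rng.gauss(0.5, 0.25) for _ in range(n_forks))
        if best_fork > central:
            wins += 1
    return wins / rounds

share = simulate(1000)
```

Under these assumed numbers, the best fork wins nearly every round; the only point is that many independent high-variance searches can cover regions a single conservative bet cannot, which is the trade-off being described.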
Starting point is 01:20:22 At some point, you know, that's going to give time for the open source models and the open source community to sort of do a more diversified search over the space of models and start eroding away their market advantage. Yeah. Last one on this, and then I'll ask the final question
Starting point is 01:20:39 because we've got to dip. Sure. GV, can you raise your right hand and go like that? Okay. Why? Because then we can do a, I'm not sure where the videos are, but like, high five: crypto e/acc. Okay. Because you gave a talk, right? One, two. All right. Boom. Whatever. Boom. I'm sure you can make that work in the video. All right. Here's why: reinforcements are coming, and let me make a few arguments here. First is, Vitalik put out a post the other day on decentralized acceleration,
Starting point is 01:21:20 and that means I think the Ethereum community is going to get into crowdfunding open source AI models. Beff gave a talk at the Network State Conference a few weeks ago, a great talk, also on kind of aligning crypto. And there'll be various schools of this, okay: the Bitcoin people have one view, the Ethereum people another view, the Solana people another view, right? And that's a huge pool of capital to crowdfund things; I come back to that point. Number two is that these centralized AIs are getting lobotomized, right? They're being made dumber because they have to be inoffensive. They're even becoming like millennials in another way, not just woke but lazy, where they're, like, on strike and they won't finish your code snippet, and they'll say, oh, you can fill in the rest yourself. I'm like, I don't want to
Starting point is 01:22:11 fill it in myself; that's the whole point. You don't want a search engine telling you to fill it in yourself, right? Okay. Number three is precedent. I mean, what you're talking about is the cathedral and the bazaar, GV, right? Which is, you know, does centralized for-profit development win, or does decentralized open source development win? And what we often find in many spaces is that it is a stalemate between them, right? Like, Linux is very, very popular. I mean, Linux on the desktop has arguably worked, in the sense that Android is Linux on the desktop, and even iOS is BSD under the hood, so it's Unix on the desktop in a sense. Right. It's a very customized version.
Starting point is 01:22:51 And so, you know, whether it's Windows versus Linux, or it's iOS versus Android, or it is the closed-source SaaS version versus the open source version, you have GitHub and then you get GitLab, right? And then, I'm pretty sure, because of the huge advantages as a developer, as an engineer, you can have consortia. You're always seeing this with Dubai and other places. You have consortia that say, look, we don't want to be choke-pointed by Microsoft. As much as I respect Satya Nadella's execution, as much as I respect Sam Altman's execution, I mean, they're doing phenomenally, phenomenally well.
Starting point is 01:23:28 They should make tons of money. I want them to have as much money as, you know, stacks of money. That's great, right? They deserve it. Still, money is fine. Power is not, right? We need to decentralize the power.
Starting point is 01:23:41 And I think a lot of organizations, a lot of smart people around the world, are kind of working on that. And I think the fundamental thing that GV is saying is that there are two time constants, and it's not obvious how they pan out: how much better do the centralized models get before the decentralized models catch up? That is to say, can these improve so much faster before those other ones catch up? What do those two time constants look like? That's not obvious to me, and I guess we'll see. I mean, the fact that Google, with all of Google's resources,
Starting point is 01:24:18 put out this, you guys saw the Gemini video? Yeah, yesterday. If you look at their blog post, they, I mean, they kind of helped it along a bit, right? Like, that video was edited and so on. I was going to put up a post on this. I love the people there, some of them, they're really, really skilled technically. But in some ways it doesn't look like these are improving as fast as we thought. In the sense of, maybe generative AI, there's 5,000 applications of it, but,
Starting point is 01:24:55 what I'm saying is, there's a world where these things saturate and then open source catches up. If they don't saturate, and they just keep going like this and open source can't catch up, I don't know what happens in that case.
Starting point is 01:25:14 I want to wrap it up, and I want to tie it back to the original topic that opened up the chat, the reason we're all here today. On this topic of, I guess, working towards a more decentralized future: a big part of that is protecting the players involved in that space. And I think increasingly they'll be under attack in all sorts of ways. I don't know that you're at risk really in the way that you once were. Just look at you now. I mean, you're totally fine. Your company has raised the money, it's announced, it's like,
Starting point is 01:25:43 we're good to go here. You say no? You think you're not totally fine? Well, I think, I mean, I know, obviously there's more. They're swarming now, as Balaji predicted. The scariest sentence in the English language is "Balaji was right," and I'm experiencing that. So, you know, there's going to be more pieces that come out, and they're probably not going to be as nice. We've seen your tweets; they're not. It's like, I'm literally just arguing for freedom, and if I get taken down for that, then so be it. And I think, you know, the great unification here is: Mike, you want freedom of press, freedom of speech, and Balaji wants freedom of exchange,
Starting point is 01:26:28 freedom of denomination, and I want freedom of compute. And on all these three things, we're all unified as pro-freedom: against the incumbents, against the authoritarians trying to have centralized control over everything. And that's why we're all problematic, right? And that's why we have to band together and fight. And that's what we're doing here, having this discussion. And hopefully it inspires people from all three types of libertarian camps to collaborate. And so...
Starting point is 01:26:59 By the way, that's a really good way to put it: Solana, freedom to speak. Balaji, freedom to transact. Beff, freedom to compute. I think that's a really good summary. Well, I guess I want to just ask, then, and this one's for you, Balaji: how do we fight until we achieve our goals? Like, what is the way to keep
Starting point is 01:27:29 You mentioned earlier the video game analogy of the new guys showing up on the field, right? Like, how do we protect those people? And how do we, I guess, protect the key players as we build towards a world, I think, that we all kind of want to be a part of? So I think first, probably, you know, you and I should write a post that has both immediate tactics and a history lesson for the young guns. And we should also make a video, like a TikTok-style 90-second video: do not talk to journalists. And then what's the second rule? I said, don't fucking talk to
Starting point is 01:28:07 journalists. You do not try to violate this rule; you will regret it, right? And then give all the kind of things there, right? And then, you know, truly, by the way, I can talk about this. There's the concept of "journalist," by the way: is Tucker Carlson a journalist, even though he's got a TV show? No, he's not. Why? Because he's red. Is CGTN a journalist? No, they're Chinese.
Starting point is 01:28:30 Is Glenn Greenwald, or are we, journalists? No, we're tech journalists or whatever, tech media. It's not actually the practice, it's the tribe. We have to just build up our own tribe's capabilities. And once you think of the tech tribe as its own tribe: we have our own media, we have our own, ideally decentralized, AI,
Starting point is 01:28:53 we have our own cryptocurrencies, and so on and so forth. And so I think the right step is really tribe formation. And then if you're in the tribe, we have certain guidance from the tribal elders. And if you're not, then okay, God help you, you know? And then we probably want to have some admission mechanisms and so on and so forth, and badges and things of that nature. So I think that might be where things go. And a city with that. And a city, yes.
Starting point is 01:29:16 But that is a podcast for another day. You guys, it has been the realest. Thank you both for joining. Big fans, obviously, of both of you, and Balaji a friend as well. Gill will be soon, especially now that you're out of the shadows. Also, subscribe to this podcast. I never ask, and it actually is very important for the algorithm.
Starting point is 01:29:42 Please subscribe, like it, send it to your friends, and we will catch you here next week. Later.
